---
license: other
license_name: meta
license_link: https://ai.meta.com/llama/license
---
Pretrain-only checkpoint: do not use it as-is, it will output garbage.

The tensors were adjusted for the new tokenizer, so loading the checkpoint emits the following warning:
```
Some weights of LlamaForCausalLM were not initialized from the model checkpoint at mylesgoose/Meta-Llama-3.1-8B-Instruct-goose-abliterated-reflection and are newly initialized because the shapes did not match:
- model.embed_tokens.weight: found shape torch.Size([128256, 4096]) in the checkpoint and torch.Size([128262, 4096]) in the model instantiated
- lm_head.weight: found shape torch.Size([128256, 4096]) in the checkpoint and torch.Size([128262, 4096]) in the model instantiated
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
```
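
For reference, a minimal sketch of how this kind of resize is typically done with `transformers`. The card does not list the six added tokens (128256 → 128262), so the token names below are placeholders, not the actual vocabulary additions:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Source checkpoint named in the warning above.
base = "mylesgoose/Meta-Llama-3.1-8B-Instruct-goose-abliterated-reflection"

tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Placeholder names: the real six added tokens are not listed in this card.
new_tokens = [f"<|placeholder_{i}|>" for i in range(6)]
tokenizer.add_special_tokens({"additional_special_tokens": new_tokens})

# Grow embed_tokens and lm_head from 128256 to 128262 rows. The new rows
# are randomly initialized, which is why the loader reports "newly
# initialized" weights and why the model must be trained before inference.
model.resize_token_embeddings(len(tokenizer))
```

Because those new embedding rows start as random values, the model will produce garbage on the added tokens until it is fine-tuned on a downstream task.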