This is a pre-training checkpoint: do not use it for inference as-is, or it will output garbage.

The embedding tensors were adjusted for the new tokenizer. As a result, some weights of `LlamaForCausalLM` were not initialized from the model checkpoint at `mylesgoose/Meta-Llama-3.1-8B-Instruct-goose-abliterated-reflection` and are newly initialized because the shapes did not match:

- `model.embed_tokens.weight`: found shape `torch.Size([128256, 4096])` in the checkpoint and `torch.Size([128262, 4096])` in the model instantiated
- `lm_head.weight`: found shape `torch.Size([128256, 4096])` in the checkpoint and `torch.Size([128262, 4096])` in the model instantiated

You should probably TRAIN this model on a downstream task before using it for predictions and inference.
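
For context, here is a minimal sketch of how shapes like these are typically produced with the standard `transformers` resize workflow. The placeholder token names, the dtype, and the output path are assumptions for illustration, not necessarily what this repo actually did:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "mylesgoose/Meta-Llama-3.1-8B-Instruct-goose-abliterated-reflection"

# Start from the base checkpoint's tokenizer (128256 tokens).
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Hypothetical placeholders: the real repo adds 6 tokens whose names we don't know.
tokenizer.add_tokens([f"<|extra_{i}|>" for i in range(6)], special_tokens=True)
assert len(tokenizer) == 128262

model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="float16")

# Resize model.embed_tokens.weight and lm_head.weight from 128256 to 128262
# rows. The 6 new rows are randomly initialized, which is what triggers the
# warning above and why the model must be trained before inference.
model.resize_token_embeddings(len(tokenizer))

model.save_pretrained("./resized-checkpoint")
tokenizer.save_pretrained("./resized-checkpoint")
```

Until the new embedding rows are trained (e.g., on a downstream task that uses the added tokens), any prompt touching them will produce unusable output.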