---
license: other
---
https://huggingface.co/chargoddard/llama2-22b-blocktriangular trained for one epoch on the 52k rows of Stanford Alpaca. About 11 hours on a 3090.

I had trouble training with the other 22b method (`BLOCK_DIAGONAL=True`), but with this block-triangular method, this is the first time I've been able to target all modules without breaking the output.
`target_modules = ["q_proj", "k_proj", "v_proj", "o_proj", "up_proj", "gate_proj", "down_proj"]`
Trained at a learning rate of 5e-5 with LoRA rank r=32. For more info see https://wandb.ai/nkpz/huggingface/runs/3oy5nbtv/workspace?workspace=user-nkpz
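
For reference, here is a minimal sketch of this LoRA setup using the PEFT library. Only `r=32`, the 5e-5 learning rate, the single epoch, and the target modules come from this card; `lora_alpha`, dropout, and the other training arguments are illustrative assumptions.

```python
from transformers import AutoModelForCausalLM, TrainingArguments
from peft import LoraConfig, get_peft_model

# Base model from this card
model = AutoModelForCausalLM.from_pretrained(
    "chargoddard/llama2-22b-blocktriangular"
)

lora_config = LoraConfig(
    r=32,              # rank used for this run
    lora_alpha=64,     # assumption; not stated on this card
    lora_dropout=0.05, # assumption; not stated on this card
    task_type="CAUSAL_LM",
    # All attention and MLP projections, as listed above
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "up_proj", "gate_proj", "down_proj",
    ],
)
model = get_peft_model(model, lora_config)

training_args = TrainingArguments(
    output_dir="out",
    learning_rate=5e-5,   # rate used for this run
    num_train_epochs=1,   # one epoch of Alpaca, as above
)
```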