Could you share the training config?
#4 opened by Lohse
I'm fine-tuning on alpaca-gpt4. Could you share the training config (or the training script), e.g.:
- batch size
- lr
- lr_scheduler_type
- seq_length
- lora_rank

and how you preprocess the input?
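For context, here is a rough sketch of the kind of setup I'm currently trying, assuming a standard peft + transformers LoRA recipe; every value in it (base model path, rank, lr, batch size, seq_length) is a placeholder I made up, not your actual config:

```python
# Rough sketch only: placeholder values, not the repo's real config.
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from peft import LoraConfig, get_peft_model

base_model = "path/to/base-model"  # placeholder: whichever base model you used

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

# LoRA adapter config (rank/alpha/target modules are guesses on my side)
lora_config = LoraConfig(
    r=8,                                  # lora_rank -- guess
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# Trainer hyperparameters (again, guesses)
training_args = TrainingArguments(
    output_dir="./out",
    per_device_train_batch_size=4,        # batchsize -- guess
    gradient_accumulation_steps=8,
    learning_rate=2e-4,                   # lr -- guess
    lr_scheduler_type="cosine",           # lr_scheduler_type -- guess
    num_train_epochs=3,
    warmup_ratio=0.03,
)

# I assume seq_length is applied during preprocessing, e.g.
# tokenizer(prompt + response, max_length=512, truncation=True)
```

Mainly I'd like to know how far my guesses above are from what you actually used, and how you build the prompt/response text before tokenization.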
:)