Code for finetuning/continued pretraining?
#3
by rombodawg - opened
I plan on getting an RTX A6000 48 GB soon, and since the model is so small, I'd like to see if I can continue to fine-tune it further. Do you have code published anywhere that shows how I can fine-tune the weights?
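Not an official answer, but as a starting point, here is a minimal sketch of continued pretraining / fine-tuning with the Hugging Face `Trainer` API. The model id, dataset file, and hyperparameters below are placeholders I'm assuming for illustration, not details from this repo; swap in the actual checkpoint and your own corpus.

```python
# Minimal continued-pretraining sketch with Hugging Face Transformers.
# All names and hyperparameters here are assumptions, not from this thread.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "your-org/your-small-model"  # placeholder: replace with the real checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # causal LMs often ship without a pad token
model = AutoModelForCausalLM.from_pretrained(model_id)

# Plain-text corpus for continued pretraining; replace with your own data.
dataset = load_dataset("text", data_files={"train": "corpus.txt"})["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Causal-LM collator (mlm=False) builds labels from the input ids.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="finetuned-model",
    per_device_train_batch_size=4,   # should fit on a 48 GB A6000 for a small model
    gradient_accumulation_steps=8,
    learning_rate=2e-5,
    num_train_epochs=1,
    bf16=True,                       # the A6000 (Ampere) supports bfloat16
    logging_steps=50,
    save_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
trainer.save_model("finetuned-model")
```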
They are training a larger model.