base model: ENERGY-DRINK-LOVE/solar_merge
dataset:
- custom DPO dataset
- custom translation dataset built with a self-trained translation model
- open DPO dataset
- self-made DPO dataset
Trained with DeepSpeed.
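A minimal sketch of what DPO fine-tuning of the base model with DeepSpeed could look like, assuming the TRL library is used; the dataset file, hyperparameters, and DeepSpeed config path below are placeholders, not the actual training recipe.

```python
# Sketch only: assumes a recent TRL version and a JSONL preference dataset
# with "prompt", "chosen", and "rejected" columns.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

model_name = "ENERGY-DRINK-LOVE/solar_merge"  # base model from this card

model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Placeholder preference data (custom/open DPO pairs mentioned above).
train_dataset = load_dataset("json", data_files="dpo_pairs.jsonl", split="train")

training_args = DPOConfig(
    output_dir="solar-dpo",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    learning_rate=5e-7,
    beta=0.1,                           # DPO KL penalty strength
    bf16=True,
    deepspeed="ds_config_zero3.json",   # assumed DeepSpeed ZeRO config path
)

trainer = DPOTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    processing_class=tokenizer,         # `tokenizer=` in older TRL versions
)
trainer.train()
```

Such a script would typically be launched with `deepspeed train_dpo.py` or `accelerate launch train_dpo.py` so the DeepSpeed config is picked up across GPUs.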