Qwen2-7B-Instruct-DPO-novel-beta0.5 / adapter_model.safetensors

Commit History

Upload adapter_model.safetensors with huggingface_hub
869d7c5
verified

XiaoY1 committed on
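
The commit above uploads the LoRA adapter weights to the Hub via the huggingface_hub library. Below is a minimal sketch of how such an upload is typically done with `HfApi.upload_file`; the repo id (including the `XiaoY1/` namespace) and the local file path are assumptions, not taken from the repository itself.

```python
from huggingface_hub import HfApi

# Hypothetical values -- substitute your own repo id and local path.
repo_id = "XiaoY1/Qwen2-7B-Instruct-DPO-novel-beta0.5"
local_path = "adapter_model.safetensors"

api = HfApi()
api.upload_file(
    path_or_fileobj=local_path,                 # local file to upload
    path_in_repo="adapter_model.safetensors",   # destination path inside the repo
    repo_id=repo_id,
    commit_message="Upload adapter_model.safetensors with huggingface_hub",
)
```

Authentication is handled by a token configured via `huggingface-cli login` or passed explicitly to `HfApi`.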