Xiaodong/Next-DPO-iter2
Safetensors · Dataset: Xiaodong/DPO-iter2-data-8k

Commit History (branch: main)
Update README.md · c27162b (verified) · Xiaodong committed on Oct 13
Update README.md · 8a53b49 (verified) · Xiaodong committed on Oct 13
Create README.md · a82d0ff (verified) · Xiaodong committed on Oct 13
Upload aug_f4_add_chosen_0_8000.jsonl · b40531c (verified) · Xiaodong committed on Oct 13
upload ckpt · d4e8c62 · Wang-Xiaodong1899 committed on Oct 13
initial commit · 1a5d7e7 (verified) · Xiaodong committed on Oct 13