Xiaodong/Next-DPO-iter2
Safetensors · Dataset: Xiaodong/DPO-iter2-data-8k
Branch: main · 2 contributors · History: 6 commits
Latest commit: c27162b (verified) · "Update README.md" by Xiaodong · about 1 month ago
| File | Size | Last commit message | Updated |
| --- | --- | --- | --- |
| checkpoint-500/ | – | upload ckpt | about 1 month ago |
| .gitattributes | 1.52 kB | initial commit | about 1 month ago |
| README.md | 286 Bytes | Update README.md | about 1 month ago |
| aug_f4_add_chosen_0_8000.jsonl | 6.17 MB | Upload aug_f4_add_chosen_0_8000.jsonl | about 1 month ago |
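
For reference, below is a minimal sketch of fetching the `aug_f4_add_chosen_0_8000.jsonl` preference-data file from this repo and inspecting it with `huggingface_hub`. This is not an official usage example from the model card; the record schema (e.g. `prompt`/`chosen`/`rejected` fields typical of DPO data) is an assumption and may differ in this file.

```python
import json

from huggingface_hub import hf_hub_download

# Download the JSONL file from the Xiaodong/Next-DPO-iter2 model repo.
path = hf_hub_download(
    repo_id="Xiaodong/Next-DPO-iter2",
    filename="aug_f4_add_chosen_0_8000.jsonl",
)

# Read the JSON-Lines file: one JSON object per line.
with open(path, "r", encoding="utf-8") as f:
    records = [json.loads(line) for line in f]

# Inspect how many records there are and which keys each record carries
# (the exact key names are not documented in the repo listing above).
print(f"{len(records)} records; keys of first record: {sorted(records[0].keys())}")
```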