requests/moreh/MoMo-72B-lora-1.8.7-DPO_eval_request_False_float32_Original.json

Commit History

Upload /moreh/MoMo-72B-lora-1.8.7-DPO_eval_request_False_float32_Original.json with huggingface_hub
442f173 · verified · pminervini committed

Add moreh/MoMo-72B-lora-1.8.7-DPO to eval queue
4ab98c1 · verified · pminervini committed
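
The upload commit above was made with the huggingface_hub client. A minimal sketch of how such an eval-request file could be pushed, assuming a local copy of the JSON and write access to the requests repository (the repo_id placeholder and local path here are illustrative, not taken from this page):

# Sketch: push an eval request JSON to a requests repo with huggingface_hub.
# repo_id is a placeholder; the actual requests repository id is not shown on this page.
from huggingface_hub import HfApi

api = HfApi()
api.upload_file(
    path_or_fileobj="moreh/MoMo-72B-lora-1.8.7-DPO_eval_request_False_float32_Original.json",
    path_in_repo="moreh/MoMo-72B-lora-1.8.7-DPO_eval_request_False_float32_Original.json",
    repo_id="<requests-repo>",   # hypothetical requests repo, e.g. an eval-queue dataset
    repo_type="dataset",         # leaderboard request queues are typically dataset repos
    commit_message="Add moreh/MoMo-72B-lora-1.8.7-DPO to eval queue",
)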