Hugging Face
Akram1990's Collections
DPO RLHF MISTRAL-7B
Updated Apr 3
mistralai/Mistral-7B-v0.1 • Text Generation • Updated Jul 24 • 415k downloads • 3.38k likes
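This collection's title names Direct Preference Optimization (DPO), a preference-tuning method often applied to base models like Mistral-7B-v0.1. As a minimal sketch of the idea (not code from this collection), the per-example DPO loss can be written in plain Python; the log-probability values below are made-up numbers for illustration only.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Per-example DPO loss: -log sigmoid(beta * (policy margin - reference margin)).

    Each argument is the summed log-probability a model assigns to the
    chosen or rejected response; `beta` scales how far the policy may
    drift from the reference model.
    """
    logits = beta * ((policy_chosen_logp - ref_chosen_logp)
                     - (policy_rejected_logp - ref_rejected_logp))
    # -log(sigmoid(logits)); small when the policy prefers the chosen
    # response more strongly than the reference model does.
    return -math.log(1.0 / (1.0 + math.exp(-logits)))

# Hypothetical log-probs: the policy favors the chosen answer relative
# to the reference, so the loss falls below log(2) (the break-even value).
loss = dpo_loss(-10.0, -14.0, -11.0, -13.0)
```

In practice a trainer averages this loss over a batch of (prompt, chosen, rejected) triples and backpropagates through the policy's log-probabilities only, keeping the reference model frozen.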