# ChasapasK/ministral-3b-instruct-GGUF
This is a GGUF version of ministral/Ministral-3b-instruct, created using llama.cpp.

8-bit (Q8) quantized model.
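As a rough usage sketch, the quantized file can be loaded with the `llama-cpp-python` bindings. The GGUF filename pattern below is an assumption (check the repository's file listing for the actual name), and the prompt is only illustrative:

```python
# Minimal sketch using llama-cpp-python (pip install llama-cpp-python).
# Downloading via from_pretrained also requires huggingface_hub.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="ChasapasK/ministral-3b-instruct-GGUF",
    filename="*Q8_0.gguf",  # assumed name pattern for the 8-bit file
    n_ctx=2048,             # context window; adjust as needed
)

# Simple completion call; adapt the prompt to the model's chat template if it has one.
output = llm(
    "Explain in one sentence what the GGUF format is.",
    max_tokens=128,
    temperature=0.7,
)
print(output["choices"][0]["text"])
```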
## Original Model Card

### Model Description
Ministral is a series of language models built with the same architecture as the well-known Mistral models, but at a smaller size.
- Model type: A 3B-parameter GPT-like model fine-tuned on a mix of publicly available and synthetic datasets.
- Language(s) (NLP): Primarily English
- License: Apache 2.0
- Finetuned from model: mistralai/Mistral-7B-v0.1