---
library_name: transformers
inference:
  parameters:
    temperature: 1
    top_p: 0.95
    top_k: 40
    repetition_penalty: 1.2
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
---

QuantFactory Banner

# QuantFactory/Ministral-4b-instruct-GGUF

This is a quantized version of ministral/Ministral-4b-instruct, created using llama.cpp.
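A minimal sketch of loading one of the GGUF files from this repo with llama-cpp-python. The filename pattern below is an assumption, so match it to whichever quantization variant you actually download; the sampling values mirror the inference parameters declared in the metadata above.

```python
# Minimal sketch: run a GGUF quant from this repo with llama-cpp-python.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="QuantFactory/Ministral-4b-instruct-GGUF",
    filename="*Q4_K_M.gguf",  # hypothetical glob; pick a file that exists in the repo
    n_ctx=4096,               # context window, adjust as needed
)

out = llm(
    "Explain what a GGUF file is in one sentence.",
    max_tokens=128,
    temperature=1.0,          # sampling settings taken from the card metadata
    top_p=0.95,
    top_k=40,
    repeat_penalty=1.2,
)
print(out["choices"][0]["text"])
```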

## Original Model Card


### Model Description

Ministral is a series of language models built with the same architecture as the well-known Mistral models, but at a smaller size.

- **Model type:** A 4B parameter GPT-like model fine-tuned on a mix of publicly available, synthetic datasets.
- **Language(s) (NLP):** Primarily English
- **License:** Apache 2.0
- **Finetuned from model:** mistralai/Mistral-7B-v0.1
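For reference, a minimal sketch of running the original (non-quantized) checkpoint with transformers, using the same sampling settings as the metadata above; the repo id `ministral/Ministral-4b-instruct` comes from this card, and the prompt is purely illustrative.

```python
# Minimal sketch: run the original (non-GGUF) checkpoint with transformers.
# Sampling values mirror the inference parameters declared in the card metadata.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ministral/Ministral-4b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

inputs = tokenizer("Write a haiku about small language models.", return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=1.0,
    top_p=0.95,
    top_k=40,
    repetition_penalty=1.2,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```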