---
language:
- en
- da
tags:
- text-generation-inference
- transformers
- unsloth
- mistral
- trl
base_model: mistralai/Mistral-7B-v0.1
datasets:
- wikimedia/wikipedia
license: mit
---
# Model description
Heidrun-Mistral-7B-base is a generative text model based on Mistral-7B. It has been further pretrained for 2 epochs on a Danish corpus drawn from Wikipedia, Wikibooks, and small parts of Hestenettet.

It is a foundational/completion model with potential for further finetuning.

For inference or chatting, please check out Heidrun-Mistral-7B-chat.
# Previous version
Please note that this model has been updated since the original release. The old version is available under the branch v0.1.
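As a completion model, it can be used directly with the `transformers` library. Below is a minimal sketch; the repository id `Mabeck/Heidrun-Mistral-7B-base` is an assumption inferred from the developer name on this card, so verify it on the Hub before use. The `revision` parameter selects a branch, e.g. `v0.1` for the old release.

```python
# Minimal completion sketch for Heidrun-Mistral-7B-base.
# NOTE: the repo id below is assumed from this card's "Developed by" field.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Mabeck/Heidrun-Mistral-7B-base"


def complete(prompt: str, revision: str = "main", max_new_tokens: int = 64) -> str:
    """Continue `prompt` with the base model.

    Pass revision="v0.1" to load the original release from its branch.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, revision=revision)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, revision=revision, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # Greedy decoding by default; this is a base model, so expect raw
    # text continuation rather than chat-style answers.
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Remember that this is a base model: it continues text rather than following instructions, so use Heidrun-Mistral-7B-chat for conversational prompts.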
# Uploaded model
- Developed by: Mabeck
- Finetuned from model: mistralai/Mistral-7B-v0.1
This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.