---
language:
  - en
  - da
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - mistral
  - trl
base_model: Mabeck/Heidrun-Mistral-7B-base
datasets:
  - oscar
  - Mabeck/danish-OpenHermes
  - kobprof/skolegpt-instruct
---
Heidrun Logo

## Model description

Heidrun-Mistral-7B-chat is a chat model based on Heidrun-Mistral-7B-base, finetuned on danish-OpenHermes and skoleGPT for an instruction/chat format.

## Datasets

This model is trained on Danish instruction datasets, which have not been safeguarded or aligned.

Most of the data has been machine-translated and may contain incorrect responses.

## Samples

This model uses the ChatML format. Using other formats will severely degrade the model's performance. ChatML format:

<|im_start|>system
Du er en hjælpsom AI-assistent, der svarer på spørgsmål som en bruger stiller dig. Tænk over spørgsmålet og uddyb dit svar.
<|im_end|>
<|im_start|>user
How are you?<|im_end|>
<|im_start|>assistant
I am doing well!<|im_end|>
<|im_start|>user
Please tell me about how mistral winds have attracted super-orcas.<|im_end|>
<|im_start|>assistant

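Below is a minimal inference sketch with the `transformers` library. It assumes the model is published on the Hugging Face Hub as `Mabeck/Heidrun-Mistral-7B-chat` (adjust the repo id if needed) and builds the ChatML prompt by hand; the generation settings are illustrative, not recommendations from this card.

```python
# Minimal inference sketch. Assumptions: repo id "Mabeck/Heidrun-Mistral-7B-chat",
# a GPU-capable environment with accelerate installed for device_map="auto",
# and illustrative sampling parameters.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Mabeck/Heidrun-Mistral-7B-chat"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Build the prompt manually in the ChatML format shown above.
prompt = (
    "<|im_start|>system\n"
    "Du er en hjælpsom AI-assistent, der svarer på spørgsmål som en bruger stiller dig.<|im_end|>\n"
    "<|im_start|>user\n"
    "Hvad er hovedstaden i Danmark?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens so the prompt is not echoed back.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```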
## Uploaded model

- Developed by: Mabeck
- Finetuned from model: Mabeck/Heidrun-Mistral-7B-base

This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.
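The exact training recipe is not published in this card, so the following is only an illustrative sketch of the kind of Unsloth + TRL setup referenced above; the LoRA settings, hyperparameters, and dataset field names are assumptions, not the configuration used for this model.

```python
# Illustrative Unsloth + TRL SFT sketch (assumed settings, not the published recipe).
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# Load the base model with Unsloth; 4-bit loading is a common (assumed) setup.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Mabeck/Heidrun-Mistral-7B-base",
    max_seq_length=2048,
    load_in_4bit=True,
)

# Attach LoRA adapters; ranks and target modules below are assumptions.
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
)

dataset = load_dataset("Mabeck/danish-OpenHermes", split="train")

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",  # assumed column name; adapt to the dataset schema
    max_seq_length=2048,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()
```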