German Assistant Models
This collection of models was trained during the development of the Atra project (https://github.com/flozi00/atra).
For inference, always use the latest model from the top of the list.
This model is a fine-tuned version for German instructions and conversations in the style of Alpaca, using the prompt markers "### User:" and "### Assistant:". The dataset used is deduplicated and cleaned, with no code inside. The focus is on instruction following and conversational tasks.
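As a minimal sketch of how this prompt format might be used with the Hugging Face transformers library: the model id below is a placeholder for the latest checkpoint in this collection, and the generation settings are illustrative assumptions, not values recommended by the card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with the latest model repo from the top of this collection.
model_id = "<latest-model-from-this-collection>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build the conversation in the "### User:" / "### Assistant:" format described above.
prompt = (
    "### User: Erkläre in einem Satz, was ein Sprachmodell ist.\n"
    "### Assistant:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)  # max_new_tokens is an assumed setting
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The model's reply is whatever the generation produces after the final "### Assistant:" marker; a follow-up turn can be appended with another "### User:" line to continue the conversation.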
The model architecture is based on CodeLlama with 34B parameters, trained on hardware powered by 100% renewable energy.
This work is contributed through the private research of flozi00.
Join discussions about German LLM research and plan larger training runs together: https://join.slack.com/t/slack-dtc7771/shared_invite/zt-219keplqu-hLwjm0xcFAOX7enERfBz0Q