---
license: apache-2.0
---

This repo contains SlovenianGPT - the best open-source base 7B LLM for the Slovenian language, developed by Aleksa Gordić (see the [LinkedIn announcement](https://www.linkedin.com/posts/aleksagordic_sloveniangpt-7b-done-training-after-7-days-activity-7171819162397204480-8U4A?utm_source=share)).

If you're interested in more powerful instruct models for the Slovenian language, feel free to reach out via email (surname.name at gmail com) or [LinkedIn](https://www.linkedin.com/in/aleksagordic).

SlovenianGPT eval results compared to Mistral 7B, LLaMA 2 7B, and Gemma (also see this [LinkedIn post](https://www.linkedin.com/posts/aleksagordic_happy-to-announce-slovenian-llm-eval-and-activity-7173332520757772288-RB0c?utm_source=share) for more info):

[Image: SlovenianGPT eval results]

Instruct-SlovenianGPT eval results ([LinkedIn post](https://www.linkedin.com/posts/aleksagordic_very-happy-to-announce-sloveniangpt-instruct-activity-7181638113197248512-5jvs?utm_source=share)):

[Image: Instruct-SlovenianGPT eval results]

Evals were computed using https://github.com/gordicaleksa/slovenian-llm-eval.

The model was trained on tens of billions of Slovenian-language tokens and is based on [Mistral 7B](https://huggingface.co/mistralai/Mistral-7B-v0.1).

## Notes

1) SlovenianGPT is a base model and therefore does not have any moderation mechanisms.

2) Since it's a base model, it won't follow your instructions - it's just a powerful autocomplete engine (see the usage sketch below).

3) If you want access to much more powerful Slovenian LLMs, feel free to reach out to me via email (surname.name at gmail com) or [LinkedIn](https://www.linkedin.com/in/aleksagordic).

# Credits

The data for the project was obtained with the help of [Nikola Ljubešić](https://nljubesi.github.io/).

**Also a big thank you to the following individuals:**

- [**Aleksander Segedi**](https://www.linkedin.com/in/aleksander-segedi-08430936/) - for help around bookkeeping!

## Citation

```
@misc{SlovenianGPT,
  author       = "Gordić Aleksa",
  title        = "SlovenianGPT - an open-source LLM for Slovenian language",
  year         = "2024",
  howpublished = {\url{https://huggingface.co/gordicaleksa/SlovenianGPT}},
}
```
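
## Example usage

Since SlovenianGPT is a plain base model, the natural way to try it is as a text-completion engine. Below is a minimal sketch using the Hugging Face `transformers` library; the dtype, sampling parameters, and example prompt are assumptions for illustration, not official recommendations.

```python
# Minimal sketch: load SlovenianGPT as a causal LM and complete a Slovenian prompt.
# Assumes standard Mistral-style weights/tokenizer in the repo; settings below are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gordicaleksa/SlovenianGPT"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 fits on your GPU
    device_map="auto",
)

# Base model => plain autocomplete, no chat/instruction template.
prompt = "Ljubljana je glavno mesto"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```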