---
license: apache-2.0
datasets:
- kobkrit/rd-taxqa
- iapp_wiki_qa_squad
- Thaweewat/alpaca-cleaned-52k-th
- Thaweewat/instruction-wild-52k-th
- Thaweewat/databricks-dolly-15k-th
- Thaweewat/hc3-24k-th
- Thaweewat/gpteacher-20k-th
- Thaweewat/onet-m6-social
- Thaweewat/alpaca-finance-43k-th
language:
- th
- en
library_name: adapter-transformers
pipeline_tag: text-generation
tags:
- openthaigpt
- llama
---
# 🇹🇭 OpenThaiGPT 0.1.0-beta
OpenThaiGPT Version 0.1.0-beta is a 7B-parameter LLaMA model fine-tuned to follow Thai-translated instructions, built on the Hugging Face LLaMA implementation.
## Support
- Official website: https://openthaigpt.aieat.or.th
- Facebook page: https://web.facebook.com/groups/openthaigpt
- Discord server for discussion and support: [invite link](https://discord.gg/rUTp6dfVUF)
- E-mail: kobkrit@iapp.co.th
## License
**Source Code**: Apache License 2.0.
**Weights**: Research use only, due to the license of Meta's (Facebook's) original LLaMA weights.
Note: a commercial-use license for the OpenThaiGPT 0.1.0 weights will be released soon.
## Code and Weights
**Finetune Code**: https://github.com/OpenThaiGPT/openthaigpt-finetune-010beta
**Inference Code**: https://github.com/OpenThaiGPT/openthaigpt
**Weights**: https://huggingface.co/kobkrit/openthaigpt-0.1.0-beta
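### Example inference (sketch)
The snippet below is a minimal sketch of loading the published weights for inference with 🤗 Transformers and PEFT. It assumes the repository above ships a LoRA adapter on top of a base LLaMA checkpoint (as the adapter-style finetune code suggests); the prompt template and generation settings are illustrative assumptions, not the exact ones used in training. If the repository instead contains fully merged weights, load it directly with `AutoModelForCausalLM` and skip the PEFT steps.

```python
# Minimal inference sketch. Assumptions: the repo is a LoRA adapter, and an
# Alpaca-style instruction template is used -- check the finetune repo
# (openthaigpt-finetune-010beta) for the exact template.
import torch
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

adapter_id = "kobkrit/openthaigpt-0.1.0-beta"

# Read the adapter config to find the base LLaMA checkpoint it was trained on.
peft_config = PeftConfig.from_pretrained(adapter_id)
base_id = peft_config.base_model_name_or_path

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# Hypothetical Alpaca-style prompt with a Thai instruction ("Hello, please
# introduce yourself"); replace with the template from the finetune repo.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nสวัสดีครับ ช่วยแนะนำตัวหน่อย\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(
        **inputs, max_new_tokens=256, do_sample=True, temperature=0.7
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Reading the base checkpoint name from the adapter config avoids hard-coding a base-model ID; `device_map="auto"` requires the `accelerate` package and a GPU for reasonable speed.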
## Sponsors
Pantip.com, ThaiSC
### Powered by
OpenThaiGPT Volunteers, Artificial Intelligence Entrepreneur Association of Thailand (AIEAT), and Artificial Intelligence Association of Thailand (AIAT)
### Authors
Kobkrit Viriyayudhakorn (kobkrit@iapp.co.th), Sumeth Yuenyong (sumeth.yue@mahidol.edu) and Thaweewat Ruksujarit (thaweewr@scg.com).
Disclaimer: Responses generated by the model are not guaranteed to be accurate.