---
license: apache-2.0
datasets:
- kobkrit/rd-taxqa
- iapp_wiki_qa_squad
- Thaweewat/alpaca-cleaned-52k-th
- Thaweewat/instruction-wild-52k-th
- Thaweewat/databricks-dolly-15k-th
- Thaweewat/hc3-24k-th
- Thaweewat/gpteacher-20k-th
- Thaweewat/onet-m6-social
- Thaweewat/alpaca-finance-43k-th
language:
- th
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- openthaigpt
- llama
---
# 🇹🇭 OpenThaiGPT 1.0.0-alpha
OpenThaiGPT 1.0.0-alpha is the first Thai implementation of a 7B-parameter LLaMA v2 Chat model, fine-tuned to follow Thai-translated instructions, and built on the Hugging Face LLaMA implementation.
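The sketch below is a minimal, unofficial example of loading the full Hugging Face checkpoint (linked in the Code and Weights section) with the `transformers` library and generating a completion. The Alpaca-style prompt template is an assumption based on the instruction datasets listed above, not a documented requirement; adjust it as needed.

```python
# Minimal usage sketch (assumptions noted in comments), not an official example.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 keeps the 7B model within roughly 14 GB of GPU memory
    device_map="auto",          # requires the `accelerate` package
)

# Assumed Alpaca-style prompt; the instruction means "Write a poem about the sea".
prompt = "### Instruction:\nแต่งกลอนเกี่ยวกับทะเล\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```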
# ---- Full Hugging Face Checkpoint Model ----
## Upgrade from OpenThaiGPT 0.1.0-beta
- Uses Facebook's LLaMA v2 7B Chat model as the base model, which is pretrained on over 2 trillion tokens.
- Context length is upgraded from 2,048 tokens to 4,096 tokens.
- Allows both research and commercial use.
## Pretrained Model
- [https://huggingface.co/meta-llama/Llama-2-7b-chat](https://huggingface.co/meta-llama/Llama-2-7b-chat)
## Support
- Official website: https://openthaigpt.aieat.or.th
- Facebook page: https://web.facebook.com/groups/openthaigpt
- Discord server for discussion and support: https://discord.gg/rUTp6dfVUF
- E-mail: kobkrit@iapp.co.th
## License
**Source Code**: Apache Software License 2.0.
**Weights**: Permitted for both research and **commercial use**.
## Code and Weights
**Colab Demo**: https://colab.research.google.com/drive/1kDQidCtY9lDpk49i7P3JjLAcJM04lawu?usp=sharing
**Finetune Code**: https://github.com/OpenThaiGPT/openthaigpt-finetune-010beta
**Inference Code**: https://github.com/OpenThaiGPT/openthaigpt
**Weights (LoRA Adapter)**: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat (see the loading sketch below)
**Weights (Hugging Face Checkpoint)**: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ckpt-hf
**Weights (GGML)**: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ggml
**Weights (Quantized 4-bit GGML)**: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-alpha-7b-chat-ggml-q4
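As an alternative to downloading the merged checkpoint, the LoRA adapter listed above can be attached to the LLaMA v2 base model at inference time. The following is a hedged sketch using the PEFT library; the `meta-llama/Llama-2-7b-chat-hf` repository ID is an assumption (access to Meta's gated repository is required), and merging the adapter is optional.

```python
# Hedged sketch: applying the LoRA adapter to the Llama 2 7B Chat base model with PEFT.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "meta-llama/Llama-2-7b-chat-hf"  # assumed repo ID; requires gated access from Meta
adapter_id = "openthaigpt/openthaigpt-1.0.0-alpha-7b-chat"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the LoRA adapter
model = model.merge_and_unload()  # optional: merge adapter weights for faster inference
```

Generation then works the same way as with the full checkpoint shown earlier.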
## Sponsors
Pantip.com, ThaiSC
### Powered by
OpenThaiGPT Volunteers, Artificial Intelligence Entrepreneur Association of Thailand (AIEAT), and Artificial Intelligence Association of Thailand (AIAT)
### Authors
* Kobkrit Viriyayudhakorn (kobkrit@aieat.or.th)
* Sumeth Yuenyong (sumeth.yue@mahidol.edu)
* Thaweewat Rugsujarit (thaweewr@scg.com)
* Jillaphat Jaroenkantasima (autsadang41@gmail.com)
* Norapat Buppodom (new@norapat.com)
* Koravich Sangkaew (kwankoravich@gmail.com)
* Peerawat Rojratchadakorn (peerawat.roj@gmail.com)
* Surapon Nonesung (nonesungsurapon@gmail.com)
* Chanon Utupon (chanon.utupon@gmail.com)
* Sadhis Wongprayoon (sadhis.tae@gmail.com)
* Nucharee Thongthungwong (nuchhub@hotmail.com)
* Chawakorn Phiantham (mondcha1507@gmail.com)
* Patteera Triamamornwooth (patt.patteera@gmail.com)
* Nattarika Juntarapaoraya (natt.juntara@gmail.com)
* Kriangkrai Saetan (kraitan.ss21@gmail.com)
* Pitikorn Khlaisamniang (pitikorn32@gmail.com)
Disclaimer: Generated responses are not guaranteed to be accurate.