# GPT2 Ukrainian

A generative language model for Ukrainian that follows the GPT-2 small architecture (124M parameters).
- hidden size: 768
- number of heads: 12
- number of layers: 12
- seq length: 1024
- tokens: 11,238,113,280 (3 epochs)
- steps: 57167
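The batch size is not stated, but it can be backed out from the figures above: total tokens divided by the number of steps gives the tokens consumed per step, and dividing by the sequence length gives the implied effective batch size. A minimal sanity check, assuming every step processed full 1024-token sequences:

```python
# Sanity-check the reported training numbers:
# tokens ≈ steps * effective_batch_size * seq_len
tokens = 11_238_113_280   # total training tokens (3 epochs)
steps = 57_167            # optimizer steps
seq_len = 1024            # sequence length

tokens_per_step = tokens / steps          # tokens consumed per optimizer step
batch_size = tokens_per_step / seq_len    # implied effective batch size

print(f"tokens per step: {tokens_per_step:,.0f}")   # ~196,584
print(f"implied effective batch size: {batch_size:.1f}")
```

This works out to roughly 192 sequences per step, which may reflect gradient accumulation across devices rather than a single-GPU batch.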
## Training data
- OSCAR
- Wikimedia dumps
## License
MIT