GPT2 Models Collection (3 items)
All GPT-2 models in this collection were trained from scratch.
This is the Polish GPT-2 model in the medium architecture, released on 30.11.2023.
It was trained on about 30.5 GB of data, three times more than the previous version.
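For context on what "medium architecture" implies, here is a minimal sketch of the approximate parameter count, assuming the standard published GPT-2 medium hyperparameters (24 layers, hidden size 1024, 50257-token vocabulary, 1024-position context); these values are assumptions about this model, not confirmed specifics from its release.

```python
# Standard GPT-2 medium hyperparameters (assumed, not confirmed for this model)
n_layer, n_embd, n_vocab, n_ctx = 24, 1024, 50257, 1024

# Token and position embeddings
embeddings = n_vocab * n_embd + n_ctx * n_embd

# One transformer block:
# attention = fused QKV projection + output projection (weights + biases)
attn = n_embd * 3 * n_embd + 3 * n_embd + n_embd * n_embd + n_embd
# MLP = 4x expansion and contraction (weights + biases)
mlp = n_embd * 4 * n_embd + 4 * n_embd + 4 * n_embd * n_embd + n_embd
# Two layer norms, each with scale and bias vectors
layer_norms = 2 * 2 * n_embd
block = attn + mlp + layer_norms

# Final layer norm; the output head shares weights with the token embedding
total = embeddings + n_layer * block + 2 * n_embd
print(f"{total / 1e6:.0f}M parameters")  # roughly 355M
```

This matches the commonly quoted ~355M figure for GPT-2 medium.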