---
license: mit
datasets:
- Vikhrmodels/GrandMaster-PRO-MAX
language:
- ru
- en
base_model:
- Qwen/Qwen2-VL-2B-Instruct
pipeline_tag: text2text-generation
tags:
- multimodal
library_name: transformers
---
# tvl-mini | |
## Description
This is a fine-tune of Qwen2-VL-2B-Instruct for the Russian language.
tvl-mini was trained in bf16.
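For reference, a minimal inference sketch using the standard Qwen2-VL flow in 🤗 Transformers. The repo ID and image path below are placeholders (the actual Hub path of this checkpoint is not stated here), and the weights are loaded in bf16 to match the training precision:

```python
import torch
from PIL import Image
from transformers import Qwen2VLForConditionalGeneration, AutoProcessor

# Placeholder repo ID -- replace with the actual Hub path of tvl-mini.
model_id = "your-username/tvl-mini"

# Load in bf16, matching the training precision mentioned above.
model = Qwen2VLForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
processor = AutoProcessor.from_pretrained(model_id)

# One image plus a Russian prompt, using the Qwen2-VL chat template.
image = Image.open("example.jpg")  # placeholder local image
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "Опиши изображение."},  # "Describe the image."
        ],
    }
]
prompt = processor.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = processor(text=[prompt], images=[image], return_tensors="pt").to(model.device)

# Generate and decode only the newly produced tokens.
output_ids = model.generate(**inputs, max_new_tokens=256)
generated = output_ids[:, inputs["input_ids"].shape[1]:]
print(processor.batch_decode(generated, skip_special_tokens=True)[0])
```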
## Data
The training dataset contains:
- the GrandMaster-PRO-MAX dataset (60k samples)
- a subset of GQA, translated, humanized, and merged by image (TODO)