---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- chat
---
# Qwen2-VL-7B-Instruct-MNN
## Introduction
This model is a 4-bit quantized MNN version of Qwen2-VL-7B-Instruct, exported with [llm-export](https://github.com/wangzhaode/llm-export).
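
To fetch the exported MNN files locally before running them with an MNN-based LLM runtime, one option is the `huggingface_hub` Python package. The sketch below is a minimal example, not part of this repository; the `repo_id` placeholder is an assumption and should be replaced with this repository's actual `<org>/Qwen2-VL-7B-Instruct-MNN` path, and `huggingface_hub` must be installed first (`pip install huggingface_hub`).

```python
# Minimal sketch: download the quantized MNN model files from the Hub.
# Assumption: the repo_id below is a placeholder -- substitute this
# repository's real "<org>/Qwen2-VL-7B-Instruct-MNN" identifier.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="<org>/Qwen2-VL-7B-Instruct-MNN",   # placeholder repo id
    local_dir="Qwen2-VL-7B-Instruct-MNN",       # where the MNN weights and configs land
)
print(f"Model files downloaded to: {local_dir}")
```

The downloaded directory can then be loaded by an MNN LLM runtime; see the [MNN](https://github.com/alibaba/MNN) and [llm-export](https://github.com/wangzhaode/llm-export) repositories for the exact inference setup.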