---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
tags:
- chat
---
# internlm-chat-7b-MNN
## Introduction
This model is a 4-bit quantized MNN version of [internlm-chat-7b](https://huggingface.co/internlm/internlm-chat-7b), exported with [llm-export](https://github.com/wangzhaode/llm-export).