---
language:
- ru
- en
tags:
- mlx
datasets:
- zjkarina/Vikhr_instruct
- dichspace/darulm
---
# mlx-community/Vikhr-7B-instruct_0.2-4bit
This model was converted to MLX format from [`Vikhrmodels/Vikhr-7B-instruct_0.2`](https://huggingface.co/Vikhrmodels/Vikhr-7B-instruct_0.2) using mlx-lm version **0.6.0**.
Refer to the [original model card](https://huggingface.co/Vikhrmodels/Vikhr-7B-instruct_0.2) for more details on the model.
## Use with mlx
```bash
pip install mlx-lm
```
```python
from mlx_lm import load, generate

# Load the 4-bit quantized model and its tokenizer from the Hugging Face Hub
model, tokenizer = load("mlx-community/Vikhr-7B-instruct_0.2-4bit")

# Generate a completion; "Привет" is Russian for "Hi"
response = generate(model, tokenizer, prompt="Привет", verbose=True)
```
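Since this is an instruct-tuned model, responses are often better when the prompt is wrapped in the model's chat template. Below is a minimal sketch assuming the converted tokenizer ships a chat template (whether it does depends on the original model's tokenizer config):

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Vikhr-7B-instruct_0.2-4bit")

# Format the user message with the chat template, if one is available.
# apply_chat_template comes from the underlying Hugging Face tokenizer.
messages = [{"role": "user", "content": "Привет"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```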