Update README.md
README.md
CHANGED
@@ -44,11 +44,11 @@ quantized_by: TheBloke
 
 This repo contains **EXPERIMENTAL** GPTQ model files for [Mistral AI_'s Mixtral 8X7B Instruct v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1).
 
-## Requires AutoGPTQ PR
+## Requires AutoGPTQ PR + transformers 4.36.0
 
 These files were made with, and will currently only work with, this AutoGPTQ PR: https://github.com/LaaZa/AutoGPTQ/tree/Mixtral
 
-To test, please build AutoGPTQ from source using that PR.
+To test, please build AutoGPTQ from source using that PR. You also need Transformers version 4.36.0, released December 11th.
 
 Transformers support has just arrived also via two PRs - and is expected in main Transformers + Optimum tomorrow (Dec 12th).
 
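The setup the updated README describes could be sketched as the following shell commands. This is a hedged sketch, not part of the commit: the `Mixtral` branch name is taken from the PR URL above, and the exact `pip` invocation for a source build of AutoGPTQ is an assumption.

```shell
# Build AutoGPTQ from the LaaZa Mixtral PR branch (branch name taken from the PR URL).
git clone -b Mixtral https://github.com/LaaZa/AutoGPTQ.git
cd AutoGPTQ
pip install -v .

# The README states Transformers 4.36.0 (released December 11th) is also required.
pip install "transformers==4.36.0"
```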