---
license: other
license_name: yi-license
license_link: LICENSE
pipeline_tag: text-generation
---

This is a 2-bit quantization of @migtissera's Tess-M-34b-v1.4, created with QuIP# (https://cornell-relaxml.github.io/quip-sharp/) using a Hessian context length of 8k.

"Tess, short for Tesoro (Treasure in Italian), is a general purpose Large Language Model series. Tess-M-v1.4 was trained on the Yi-34B-200K base."

Perplexity on the dev set, as reported by QuIP#, was slightly below 7, compared to slightly below 6 for the original model. Inference with the model is somewhat slow, but given the long context length it should be one of the best-performing few-shot models for consumer and data-science GPUs, especially when the prompts are long and the expected answers relatively short.
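For reference, dev-set perplexity of a causal LM is usually measured by summing the negative log-likelihood of the held-out tokens and exponentiating the per-token average. The sketch below shows this generically with the Hugging Face transformers API; it is not the QuIP# evaluation script, the checkpoint name and dev.txt file are placeholders, and the 2-bit model itself has to be loaded through the QuIP# code bundled in this repo rather than plain AutoModelForCausalLM.

```python
# Generic perplexity sketch (illustrative; not the QuIP# eval script).
# "some/causal-lm" and dev.txt are placeholders -- the 2-bit model in this
# repo needs the QuIP# loader instead of plain AutoModelForCausalLM.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "some/causal-lm"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)
model.eval()

text = open("dev.txt").read()                       # held-out dev text
ids = tokenizer(text, return_tensors="pt").input_ids.to(model.device)

window = stride = 2048                              # non-overlapping windows
nlls, n_tokens = [], 0
with torch.no_grad():
    for start in range(0, ids.size(1) - 1, stride):
        chunk = ids[:, start : start + window]
        if chunk.size(1) < 2:
            break
        out = model(chunk, labels=chunk)            # mean NLL over the chunk
        nlls.append(out.loss * (chunk.size(1) - 1))
        n_tokens += chunk.size(1) - 1

ppl = torch.exp(torch.stack(nlls).sum() / n_tokens)
print(f"perplexity: {ppl.item():.3f}")
```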

Prompt Format:

SYSTEM: <ANY SYSTEM CONTEXT>
USER: 
ASSISTANT:
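As an illustration only, a small helper like the one below can assemble prompts in this format; the function name and default system message are my own choices, not part of the model card.

```python
# Illustrative helper for the SYSTEM/USER/ASSISTANT format above.
# The function name and default system message are arbitrary choices.
def build_tess_prompt(user_message: str,
                      system_context: str = "You are a helpful assistant.") -> str:
    return (
        f"SYSTEM: {system_context}\n"
        f"USER: {user_message}\n"
        f"ASSISTANT:"
    )

print(build_tess_prompt("Give a one-sentence summary of 2-bit quantization."))
```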


The QuIP# library was updated while I was running the long Hessian calculations, which is why I have included the version of the library I used in this repo. I am able to use this model in the widely known textgen-webui. For installation I suggest the following steps:

  1. Download the quip folder from this repo and place it inside the repositories folder of the textgen-webui installation.
  2. Install the requirements of QuIP#.
  3. Compile and install the quiptools CUDA library (a quick import check is sketched after this list):

         pip install fast-hadamard-transform glog==0.3.1 primefac==2.0.12
         cd repositories/quip-sharp/quiptools
         python setup.py install --force

  4. Reinstall the requirements of textgen-webui.
  5. Load the model with the QuIP# integration of textgen-webui.
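To check that step 3 worked, a minimal import test like the following can help; note that the extension module name quiptools_cuda is an assumption on my part and may differ between versions of the library.

```python
# Minimal sanity check after building quiptools (illustrative only).
# The module name quiptools_cuda is an assumption and may vary by version.
import torch
import fast_hadamard_transform  # noqa: F401
import quiptools_cuda           # noqa: F401

print("CUDA available:", torch.cuda.is_available())
print("quiptools CUDA extension imported")
```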

You can also use the library from this repo in your own scripts. Within the QuIP# folder, after installing the library, use this command:

python interactive_gen.py --hf_path path_to_the_2bitmodel --max_length 500
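If you prefer driving the same script from Python instead of the shell, a thin subprocess wrapper works; the model path below is a placeholder, and the working directory assumes the folder layout from the installation steps above.

```python
# Illustrative wrapper around the interactive_gen.py call shown above.
# "path_to_the_2bitmodel" is a placeholder; cwd assumes the layout from
# the installation steps (quip-sharp inside textgen-webui's repositories/).
import subprocess

subprocess.run(
    [
        "python", "interactive_gen.py",
        "--hf_path", "path_to_the_2bitmodel",
        "--max_length", "500",
    ],
    cwd="repositories/quip-sharp",
    check=True,
)
```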

License: The Yi series models are fully open for academic research and free commercial usage with permission via applications. All usage must adhere to the Model License Agreement 2.0. To apply for the official commercial license, please contact us ([email protected]).