---
language: en
license: apache-2.0
datasets:
- sst2
- glue
tags:
- openvino
---

## distilbert-base-uncased-finetuned-sst-2-english

[distilbert-base-uncased-finetuned-sst-2-english](https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english) quantized with NNCF PTQ and exported to OpenVINO IR.

**Model Description:** This model reaches an accuracy of 90.0 on the validation set. See [ov_config.json](./ov_config.json) for the quantization config.

## Usage example

To install the requirements for using the OpenVINO backend, do:

```
pip install git+https://github.com/huggingface/optimum-intel.git#egg=optimum-intel[openvino]
```

This installs all necessary dependencies, including Transformers and OpenVINO. *NOTE: Python 3.7-3.9 are supported. A virtualenv is recommended.*

You can use this model with the Transformers *pipeline*:

```python
from transformers import AutoTokenizer, pipeline
from optimum.intel.openvino import OVModelForSequenceClassification

model_id = "helenai/distilbert-base-uncased-finetuned-sst-2-english-ov-int8"
model = OVModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
cls_pipe = pipeline("text-classification", model=model, tokenizer=tokenizer)

text = "OpenVINO is awesome!"
outputs = cls_pipe(text)
print(outputs)
```

Example output:

```sh
[{'label': 'POSITIVE', 'score': 0.9998594522476196}]
```
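The pipeline returns one `{'label', 'score'}` dict per input text, so downstream code can consume it directly. A minimal sketch of post-processing that output (the result dict is copied from the example above; the 0.9 confidence threshold is an arbitrary choice for illustration):

```python
# Example pipeline output, copied from above.
outputs = [{"label": "POSITIVE", "score": 0.9998594522476196}]

# Keep only confident predictions (threshold chosen arbitrarily here).
confident = [o for o in outputs if o["score"] >= 0.9]
for o in confident:
    print(f"{o['label']}: {o['score']:.4f}")
```

When classifying a batch, pass a list of strings to the pipeline and the same post-processing applies to each returned dict.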