---
language: en
license: apache-2.0
datasets:
- sst2
- glue
tags:
- openvino
---

## distilbert-base-uncased-finetuned-sst-2-english

[distilbert-base-uncased-finetuned-sst-2-english](https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english), quantized with NNCF post-training quantization (PTQ) and exported to the OpenVINO IR format.

**Model Description:** This model reaches an accuracy of 90.0% on the SST-2 validation set. See [ov_config.json](./ov_config.json) for the quantization config.
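
For reference, below is a minimal sketch of how a comparable INT8 model can be produced with the `OVQuantizer` from optimum-intel. This is illustrative only, not the exact recipe used for this checkpoint: the authoritative settings are the ones in [ov_config.json](./ov_config.json), and values such as `num_samples` and `save_directory` are placeholders.

```python
from functools import partial

from transformers import AutoModelForSequenceClassification, AutoTokenizer
from optimum.intel.openvino import OVQuantizer

model_id = "distilbert-base-uncased-finetuned-sst-2-english"
model = AutoModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

def preprocess_fn(examples, tokenizer):
    return tokenizer(examples["sentence"], padding=True, truncation=True)

quantizer = OVQuantizer.from_pretrained(model)

# Build a small calibration set from the GLUE/SST-2 training split
calibration_dataset = quantizer.get_calibration_dataset(
    "glue",
    dataset_config_name="sst2",
    preprocess_function=partial(preprocess_fn, tokenizer=tokenizer),
    num_samples=300,  # placeholder; not necessarily what was used for this model
    dataset_split="train",
)

# Run NNCF post-training quantization and save the OpenVINO IR model
quantizer.quantize(calibration_dataset=calibration_dataset, save_directory="ov-int8")
```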

## Usage example

To install the requirements for the OpenVINO backend, run:

```sh
pip install "git+https://github.com/huggingface/optimum-intel.git#egg=optimum-intel[openvino]"
```

This installs all necessary dependencies, including Transformers and OpenVINO.

*NOTE: Python 3.7-3.9 are supported. A virtualenv is recommended.*

You can use this model with the Transformers *pipeline* API:

```python
from transformers import AutoTokenizer, pipeline
from optimum.intel.openvino import OVModelForSequenceClassification

model_id = "helenai/distilbert-base-uncased-finetuned-sst-2-english-ov-int8"
model = OVModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
cls_pipe = pipeline("text-classification", model=model, tokenizer=tokenizer)
text = "OpenVINO is awesome!"
outputs = cls_pipe(text)
print(outputs)
```

Example output:

```sh
[{'label': 'POSITIVE', 'score': 0.9998594522476196}]
```
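
The pipeline helper is optional. Below is a minimal sketch of calling the model directly with a tokenizer, assuming the standard Transformers sequence-classification output (the card itself only shows the pipeline route, so treat this as illustrative):

```python
import torch
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForSequenceClassification

model_id = "helenai/distilbert-base-uncased-finetuned-sst-2-english-ov-int8"
model = OVModelForSequenceClassification.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Tokenize, run the OpenVINO model, and convert logits to probabilities
inputs = tokenizer("OpenVINO is awesome!", return_tensors="pt")
logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)
label_id = int(probs.argmax(dim=-1))
print(model.config.id2label[label_id], float(probs[0, label_id]))
```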