Does the model take our data and use our data to improve itself or for some other purpose?
The model is downloaded directly to your local machine when you use it via Transformers, so no data is exchanged with any server.
After loading it with "model = .....from_pretrained("sentence-transformers/...")", you can even turn off your Wi-Fi connection and the model will still work, because you run it directly on your downstream task.
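For reference, here is a minimal sketch of that fully local workflow, using the sentence-transformers library with the model discussed in this thread:

```python
# Minimal local usage sketch: after the first download, this runs fully offline.
from sentence_transformers import SentenceTransformer

# Downloads the model weights to the local cache on first use.
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

# Encoding happens entirely on your machine; no inputs or outputs are sent anywhere.
embeddings = model.encode([
    "This sentence is embedded locally.",
    "No data leaves your machine.",
])
print(embeddings.shape)  # (2, 384) for this model
```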
However, if you use this model through an API instead, I can't say whether that service stores your data or not.
Cheers
@yuriyvnv is right!
You can indeed use this model via an API as well: https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2?inference_api=true
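If you go the hosted route, a rough sketch of an API call looks like the following. The endpoint URL and token handling here follow the feature-extraction example on the model card and are assumptions, not an exact specification; they may change over time:

```python
import requests

# Assumed Inference API endpoint for feature extraction, as shown on the model card.
API_URL = "https://api-inference.huggingface.co/pipeline/feature-extraction/sentence-transformers/all-MiniLM-L6-v2"
HEADERS = {"Authorization": "Bearer <your_hf_token>"}  # hypothetical placeholder token

response = requests.post(
    API_URL,
    headers=HEADERS,
    json={"inputs": ["Embed this sentence via the hosted API."]},
)
embeddings = response.json()  # list of embedding vectors
print(len(embeddings[0]))  # 384 for this model
```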
Regardless of whether you use the model locally or via the API, no input/output data will be tracked. When running locally, the only internet connection required is for downloading the model itself. When running via the API, the following code is run on our side: https://github.com/huggingface/api-inference-community/blob/main/docker_images/sentence_transformers/app/pipelines/feature_extraction.py
There is no additional tracking of the inputs/outputs; I believe we only keep track of the number of requests.
- Tom Aarsen