---
language: multilingual
pipeline_tag: zero-shot-classification
---

# DistilBERT base model (uncased)

This model is the uncased DistilBERT base model fine-tuned on the Multi-Genre Natural Language Inference (MNLI) corpus.
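
Because the model is fine-tuned on MNLI, it can be used for zero-shot classification with the `transformers` pipeline. The snippet below is a minimal sketch; `"<this-model-id>"` is a placeholder for this repository's id, and the example sequence and labels are illustrative only.

```python
from transformers import pipeline

# Load the zero-shot classification pipeline with this checkpoint.
# Replace "<this-model-id>" with the actual repository id of this model.
classifier = pipeline("zero-shot-classification", model="<this-model-id>")

sequence = "The new smartphone features an improved camera and longer battery life."
candidate_labels = ["technology", "sports", "politics"]

# The pipeline scores each candidate label against the sequence using the
# MNLI entailment head and returns labels sorted by score.
result = classifier(sequence, candidate_labels)
print(result["labels"][0], result["scores"][0])
```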