
SetFit with sentence-transformers/paraphrase-mpnet-base-v2

This is a SetFit model that can be used for Text Classification. It uses sentence-transformers/paraphrase-mpnet-base-v2 as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
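
As a rough illustration of these two phases, the sketch below maps them onto SetFit's Trainer API (a minimal sketch, not the exact script used to train this model; the tiny dataset reuses two examples from this card purely as placeholders):

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder few-shot dataset: a couple of labelled sentences.
train_dataset = Dataset.from_dict({
    "text": ["No words to express my gratitude to this hero.", "Sorry, really."],
    "label": [3, 7],
})

# The Trainer drives both phases: it first fine-tunes the Sentence Transformer
# body with contrastive pairs, then fits the LogisticRegression head on the
# resulting embeddings.
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
args = TrainingArguments(batch_size=16, num_epochs=3)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()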

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: sentence-transformers/paraphrase-mpnet-base-v2
  • Classification head: a LogisticRegression instance
  • Number of Classes: 8

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Label Examples
Label 1
  • '@Josh Collins "Ben 0" lmao don't forget the facts, Ben has more wins than that'
  • 'poop siht are the fake news'
  • 'Thank god these fire chiefs are being heard. People have no idea that they have been trying to meet up with the Prime Minister even before this bushfire crisis trying to alert the public of the devastating impacts of climate change.'

Label 3
  • 'Perfectly nailed by Ms.Zainab Sikander. Proud !'
  • "You're so sincere Dia about people's life."
  • 'No words to express my gratitude to this hero.'

Label 6
  • 'I accept that.'
  • '@Viji same here'
  • 'Facing same problem'

Label 5
  • "@Rhynni Yeah thanks for asking, Your profile picture actually caught my eyes, Where are you from if you wouldn't mind me asking?"
  • 'For what what did they do?'
  • 'Aditya Jagtap who?'

Label 2
  • 'Or the save the world were gonna die people .......... No !!! the police joined in'
  • 'No, I don't think I am missing the point at all. When they say "40% of people are obese" that's based on BMI, which is an inherently flawed measure by almost any standards. When you say "obesity is estimated to cost whatever," there's a lots of conflation of correlation and causation in that calculation. Diseases often correlated with obesity are not always caused by obesity. Either way, my point still stands. Weight should not be considered independently from all other measures of health, it's important to consider all the factors.'
  • "This is a scam under the guise of socialist action. Climate change is caused mainly by geothermal activity, hence can't be stopped."

Label 4

Label 0
  • 'Oh ... Following the same drama.'
  • '1st'
  • 'Breaking news: England just left the EU!'

Label 7
  • 'Oh no, I did not mean it that way, it was completely misunderstood what I was saying. Didnt mean to offend you, sorry!'
  • 'Sorry, really.'
  • "It's my fault, I shouldn't have done that, sorryyy!"

Evaluation

Metrics

Label    Metric
all      0.6947
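
The metric name is not stated in the card; assuming it is an accuracy-style score, a minimal sketch of recomputing it from a labelled evaluation split (the split below is a placeholder):

from datasets import Dataset
from setfit import SetFitModel

# Hypothetical evaluation split with "text" and "label" columns.
eval_dataset = Dataset.from_dict({
    "text": ["Sorry, really.", "Facing same problem"],
    "label": [7, 6],
})

model = SetFitModel.from_pretrained("CrisisNarratives/setfit-8classes-single_label")
preds = model.predict(eval_dataset["text"])

accuracy = sum(int(p) == y for p, y in zip(preds, eval_dataset["label"])) / len(preds)
print(f"accuracy: {accuracy:.4f}")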

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("CrisisNarratives/setfit-8classes-single_label")
# Run inference
preds = model("Im sorry.")
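
Inference also works on batches, and because the classification head is a LogisticRegression instance, class probabilities are available as well. A minimal, hedged sketch (the input texts are placeholders, and predict_proba is assumed to be exposed by this SetFit version):

# Predict labels for a batch of texts
preds = model(["Im sorry.", "Facing same problem"])

# Per-class probabilities from the LogisticRegression head
probs = model.predict_proba(["Im sorry."])
print(preds, probs)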

Training Details

Training Set Metrics

Training set    Min    Median     Max
Word count      1      25.3789    1681

Label    Training Sample Count
0        156
1        145
2        52
3        46
4        63
5        35
6        37
7        7
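
Statistics like these can be reproduced directly from a labelled dataset; a minimal sketch (the data below is a placeholder standing in for the training split):

from collections import Counter
import statistics

texts = ["No words to express my gratitude to this hero.", "Sorry, really."]
labels = [3, 7]

word_counts = [len(t.split()) for t in texts]
print(min(word_counts), statistics.median(word_counts), max(word_counts))
print(Counter(labels))  # training samples per label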

Training Hyperparameters

  • batch_size: (16, 16)
  • num_epochs: (3, 3)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 40
  • body_learning_rate: (1.752e-05, 1.752e-05)
  • head_learning_rate: 1.752e-05
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 30
  • eval_max_steps: -1
  • load_best_model_at_end: False
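
These names correspond to the fields of SetFit's TrainingArguments, so the run can be approximated with a configuration like the following (a hedged sketch, not necessarily the exact training script used for this card):

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(16, 16),                       # (embedding phase, classifier phase)
    num_epochs=(3, 3),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=40,
    body_learning_rate=(1.752e-05, 1.752e-05),
    head_learning_rate=1.752e-05,
    loss=CosineSimilarityLoss,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=30,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)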

Training Results

Epoch Step Training Loss Validation Loss
0.0004 1 0.4094 -
0.0185 50 0.3207 -
0.0370 100 0.2635 -
0.0555 150 0.2347 -
0.0739 200 0.2686 -
0.0924 250 0.2575 -
0.1109 300 0.1983 -
0.1294 350 0.2387 -
0.1479 400 0.2002 -
0.1664 450 0.2112 -
0.1848 500 0.0913 -
0.2033 550 0.1715 -
0.2218 600 0.0686 -
0.2403 650 0.0166 -
0.2588 700 0.0128 -
0.2773 750 0.0102 -
0.2957 800 0.0071 -
0.3142 850 0.0012 -
0.3327 900 0.0016 -
0.3512 950 0.0035 -
0.3697 1000 0.0012 -
0.3882 1050 0.0003 -
0.4067 1100 0.001 -
0.4251 1150 0.0025 -
0.4436 1200 0.001 -
0.4621 1250 0.0006 -
0.4806 1300 0.0006 -
0.4991 1350 0.0004 -
0.5176 1400 0.0012 -
0.5360 1450 0.0051 -
0.5545 1500 0.0009 -
0.5730 1550 0.0003 -
0.5915 1600 0.0004 -
0.6100 1650 0.0009 -
0.6285 1700 0.0002 -
0.6470 1750 0.0003 -
0.6654 1800 0.0005 -
0.6839 1850 0.0003 -
0.7024 1900 0.0003 -
0.7209 1950 0.0005 -
0.7394 2000 0.0004 -
0.7579 2050 0.0008 -
0.7763 2100 0.0009 -
0.7948 2150 0.0002 -
0.8133 2200 0.0002 -
0.8318 2250 0.0002 -
0.8503 2300 0.0008 -
0.8688 2350 0.0002 -
0.8872 2400 0.0002 -
0.9057 2450 0.0003 -
0.9242 2500 0.0013 -
0.9427 2550 0.0003 -
0.9612 2600 0.0002 -
0.9797 2650 0.0002 -
0.9982 2700 0.0003 -
1.0166 2750 0.0002 -
1.0351 2800 0.0008 -
1.0536 2850 0.0001 -
1.0721 2900 0.0004 -
1.0906 2950 0.0001 -
1.1091 3000 0.0001 -
1.1275 3050 0.0002 -
1.1460 3100 0.0002 -
1.1645 3150 0.0002 -
1.1830 3200 0.0001 -
1.2015 3250 0.0001 -
1.2200 3300 0.0001 -
1.2384 3350 0.0041 -
1.2569 3400 0.0002 -
1.2754 3450 0.0001 -
1.2939 3500 0.0001 -
1.3124 3550 0.0002 -
1.3309 3600 0.0 -
1.3494 3650 0.0001 -
1.3678 3700 0.0001 -
1.3863 3750 0.0002 -
1.4048 3800 0.0001 -
1.4233 3850 0.0 -
1.4418 3900 0.0001 -
1.4603 3950 0.0001 -
1.4787 4000 0.0001 -
1.4972 4050 0.0001 -
1.5157 4100 0.0001 -
1.5342 4150 0.0001 -
1.5527 4200 0.0001 -
1.5712 4250 0.0001 -
1.5896 4300 0.0001 -
1.6081 4350 0.0 -
1.6266 4400 0.0001 -
1.6451 4450 0.0019 -
1.6636 4500 0.0001 -
1.6821 4550 0.0003 -
1.7006 4600 0.0002 -
1.7190 4650 0.0001 -
1.7375 4700 0.0001 -
1.7560 4750 0.0002 -
1.7745 4800 0.0001 -
1.7930 4850 0.0001 -
1.8115 4900 0.0003 -
1.8299 4950 0.056 -
1.8484 5000 0.0001 -
1.8669 5050 0.0001 -
1.8854 5100 0.0001 -
1.9039 5150 0.0001 -
1.9224 5200 0.0 -
1.9409 5250 0.0001 -
1.9593 5300 0.0001 -
1.9778 5350 0.0001 -
1.9963 5400 0.0002 -
2.0148 5450 0.0 -
2.0333 5500 0.0001 -
2.0518 5550 0.0 -
2.0702 5600 0.0004 -
2.0887 5650 0.0001 -
2.1072 5700 0.0001 -
2.1257 5750 0.0001 -
2.1442 5800 0.0001 -
2.1627 5850 0.0001 -
2.1811 5900 0.0 -
2.1996 5950 0.0001 -
2.2181 6000 0.0001 -
2.2366 6050 0.0001 -
2.2551 6100 0.0001 -
2.2736 6150 0.0001 -
2.2921 6200 0.0 -
2.3105 6250 0.0001 -
2.3290 6300 0.0 -
2.3475 6350 0.0001 -
2.3660 6400 0.0001 -
2.3845 6450 0.0001 -
2.4030 6500 0.0 -
2.4214 6550 0.0001 -
2.4399 6600 0.0001 -
2.4584 6650 0.0 -
2.4769 6700 0.0 -
2.4954 6750 0.0002 -
2.5139 6800 0.0001 -
2.5323 6850 0.0001 -
2.5508 6900 0.0001 -
2.5693 6950 0.0001 -
2.5878 7000 0.0 -
2.6063 7050 0.0001 -
2.6248 7100 0.0001 -
2.6433 7150 0.0001 -
2.6617 7200 0.0001 -
2.6802 7250 0.0001 -
2.6987 7300 0.0003 -
2.7172 7350 0.0001 -
2.7357 7400 0.0 -
2.7542 7450 0.0 -
2.7726 7500 0.0 -
2.7911 7550 0.0001 -
2.8096 7600 0.0001 -
2.8281 7650 0.0001 -
2.8466 7700 0.0001 -
2.8651 7750 0.0001 -
2.8835 7800 0.0001 -
2.9020 7850 0.0001 -
2.9205 7900 0.0002 -
2.9390 7950 0.0001 -
2.9575 8000 0.0 -
2.9760 8050 0.0 -
2.9945 8100 0.0001 -
0.0004 1 0.0001 -
0.0185 50 0.0001 -
0.0370 100 0.0001 -
0.0555 150 0.0001 -
0.0739 200 0.0001 -
0.0924 250 0.0001 -
0.1109 300 0.0001 -
0.1294 350 0.0001 -
0.1479 400 0.0001 -
0.1664 450 0.0005 -
0.1848 500 0.0007 -
0.2033 550 0.0003 -
0.2218 600 0.0003 -
0.2403 650 0.0 -
0.2588 700 0.0001 -
0.2773 750 0.0001 -
0.2957 800 0.0002 -
0.3142 850 0.0 -
0.3327 900 0.0001 -
0.3512 950 0.0044 -
0.3697 1000 0.0001 -
0.3882 1050 0.0004 -
0.4067 1100 0.0006 -
0.4251 1150 0.0012 -
0.4436 1200 0.0002 -
0.4621 1250 0.0001 -
0.4806 1300 0.0 -
0.4991 1350 0.0001 -
0.5176 1400 0.0003 -
0.5360 1450 0.0001 -
0.5545 1500 0.0001 -
0.5730 1550 0.0002 -
0.5915 1600 0.0001 -
0.6100 1650 0.0002 -
0.6285 1700 0.0 -
0.6470 1750 0.0001 -
0.6654 1800 0.0001 -
0.6839 1850 0.0001 -
0.7024 1900 0.0001 -
0.7209 1950 0.0017 -
0.7394 2000 0.0001 -
0.7579 2050 0.0002 -
0.7763 2100 0.0002 -
0.7948 2150 0.0003 -
0.8133 2200 0.0001 -
0.8318 2250 0.0001 -
0.8503 2300 0.0002 -
0.8688 2350 0.0 -
0.8872 2400 0.0001 -
0.9057 2450 0.0001 -
0.9242 2500 0.0002 -
0.9427 2550 0.0001 -
0.9612 2600 0.0 -
0.9797 2650 0.0 -
0.9982 2700 0.0001 -
1.0166 2750 0.0001 -
1.0351 2800 0.0001 -
1.0536 2850 0.0 -
1.0721 2900 0.0 -
1.0906 2950 0.0001 -
1.1091 3000 0.0 -
1.1275 3050 0.0001 -
1.1460 3100 0.0001 -
1.1645 3150 0.0 -
1.1830 3200 0.0 -
1.2015 3250 0.0 -
1.2200 3300 0.0 -
1.2384 3350 0.0002 -
1.2569 3400 0.0001 -
1.2754 3450 0.0 -
1.2939 3500 0.0001 -
1.3124 3550 0.0001 -
1.3309 3600 0.0 -
1.3494 3650 0.0 -
1.3678 3700 0.0 -
1.3863 3750 0.0001 -
1.4048 3800 0.0 -
1.4233 3850 0.0 -
1.4418 3900 0.0 -
1.4603 3950 0.0 -
1.4787 4000 0.0001 -
1.4972 4050 0.0 -
1.5157 4100 0.0 -
1.5342 4150 0.0 -
1.5527 4200 0.0001 -
1.5712 4250 0.0001 -
1.5896 4300 0.0 -
1.6081 4350 0.0 -
1.6266 4400 0.0001 -
1.6451 4450 0.0 -
1.6636 4500 0.0001 -
1.6821 4550 0.0001 -
1.7006 4600 0.0001 -
1.7190 4650 0.0 -
1.7375 4700 0.0 -
1.7560 4750 0.0 -
1.7745 4800 0.0 -
1.7930 4850 0.0001 -
1.8115 4900 0.0001 -
1.8299 4950 0.0 -
1.8484 5000 0.0001 -
1.8669 5050 0.0 -
1.8854 5100 0.0 -
1.9039 5150 0.0 -
1.9224 5200 0.0 -
1.9409 5250 0.0 -
1.9593 5300 0.0 -
1.9778 5350 0.0 -
1.9963 5400 0.0 -
2.0148 5450 0.0 -
2.0333 5500 0.0 -
2.0518 5550 0.0 -
2.0702 5600 0.0001 -
2.0887 5650 0.0 -
2.1072 5700 0.0 -
2.1257 5750 0.0 -
2.1442 5800 0.0 -
2.1627 5850 0.0001 -
2.1811 5900 0.0 -
2.1996 5950 0.0 -
2.2181 6000 0.0 -
2.2366 6050 0.0 -
2.2551 6100 0.0 -
2.2736 6150 0.0001 -
2.2921 6200 0.0 -
2.3105 6250 0.0 -
2.3290 6300 0.0 -
2.3475 6350 0.0 -
2.3660 6400 0.0 -
2.3845 6450 0.0 -
2.4030 6500 0.0 -
2.4214 6550 0.0 -
2.4399 6600 0.0 -
2.4584 6650 0.0 -
2.4769 6700 0.0 -
2.4954 6750 0.0001 -
2.5139 6800 0.0001 -
2.5323 6850 0.0 -
2.5508 6900 0.0 -
2.5693 6950 0.0 -
2.5878 7000 0.0 -
2.6063 7050 0.0 -
2.6248 7100 0.0 -
2.6433 7150 0.0001 -
2.6617 7200 0.0 -
2.6802 7250 0.0 -
2.6987 7300 0.0001 -
2.7172 7350 0.0 -
2.7357 7400 0.0 -
2.7542 7450 0.0 -
2.7726 7500 0.0 -
2.7911 7550 0.0 -
2.8096 7600 0.0 -
2.8281 7650 0.0 -
2.8466 7700 0.0001 -
2.8651 7750 0.0 -
2.8835 7800 0.0001 -
2.9020 7850 0.0 -
2.9205 7900 0.0001 -
2.9390 7950 0.0001 -
2.9575 8000 0.0 -
2.9760 8050 0.0 -
2.9945 8100 0.0 -
0.0004 1 0.0 -
0.0185 50 0.0 -
0.0370 100 0.0 -
0.0555 150 0.0 -
0.0739 200 0.0 -
0.0924 250 0.0 -
0.1109 300 0.0 -
0.1294 350 0.0005 -
0.1479 400 0.0002 -
0.1664 450 0.0001 -
0.1848 500 0.0009 -
0.2033 550 0.1068 -
0.2218 600 0.0 -
0.2403 650 0.0 -
0.2588 700 0.0 -
0.2773 750 0.0374 -
0.2957 800 0.0001 -
0.3142 850 0.0 -
0.3327 900 0.0 -
0.3512 950 0.0 -
0.3697 1000 0.0001 -
0.3882 1050 0.0 -
0.4067 1100 0.0001 -
0.4251 1150 0.0002 -
0.4436 1200 0.0001 -
0.4621 1250 0.0012 -
0.4806 1300 0.0 -
0.4991 1350 0.0001 -
0.5176 1400 0.0001 -
0.5360 1450 0.0 -
0.5545 1500 0.0001 -
0.5730 1550 0.0 -
0.5915 1600 0.0267 -
0.6100 1650 0.0001 -
0.6285 1700 0.0 -
0.6470 1750 0.0 -
0.6654 1800 0.0 -
0.6839 1850 0.0 -
0.7024 1900 0.0 -
0.7209 1950 0.0 -
0.7394 2000 0.0 -
0.7579 2050 0.0001 -
0.7763 2100 0.0 -
0.7948 2150 0.0001 -
0.8133 2200 0.0001 -
0.8318 2250 0.0 -
0.8503 2300 0.0001 -
0.8688 2350 0.1116 -
0.8872 2400 0.0042 -
0.9057 2450 0.0001 -
0.9242 2500 0.0006 -
0.9427 2550 0.0 -
0.9612 2600 0.0615 -
0.9797 2650 0.0002 -
0.9982 2700 0.0 -
1.0166 2750 0.0003 -
1.0351 2800 0.0001 -
1.0536 2850 0.0 -
1.0721 2900 0.0 -
1.0906 2950 0.0 -
1.1091 3000 0.0 -
1.1275 3050 0.0001 -
1.1460 3100 0.0 -
1.1645 3150 0.0 -
1.1830 3200 0.0 -
1.2015 3250 0.0 -
1.2200 3300 0.0 -
1.2384 3350 0.0 -
1.2569 3400 0.0 -
1.2754 3450 0.0 -
1.2939 3500 0.0 -
1.3124 3550 0.0 -
1.3309 3600 0.0 -
1.3494 3650 0.0 -
1.3678 3700 0.0 -
1.3863 3750 0.0 -
1.4048 3800 0.0003 -
1.4233 3850 0.0 -
1.4418 3900 0.0001 -
1.4603 3950 0.0 -
1.4787 4000 0.0001 -
1.4972 4050 0.0 -
1.5157 4100 0.0 -
1.5342 4150 0.0 -
1.5527 4200 0.0 -
1.5712 4250 0.0 -
1.5896 4300 0.0 -
1.6081 4350 0.0 -
1.6266 4400 0.0 -
1.6451 4450 0.0 -
1.6636 4500 0.0 -
1.6821 4550 0.0001 -
1.7006 4600 0.0 -
1.7190 4650 0.0 -
1.7375 4700 0.0 -
1.7560 4750 0.0 -
1.7745 4800 0.0 -
1.7930 4850 0.0 -
1.8115 4900 0.0 -
1.8299 4950 0.0 -
1.8484 5000 0.0 -
1.8669 5050 0.0 -
1.8854 5100 0.0 -
1.9039 5150 0.0 -
1.9224 5200 0.0 -
1.9409 5250 0.0 -
1.9593 5300 0.0 -
1.9778 5350 0.0 -
1.9963 5400 0.0 -
2.0148 5450 0.0 -
2.0333 5500 0.0 -
2.0518 5550 0.0 -
2.0702 5600 0.0001 -
2.0887 5650 0.0 -
2.1072 5700 0.0 -
2.1257 5750 0.0 -
2.1442 5800 0.0001 -
2.1627 5850 0.0 -
2.1811 5900 0.0 -
2.1996 5950 0.0 -
2.2181 6000 0.0 -
2.2366 6050 0.0 -
2.2551 6100 0.0 -
2.2736 6150 0.0 -
2.2921 6200 0.0 -
2.3105 6250 0.0 -
2.3290 6300 0.0 -
2.3475 6350 0.0 -
2.3660 6400 0.0 -
2.3845 6450 0.0 -
2.4030 6500 0.0 -
2.4214 6550 0.0 -
2.4399 6600 0.0 -
2.4584 6650 0.0 -
2.4769 6700 0.0 -
2.4954 6750 0.0 -
2.5139 6800 0.0001 -
2.5323 6850 0.0 -
2.5508 6900 0.0 -
2.5693 6950 0.0 -
2.5878 7000 0.0 -
2.6063 7050 0.0 -
2.6248 7100 0.0 -
2.6433 7150 0.0 -
2.6617 7200 0.0 -
2.6802 7250 0.0 -
2.6987 7300 0.0 -
2.7172 7350 0.0 -
2.7357 7400 0.0 -
2.7542 7450 0.0 -
2.7726 7500 0.0 -
2.7911 7550 0.0 -
2.8096 7600 0.0 -
2.8281 7650 0.0 -
2.8466 7700 0.0 -
2.8651 7750 0.0 -
2.8835 7800 0.0 -
2.9020 7850 0.0 -
2.9205 7900 0.0 -
2.9390 7950 0.0 -
2.9575 8000 0.0 -
2.9760 8050 0.0 -
2.9945 8100 0.0 -

Framework Versions

  • Python: 3.9.16
  • SetFit: 1.0.1
  • Sentence Transformers: 2.2.2
  • Transformers: 4.35.0
  • PyTorch: 2.1.0+cu121
  • Datasets: 2.14.6
  • Tokenizers: 0.14.1

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}