---
base_model: microsoft/deberta-v3-small
datasets:
- jinaai/negation-dataset-v2
- tals/vitaminc
- allenai/scitail
- allenai/sciq
- allenai/qasc
- sentence-transformers/msmarco-msmarco-distilbert-base-v3
- sentence-transformers/natural-questions
- sentence-transformers/trivia-qa
- sentence-transformers/gooaq
- google-research-datasets/paws
language:
- en
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
- cosine_accuracy
- dot_accuracy
- manhattan_accuracy
- euclidean_accuracy
- max_accuracy
- cosine_accuracy_threshold
- cosine_f1
- cosine_f1_threshold
- cosine_precision
- cosine_recall
- cosine_ap
- dot_accuracy_threshold
- dot_f1
- dot_f1_threshold
- dot_precision
- dot_recall
- dot_ap
- manhattan_accuracy_threshold
- manhattan_f1
- manhattan_f1_threshold
- manhattan_precision
- manhattan_recall
- manhattan_ap
- euclidean_accuracy_threshold
- euclidean_f1
- euclidean_f1_threshold
- euclidean_precision
- euclidean_recall
- euclidean_ap
- max_accuracy_threshold
- max_f1
- max_f1_threshold
- max_precision
- max_recall
- max_ap
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:36500
- loss:CachedGISTEmbedLoss
widget:
- source_sentence: what are brake drums made of
  sentences:
  - Stereotactic radiosurgery (SRS) is a non-surgical radiation therapy used to treat
    functional abnormalities and small tumors of the brain.It can deliver precisely-targeted
    radiation in fewer high-dose treatments than traditional therapy, which can help
    preserve healthy tissue.tereotactic radiosurgery (SRS) is a non-surgical radiation
    therapy used to treat functional abnormalities and small tumors of the brain.
  - Human rights in Germany - Law. 1  The constitution of Germany, the Grundgesetz,
    which came into effect in May 8, 1949, puts a particular emphasis on human rights.
    Its first sentence, Human dignity is inviolable, is being interpreted as protecting
    the sum of human rights. This paragraph is protected by an eternity clause and
    cannot be changed.
  - The brake drum mounts on the axle or wheel hub, and some drums incorporate the
    hub. Most brake drums are made of solid cast iron, but there are also steel and
    aluminum drums with cast iron liners. The machined friction surface on all drums
    is cast iron.
- source_sentence: More than 169 countries had reported over 212,000 COVID-19 cases
    before March 19 , 2020 .
  sentences:
  - As of 23 March , more than 341,000 cases of COVID-19 have been reported in 192
    countries and territories , resulting in more than 14,700 deaths and 99,000 recoveries
    .
  - As of 21 March , more than 278,000 cases of COVID-19 have been reported in over
    186 countries and territories , resulting in more than 11,500 deaths and 92,000
    recoveries.  virus seems to mostly spread between people via respiratory droplets
    .
  - As of 18 March 2020 , more than 212,000 cases of COVID-19 have been reported in
    at least 170 countries and territories , with major outbreaks in China , Iran
    and the European Union .
- source_sentence: 'The images were captured on the morning of 14 November by CCTV
    cameras at a French petrol station, a day after the attacks in which 130 were
    killed.

    In them, Salah Abdeslam seems relaxed, walking with his hands in his pockets.

    He is thought to have been in charge of logistics for the groups of gunmen who
    carried out the attacks.

    Salah Abdeslam is said to have called his two friends, Mohammed Amri and Salah
    Hamza Attou, from Paris early on 14 November to come and pick him up and take
    him to Belgium.

    En route from Paris to Brussels, the three men stopped at a petrol station near
    the Belgian border for about 15 minutes, where a CCTV camera filmed them, BFM
    reports.

    At that point, the three men had already been through three police checks, but
    had not been stopped as Salah Abdeslam had not yet been connected to the Paris
    attacks.

    Mohammed Amri and Salah Hamza Attou later dropped off Salah Abdeslam in the district
    of Laeken in Brussels.

    The two were arrested in Molenbeek the next day and face terror charges, while
    Salah Abdeslam is still on the run.

    Who were the Paris attackers?

    Paris attacks: The investigation so far

    Paris attacks: Who were the victims?

    Paris attacks: What happened on the night

    The Paris attacks are believed to have been at least partly planned in Brussels.
    Belgian police have arrested 10 people as part of their investigation.

    The suspected ringleader was Abdelhamid Abaaoud, a Belgian national. He and his
    cousin Hasna Aitboulahcen died in a fierce gun battle five days after the attacks,
    when police raided a flat in Paris where they were hiding, heavily armed.'
  sentences:
  - The first images of the fugitive Paris attacks suspect Salah Abdeslam are said
    to have emerged, according to French news channel BFM TV.
  - Excess Army food supplies should be given to the "army of the homeless", a senior
    MP says.
  - Head coach Philippe Montanier has said Nottingham Forest's second-half display
    against Derby County was poor in so many areas of the pitch.
- source_sentence: Electrical energy can be converted into kinetic energy and heat
    energy by an electric motor.
  sentences:
  - Solution is the term for a homogeneous mixture of two or more substances.
  - Solution is the term for a homogeneous mixture of two or more substances.
  - Electric motors transform electrical energy into kinetic energy.
- source_sentence: who plays the predator in the movie predator
  sentences:
  - Kevin Peter Hall Kevin Peter Hall (May 9, 1955  April 10, 1991) was an American
    actor best known for his roles as the title character in the first two films in
    the Predator franchise and the title character of Harry in the film and television
    series, Harry and the Hendersons. He also appeared in the television series Misfits
    of Science and 227, along with the film Without Warning.
  - The Secret Daughter The Secret Daughter is an Australian television drama series
    which premiered on the Seven Network on 3 October 2016.[1] The series is written
    by Justin Monjo, Greg Haddrick, Louise Bowes and Keith Thompson and directed by
    Leah Purcell, Geoff Bennett and Paul Moloney. The drama centres around part-time
    country pub singer Billie Carter (Jessica Mauboy), who has a chance meeting with
    a wealthy city hotelier and rediscovers information about her family and history.
    The second season premiered on 8 November 2017.[2]
  - The Hunchback of Notre-Dame The story is set in Paris in 1482 during the reign
    of Louis XI. The gypsy Esmeralda (born as Agnes) captures the hearts of many men,
    including those of Captain Phoebus and Pierre Gringoire, but especially Quasimodo
    and his guardian Archdeacon Claude Frollo. Frollo is torn between his obsessive
    lust for Esmeralda and the rules of Notre Dame Cathedral. He orders Quasimodo
    to kidnap her, but Quasimodo is captured by Phoebus and his guards, who save Esmeralda.
    Gringoire, who attempted to help Esmeralda but was knocked out by Quasimodo, is
    about to be hanged by beggars when Esmeralda saves him by agreeing to marry him
    for four years.
model-index:
- name: SentenceTransformer based on microsoft/deberta-v3-small
  results:
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts test
      type: sts-test
    metrics:
    - type: pearson_cosine
      value: 0.1771905788413257
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.2225047095682771
      name: Spearman Cosine
    - type: pearson_manhattan
      value: 0.18333784202455092
      name: Pearson Manhattan
    - type: spearman_manhattan
      value: 0.20700380700256382
      name: Spearman Manhattan
    - type: pearson_euclidean
      value: 0.17465663546843996
      name: Pearson Euclidean
    - type: spearman_euclidean
      value: 0.19950882182232507
      name: Spearman Euclidean
    - type: pearson_dot
      value: 0.27769356372160947
      name: Pearson Dot
    - type: spearman_dot
      value: 0.30350268106992373
      name: Spearman Dot
    - type: pearson_max
      value: 0.27769356372160947
      name: Pearson Max
    - type: spearman_max
      value: 0.30350268106992373
      name: Spearman Max
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: NLI v2
      type: NLI-v2
    metrics:
    - type: cosine_accuracy
      value: 1.0
      name: Cosine Accuracy
    - type: dot_accuracy
      value: 0.078125
      name: Dot Accuracy
    - type: manhattan_accuracy
      value: 1.0
      name: Manhattan Accuracy
    - type: euclidean_accuracy
      value: 1.0
      name: Euclidean Accuracy
    - type: max_accuracy
      value: 1.0
      name: Max Accuracy
  - task:
      type: binary-classification
      name: Binary Classification
    dataset:
      name: VitaminC
      type: VitaminC
    metrics:
    - type: cosine_accuracy
      value: 0.55078125
      name: Cosine Accuracy
    - type: cosine_accuracy_threshold
      value: 0.9663318395614624
      name: Cosine Accuracy Threshold
    - type: cosine_f1
      value: 0.648936170212766
      name: Cosine F1
    - type: cosine_f1_threshold
      value: 0.8216172456741333
      name: Cosine F1 Threshold
    - type: cosine_precision
      value: 0.48221343873517786
      name: Cosine Precision
    - type: cosine_recall
      value: 0.991869918699187
      name: Cosine Recall
    - type: cosine_ap
      value: 0.5331049287016129
      name: Cosine Ap
    - type: dot_accuracy
      value: 0.5546875
      name: Dot Accuracy
    - type: dot_accuracy_threshold
      value: 468.02716064453125
      name: Dot Accuracy Threshold
    - type: dot_f1
      value: 0.6507936507936508
      name: Dot F1
    - type: dot_f1_threshold
      value: 358.44915771484375
      name: Dot F1 Threshold
    - type: dot_precision
      value: 0.4823529411764706
      name: Dot Precision
    - type: dot_recall
      value: 1.0
      name: Dot Recall
    - type: dot_ap
      value: 0.5202948869489447
      name: Dot Ap
    - type: manhattan_accuracy
      value: 0.54296875
      name: Manhattan Accuracy
    - type: manhattan_accuracy_threshold
      value: 112.2923355102539
      name: Manhattan Accuracy Threshold
    - type: manhattan_f1
      value: 0.6540540540540541
      name: Manhattan F1
    - type: manhattan_f1_threshold
      value: 210.52694702148438
      name: Manhattan F1 Threshold
    - type: manhattan_precision
      value: 0.4898785425101215
      name: Manhattan Precision
    - type: manhattan_recall
      value: 0.983739837398374
      name: Manhattan Recall
    - type: manhattan_ap
      value: 0.5237821619629712
      name: Manhattan Ap
    - type: euclidean_accuracy
      value: 0.546875
      name: Euclidean Accuracy
    - type: euclidean_accuracy_threshold
      value: 5.734438896179199
      name: Euclidean Accuracy Threshold
    - type: euclidean_f1
      value: 0.6507936507936508
      name: Euclidean F1
    - type: euclidean_f1_threshold
      value: 13.58138656616211
      name: Euclidean F1 Threshold
    - type: euclidean_precision
      value: 0.4823529411764706
      name: Euclidean Precision
    - type: euclidean_recall
      value: 1.0
      name: Euclidean Recall
    - type: euclidean_ap
      value: 0.5282344411865632
      name: Euclidean Ap
    - type: max_accuracy
      value: 0.5546875
      name: Max Accuracy
    - type: max_accuracy_threshold
      value: 468.02716064453125
      name: Max Accuracy Threshold
    - type: max_f1
      value: 0.6540540540540541
      name: Max F1
    - type: max_f1_threshold
      value: 358.44915771484375
      name: Max F1 Threshold
    - type: max_precision
      value: 0.4898785425101215
      name: Max Precision
    - type: max_recall
      value: 1.0
      name: Max Recall
    - type: max_ap
      value: 0.5331049287016129
      name: Max Ap
---

# SentenceTransformer based on microsoft/deberta-v3-small

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on the [negation-triplets](https://huggingface.co/datasets/jinaai/negation-dataset-v2), [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc), [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail), [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail), xsum-pairs, [sciq_pairs](https://huggingface.co/datasets/allenai/sciq), [qasc_pairs](https://huggingface.co/datasets/allenai/qasc), openbookqa_pairs, [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3), [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions), [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa), [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) and [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) <!-- at revision a36c739020e01763fe789b4b85e2df55d6180012 -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
    - [negation-triplets](https://huggingface.co/datasets/jinaai/negation-dataset-v2)
    - [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc)
    - [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail)
    - [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail)
    - xsum-pairs
    - [sciq_pairs](https://huggingface.co/datasets/allenai/sciq)
    - [qasc_pairs](https://huggingface.co/datasets/allenai/qasc)
    - openbookqa_pairs
    - [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3)
    - [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions)
    - [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa)
    - [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq)
    - [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws)
- **Language:** en
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
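
The `Pooling` module above averages token embeddings (`pooling_mode_mean_tokens: True`), so each sentence embedding is the attention-masked mean of the transformer's token outputs. A minimal sketch of that step (the function and tensor names here are illustrative, not part of the library API):

```python
import torch

def mean_pooling(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Masked mean over the sequence dimension.

    token_embeddings: (batch, seq_len, 768) output of DebertaV2Model
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask.unsqueeze(-1).float()    # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(dim=1)  # (batch, 768)
    counts = mask.sum(dim=1).clamp(min=1e-9)       # avoid division by zero
    return summed / counts                         # (batch, 768)
```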

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp")
# Run inference
sentences = [
    'who plays the predator in the movie predator',
    'Kevin Peter Hall Kevin Peter Hall (May 9, 1955\xa0– April 10, 1991) was an American actor best known for his roles as the title character in the first two films in the Predator franchise and the title character of Harry in the film and television series, Harry and the Hendersons. He also appeared in the television series Misfits of Science and 227, along with the film Without Warning.',
    'The Hunchback of Notre-Dame The story is set in Paris in 1482 during the reign of Louis XI. The gypsy Esmeralda (born as Agnes) captures the hearts of many men, including those of Captain Phoebus and Pierre Gringoire, but especially Quasimodo and his guardian Archdeacon Claude Frollo. Frollo is torn between his obsessive lust for Esmeralda and the rules of Notre Dame Cathedral. He orders Quasimodo to kidnap her, but Quasimodo is captured by Phoebus and his guards, who save Esmeralda. Gringoire, who attempted to help Esmeralda but was knocked out by Quasimodo, is about to be hanged by beggars when Esmeralda saves him by agreeing to marry him for four years.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
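
Beyond pairwise similarity, the same embeddings can drive semantic search. A short sketch using the library's `util.semantic_search` helper; the corpus and query below are illustrative:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp")

corpus = [
    "Most brake drums are made of solid cast iron.",
    "Electric motors transform electrical energy into kinetic energy.",
]
query = "what are brake drums made of"

# Encode once, then rank corpus entries by cosine similarity to the query
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)
for hit in hits[0]:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")
```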

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Semantic Similarity
* Dataset: `sts-test`
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| pearson_cosine      | 0.1772     |
| **spearman_cosine** | **0.2225** |
| pearson_manhattan   | 0.1833     |
| spearman_manhattan  | 0.207      |
| pearson_euclidean   | 0.1747     |
| spearman_euclidean  | 0.1995     |
| pearson_dot         | 0.2777     |
| spearman_dot        | 0.3035     |
| pearson_max         | 0.2777     |
| spearman_max        | 0.3035     |

#### Triplet
* Dataset: `NLI-v2`
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric             | Value   |
|:-------------------|:--------|
| cosine_accuracy    | 1.0     |
| dot_accuracy       | 0.0781  |
| manhattan_accuracy | 1.0     |
| euclidean_accuracy | 1.0     |
| **max_accuracy**   | **1.0** |

#### Binary Classification
* Dataset: `VitaminC`
* Evaluated with [<code>BinaryClassificationEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)

| Metric                       | Value      |
|:-----------------------------|:-----------|
| cosine_accuracy              | 0.5508     |
| cosine_accuracy_threshold    | 0.9663     |
| cosine_f1                    | 0.6489     |
| cosine_f1_threshold          | 0.8216     |
| cosine_precision             | 0.4822     |
| cosine_recall                | 0.9919     |
| cosine_ap                    | 0.5331     |
| dot_accuracy                 | 0.5547     |
| dot_accuracy_threshold       | 468.0272   |
| dot_f1                       | 0.6508     |
| dot_f1_threshold             | 358.4492   |
| dot_precision                | 0.4824     |
| dot_recall                   | 1.0        |
| dot_ap                       | 0.5203     |
| manhattan_accuracy           | 0.543      |
| manhattan_accuracy_threshold | 112.2923   |
| manhattan_f1                 | 0.6541     |
| manhattan_f1_threshold       | 210.5269   |
| manhattan_precision          | 0.4899     |
| manhattan_recall             | 0.9837     |
| manhattan_ap                 | 0.5238     |
| euclidean_accuracy           | 0.5469     |
| euclidean_accuracy_threshold | 5.7344     |
| euclidean_f1                 | 0.6508     |
| euclidean_f1_threshold       | 13.5814    |
| euclidean_precision          | 0.4824     |
| euclidean_recall             | 1.0        |
| euclidean_ap                 | 0.5282     |
| max_accuracy                 | 0.5547     |
| max_accuracy_threshold       | 468.0272   |
| max_f1                       | 0.6541     |
| max_f1_threshold             | 358.4492   |
| max_precision                | 0.4899     |
| max_recall                   | 1.0        |
| **max_ap**                   | **0.5331** |
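
These tables come from the standard sentence-transformers evaluators linked above. A sketch of re-running one of them; the sentence pairs and gold scores below are illustrative, since the evaluation data itself is not included in this card:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp")

# Illustrative pairs with gold similarity scores in [0, 1]
sentences1 = ["A man is playing a guitar.", "A woman is cooking."]
sentences2 = ["Someone plays an instrument.", "A plane is taking off."]
scores = [0.8, 0.05]

evaluator = EmbeddingSimilarityEvaluator(sentences1, sentences2, scores, name="sts-test")
print(evaluator(model))  # Pearson/Spearman correlations across similarity functions
```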

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Datasets

#### negation-triplets

* Dataset: [negation-triplets](https://huggingface.co/datasets/jinaai/negation-dataset-v2)
* Size: 3,250 training samples
* Columns: <code>anchor</code>, <code>entailment</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                            | entailment                                                                        | negative                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            | string                                                                           |
  | details | <ul><li>min: 6 tokens</li><li>mean: 22.3 tokens</li><li>max: 154 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 14.08 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 5 tokens</li><li>mean: 14.4 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
  | anchor                                                                                                            | entailment                                                             | negative                                                              |
  |:------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------|:----------------------------------------------------------------------|
  | <code>While some organizations made the dramatic change look effortless, for others, it did not come easy.</code> | <code>Dramatic changes within organizations seldom come simply.</code> | <code>Dramatic changes within organizations often come simply.</code> |
  | <code>A cook mixing a meal at a restaurant.</code>                                                                | <code>A chef preparing food in a metal bowl</code>                     | <code>A chef throwing away food in a metal bowl</code>                |
  | <code>In addition, the women wear various heavy rings.</code>                                                     | <code>The women wear heavy jewelry. </code>                            | <code>The women do not wear heavy jewelry.</code>                     |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
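
To set up this loss, `CachedGISTEmbedLoss` takes the model being trained plus the guide encoder printed above. The guide checkpoint is not named in this card, so the model id below is a placeholder; any BERT-based sentence encoder with CLS pooling and normalization matches the printed guide architecture:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

model = SentenceTransformer("microsoft/deberta-v3-small")

# Placeholder guide: the card only shows the guide's architecture
# (BertModel + CLS pooling + Normalize), not its checkpoint name.
guide = SentenceTransformer("BAAI/bge-base-en-v1.5")

loss = CachedGISTEmbedLoss(model=model, guide=guide, temperature=0.05)
```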

#### vitaminc-pairs

* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 3,000 training samples
* Columns: <code>claim</code> and <code>evidence</code>
* Approximate statistics based on the first 1000 samples:
  |         | claim                                                                             | evidence                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 6 tokens</li><li>mean: 16.64 tokens</li><li>max: 46 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 37.26 tokens</li><li>max: 224 tokens</li></ul> |
* Samples:
  | claim                                                               | evidence                                                                                                                                                                   |
  |:--------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Karl Marginson was the manager until October 2017 .</code>    | <code>The team was managed by Karl Marginson since its formation in 2005 until October 2017 ; the current manager is Tom Greaves .</code>                                  |
  | <code>Jerry Lee Lewis married his 13-year-old first cousin .</code> | <code>However , Lewis 's rock and roll career faltered in the wake of his marriage to his 13-year-old first cousin once removed when he was 23 years old .</code>          |
  | <code>Estádio do Morumbi is also known as Panetone .</code>         | <code>The Estádio Cícero Pompeu de Toledo , widely known as Morumbi ( ) or Panetone , is a football stadium located in the Morumbi district in São Paulo , Brazil .</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### scitail-pairs-qa

* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 2,750 training samples
* Columns: <code>sentence2</code> and <code>sentence1</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence2                                                                         | sentence1                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 16.51 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.27 tokens</li><li>max: 34 tokens</li></ul> |
* Samples:
  | sentence2                                                                                                  | sentence1                                                                                              |
  |:-----------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------|
  | <code>All matter in the universe is composed of one or more unique pure substances called elements.</code> | <code>All matter in the universe is composed of one or more unique pure substances called what?</code> |
  | <code>Corals build hard exoskeletons that grow to become coral reefs.</code>                               | <code>Corals build hard exoskeletons that grow to become what?</code>                                  |
  | <code>Insulin is made up of two polypeptide chains.</code>                                                 | <code>Insulin is made up of how many polypeptide chains?</code>                                        |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### scitail-pairs-pos

* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 2,750 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 7 tokens</li><li>mean: 24.44 tokens</li><li>max: 71 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 15.48 tokens</li><li>max: 40 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                   | sentence2                                                                                                                                 |
  |:------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Prokaryotes are organisms that lack a cell nucleus and the other membrane bound organelles.</code>    | <code>Most organelles are not found in prokaryotic cells.</code>                                                                          |
  | <code>Vitamins and minerals are needed in small quantities for the adequate functioning of the body.</code> | <code>Vitamins are the organic compounds that the body needs in small amounts to function properly; humans need 13 different ones.</code> |
  | <code>Saturn has a thick atmosphere made up of mostly hydrogen and helium.</code>                           | <code>Saturn is made mostly of helium and hydrogen.</code>                                                                                |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### xsum-pairs

* Dataset: xsum-pairs
* Size: 3,000 training samples
* Columns: <code>document</code> and <code>summary</code>
* Approximate statistics based on the first 1000 samples:
  |         | document                                                                             | summary                                                                          |
  |:--------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                               | string                                                                           |
  | details | <ul><li>min: 48 tokens</li><li>mean: 250.33 tokens</li><li>max: 445 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 25.7 tokens</li><li>max: 43 tokens</li></ul> |
* Samples:
  | document                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        | summary                                                                                                                                                         |
  |:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>It is claimed the incident took place after New Zealand's 30-12 defeat by Australia in Canberra on Friday.<br>The allegations emerged in a court case and, although Melbourne Storm's Bromwich and Gold Coast Titans' Proctor were named, neither have been charged.<br>NZRL says it will take immediate action if the allegations are proven.<br>The court heard that a local man was captured on CCTV preparing a white powder on his phone. He then handed it to Bromwich and Proctor, who were said to have rolled up bank notes and taken the substance.<br>"We are working with the NRL (the Australia-based National Rugby League) while investigations into the alleged incident are ongoing and New Zealand Rugby League will not be making any comment until more information becomes available," said an NZRL statement.<br>The news came after Damian Keogh, chairman of NRL side Cronulla Sharks, stood down after being arrested for alleged drug possession.<br>Keogh is a former basketball player for Australia and played in three Olympic Games. He is scheduled to appear in court on 30 June.<br>New Zealand international Shaun Kenny-Dowall was also charged over allegations of drug possession in Sydney.</code>                                                                                                                                                                                 | <code>New Zealand Rugby League (NZRL) are investigating allegations national captain Jesse Bromwich and team-mate Kevin Proctor bought and took cocaine.</code> |
  | <code>Madeleine Bridle said the wall, which runs behind gardens in a cul-de-sac, was "integral" to Blandford Forum.<br>The town council had decided to replace the section with a wooden fence due to its poor condition.<br>Historic England said the entire wall, cemetery gateway and two chapels had been granted a Grade ll listing.<br>The wall, which was built in the mid-1800s, has been damaged by the roots of several lime and sycamore trees which are subject to preservation orders, Blandford Town Council said.<br>The authority said it had made a failed attempt to list the chapels following an arson attack in September 2013.<br>It had planned to replace the wall with a wooden fence at a cost of £13,525, some of which would be offset by the sale of the bricks.<br>Ms Bridle, who lodged the application, said: "Why should we sell town property?"<br>"We don't want a fence replacing this beautiful 19th Century brick wall.<br>"It is integral to the character of the town."<br>Blandford town clerk Linda Scott-Giles said the wall would now be preserved.<br>She said a builder originally contracted to repair one section had given an estimate of £150,000 to rebuild the entire wall.<br>She said: "This is money we do not have. I don't know what we're going to do.<br>"We may have to put buttresses in people's gardens, but that's not something residents will want."</code> | <code>A Victorian cemetery wall has been saved from demolition after a Dorset resident succeeded in having it listed by Historic England.</code>                |
  | <code>The show, with singer Adam Lambert, will be the band's debut performance at a UK music festival and their only UK show in 2016, organisers said.<br>Guitarist Brian May said former frontman Freddie Mercury "would have loved it".<br>The rock legends will close the four-day festival at Seaclose Park, Newport, on 12 June.<br>Queen drummer Roger Taylor said: "When I think of The Isle of Wight Festival I think of Hendrix, Dylan and The Who. What immortal company to be in.<br>"Queen are thrilled to be there and can promise a special night."<br>The band recently celebrated the 40th anniversary of their record-breaking worldwide hit single Bohemian Rhapsody.<br>They are the first headliners to be announced for the festival which will be marking its 15th year since it relaunched in 2002.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                               | <code>Queen have been revealed as the Sunday night headliners for The Isle of Wight Festival next year.</code>                                                  |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### sciq_pairs

* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 2,750 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 7 tokens</li><li>mean: 17.06 tokens</li><li>max: 66 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 85.89 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                       | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                               |
  |:----------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>In experiments with garden peas, austrian monk gregor mendel described the basic patterns of what?</code> | <code>For thousands of years, humans have understood that characteristics such as eye color, hair color, or even flower color are passed from one generation to the next. The passing of characteristics from parent to offspring is called heredity . Humans have long been interested in understanding heredity. Many hereditary mechanisms were developed by scholars but were not properly tested or quantified. The scientific study of genetics did not begin until the late 19 th century. In experiments with garden peas, Austrian monk Gregor Mendel described the basic patterns of inheritance. Keep in mind that while we know about DNA and its role as the genetic material, Mendel did not know of the existence of DNA. Nor did he understand the concept of the chromosome or the process of meiosis, and yet, he was still able to correctly describe basic inheritance patterns.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                             |
  | <code>What is the most effective color in interrupting the nighttime portion of the photoperiod?</code>         | <code></code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                           |
  | <code>This process of combining the wave functions for atomic orbitals is called what?</code>                   | <code>Quantum-mechanical calculations suggest why the observed bond angles in H2O differ from those predicted by the overlap of the 1s orbital of the hydrogen atoms with the 2p orbitals of the oxygen atom. The mathematical expression known as the wave function, ψ, contains information about each orbital and the wavelike properties of electrons in an isolated atom. When atoms are bound together in a molecule, the wave functions combine to produce new mathematical descriptions that have different shapes. This process of combining the wave functions for atomic orbitals is called hybridization and is mathematically accomplished by the linear combination of atomic orbitals, LCAO, (a technique that we will encounter again later). The new orbitals that result are called hybrid orbitals. The valence orbitals in an isolated oxygen atom are a 2s orbital and three 2p orbitals. The valence orbitals in an oxygen atom in a water molecule differ; they consist of four equivalent hybrid orbitals that point approximately toward the corners of a tetrahedron (Figure 8.7). Consequently, the overlap of the O and H orbitals should result in a tetrahedral bond angle (109.5°). The observed angle of 104.5° is experimental evidence for which quantummechanical calculations give a useful explanation: Valence bond theory must include a hybridization component to give accurate predictions. Note that orbitals may sometimes be drawn in an elongated “balloon” shape rather than in a more realistic “plump” shape in order to make the geometry easier to visualize.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
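
  The parameter block above can be read as a loss constructor call. Below is a minimal sketch of the equivalent `sentence-transformers` code; both checkpoint names are illustrative assumptions, since this card records only the guide's architecture (BERT backbone, 768-dim CLS pooling, L2-normalized), which matches guides such as `BAAI/bge-base-en-v1.5`.

  ```python
  from sentence_transformers import SentenceTransformer
  from sentence_transformers.losses import CachedGISTEmbedLoss

  model = SentenceTransformer("your-base-model")        # placeholder: the model being fine-tuned
  guide = SentenceTransformer("BAAI/bge-base-en-v1.5")  # assumption: a guide matching the architecture above

  # temperature=0.05 mirrors the parameter block above; the guide model is used
  # to filter out false in-batch negatives before the contrastive loss is computed.
  loss = CachedGISTEmbedLoss(model, guide=guide, temperature=0.05)
  ```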

#### qasc_pairs

* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 2,750 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                          |
  |:--------|:---------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                             |
  | details | <ul><li>min: 5 tokens</li><li>mean: 11.3 tokens</li><li>max: 26 tokens</li></ul> | <ul><li>min: 16 tokens</li><li>mean: 34.59 tokens</li><li>max: 63 tokens</li></ul> |
* Samples:
  | sentence1                                                                          | sentence2                                                                                                                                                                                                                     |
  |:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>What can HIV infect and destroy part of?</code>                              | <code>HIV infects and destroys helper T cells.. Helper T Cells Helper T cells are the brains behind immune response.. HIV infects and destroys part of the immune response.</code>                                            |
  | <code>what does a renewable, economical source of electricity require?</code>      | <code>hydropower requires damming a river. Hydropower is a renewable, economical source of electricity.. a renewable, economical source of electricity requires damming a river</code>                                        |
  | <code>What may cause animals to fight towards members of their own species?</code> | <code>competition may cause animals to fight towards members of their own species. Competition Animals compete for food and shelter.. food and shelter may cause animals to fight towards members of their own species</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
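
  The "approximate statistics" rows in these tables report min/mean/max token counts over the first 1,000 samples. A minimal sketch of how such numbers can be reproduced with the model's own tokenizer (the checkpoint name is a placeholder):

  ```python
  from sentence_transformers import SentenceTransformer

  model = SentenceTransformer("your-base-model")  # placeholder checkpoint name

  def token_stats(texts):
      # Tokenize each string and collect sequence lengths (special tokens included),
      # mirroring the min/mean/max rows in the statistics tables above.
      lengths = [len(model.tokenizer(text)["input_ids"]) for text in texts]
      return min(lengths), sum(lengths) / len(lengths), max(lengths)

  # e.g. over the first 1000 sentence1 values of a pairs dataset:
  # mn, mean, mx = token_stats(pairs_dataset["sentence1"][:1000])
  ```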

#### openbookqa_pairs

* Dataset: openbookqa_pairs
* Size: 2,500 training samples
* Columns: <code>question</code> and <code>fact</code>
* Approximate statistics based on the first 1000 samples:
  |         | question                                                                         | fact                                                                             |
  |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                           |
  | details | <ul><li>min: 3 tokens</li><li>mean: 13.8 tokens</li><li>max: 78 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 11.5 tokens</li><li>max: 30 tokens</li></ul> |
* Samples:
  | question                                                                     | fact                                                                                  |
  |:-----------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|
  | <code>What is animal competition?</code>                                     | <code>if two animals eat the same prey then those animals compete for that pey</code> |
  | <code>If you wanted to make a metal bed frame, where would you start?</code> | <code>alloys are made of two or more metals</code>                                    |
  | <code>Places lacking warmth have few what</code>                             | <code>cold environments contain few organisms</code>                                  |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### msmarco_pairs

* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 2,750 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                        | sentence2                                                                           |
  |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                              |
  | details | <ul><li>min: 4 tokens</li><li>mean: 8.67 tokens</li><li>max: 28 tokens</li></ul> | <ul><li>min: 17 tokens</li><li>mean: 76.45 tokens</li><li>max: 211 tokens</li></ul> |
* Samples:
  | sentence1                                                              | sentence2                                                                                                                                                                                                                                                                                                                                 |
  |:-----------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>what is bunco</code>                                             | <code>'Bunco, also known as Bonko or Bunko, is a popular game played with nine dice and a whole lot of luck. Play bunco at parties, with family, or with your 11 other friends that you got stranded on an island with. Follow these steps to learn how to play.</code>                                                                   |
  | <code>what is the tropical zone</code>                                 | <code>Report Abuse. The tropical zone is characterised by strongly monsoonal weather patterns, distinct wet and dry periods which are highly reliable, large quantities of rain and high intensity rainfall, high temperatures, and high rates of energy transformation.</code>                                                           |
  | <code>what is a potential drawback for having a student council</code> | <code>The advantages of having a student council are: 1  To create a positive school atmosphere; 2  To create a caring school environment, which is supportive and inclusive; 3  To act as a vehicle for student participation; 4  To have a beneficial impact on issues such as discipline, bullying and staff-student relations;</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### nq_pairs

* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 2,750 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                            |
  |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                               |
  | details | <ul><li>min: 9 tokens</li><li>mean: 11.88 tokens</li><li>max: 23 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 133.45 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                                    | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                          |
  |:-----------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>link between get him to the greek and forgetting sarah marshall</code> | <code>Get Him to the Greek Get Him to the Greek is a 2010 American black comedy film written, produced and directed by Nicholas Stoller and starring Russell Brand and Jonah Hill. Released on June 4, 2010, the film is a spin-off sequel of Stoller's 2008 film Forgetting Sarah Marshall, reuniting director Stoller with stars Hill and Brand and producer Judd Apatow. Brand reprises his role as character Aldous Snow from Forgetting Sarah Marshall, while Hill plays an entirely new character referred to as Aaron Green instead of Matthew Van Der Wyk. The film also stars Elisabeth Moss, Rose Byrne, Sean "Diddy" Combs, and Colm Meaney.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                     |
  | <code>who said nothing short of state is the actualization of freedom</code> | <code>Fascism and ideology During the Enlightenment, a number of ideological influences arose that would shape the development of fascism. The development of the study of universal histories by Johann Gottfried Herder resulted in Herder's analysis of the development of nations, Herder developed the term Nationalismus ("nationalism") to describe this cultural phenomenon. At this time nationalism did not refer to the political ideology of nationalism that was later developed during the French Revolution.[24] Herder also developed the theory that Europeans are the descendants of Indo-Aryan people based on language studies. Herder argued that the Germanic peoples held close racial connections with the ancient Indians and ancient Persians, who he claimed were advanced peoples possessing a great capacity for wisdom, nobility, restraint and science.[25] Contemporaries of Herder utilized the concept of the Aryan race to draw a distinction between what they deemed "high and noble" Aryan culture versus that of "parasitic" Semitic culture and this anti-Semitic variant view of Europeans' Aryan roots formed the basis of Nazi racial views.[25][25] Another major influence on fascism came from the political theories of Georg Wilhelm Friedrich Hegel.[7] Hegel promoted the absolute authority of the state[7] and said "nothing short of the state is the actualization of freedom" and that the "state is the march of God on earth".[17]</code> |
  | <code>where does the mass number go in isotopic notation</code>              | <code>Mass number The mass number is written either after the element name or as a superscript to the left of an element's symbol. For example, the most common isotope of carbon is carbon-12, or 12C, which has 6 protons and 6 neutrons. The full isotope symbol would also have the atomic number (Z) as a subscript to the left of the element symbol directly below the mass number: 12 6C.[2] This is technically redundant, as each element is defined by its atomic number, so it is often omitted.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### trivia_pairs

* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 2,500 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                            |
  |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                               |
  | details | <ul><li>min: 8 tokens</li><li>mean: 16.44 tokens</li><li>max: 48 tokens</li></ul> | <ul><li>min: 53 tokens</li><li>mean: 455.12 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1 | sentence2 |
  |:----------|:----------|
  | <code>What is Frodo's second name?</code> | <code>Frodo Baggins \| The One Wiki to Rule Them All \| Fandom powered by Wikia Childhood Bilbo talking to Frodo before he goes off to meet with Gandalf the Grey Much of Frodo's youth was spent at Brandy Hall in Buckland , the ancestral home of the Brandybuck family, including his mother ( Primula Brandybuck ). Frodo was known as something of a rascal, befriending Meriadoc (Merry) Brandybuck and Peregrin (Pippin) Took and causing trouble wherever they went. They would often steal mushrooms from Farmer Maggot 's farm Bamfurlong . In TA 2980 , when Frodo was only 12 years old, his parents drowned in a boating accident on the Brandywine River . An only child, Frodo stayed in Brandy Hall until his 99-year-old "uncle" Bilbo Baggins adopted him in TA 2989 . Bilbo took Frodo to live with him in his home at Bag End and made him his heir. Frodo with Bilbo during his 111th birthday The two grew very close in the following years; Frodo learned much of the Elvish language during his time with Bilbo, as well as much of the lore of Middle-earth. The two shared the same birthday, September 22 by Shire Reckoning (around September 12–14 of our calendar), [1] and a party of special magnificence was held at the beginning of The Fellowship of the Ring when Frodo came of age of thirty-three and Bilbo hit the peculiar year of 111. Bilbo gave a memorable Birthday Speech before playing a joke on his fellow hobbits by using the One Ring to disappear, at which Gandalf quickly reacted and used his staff to create a blinding flash where Bilbo had been standing. The hobbits at the Party were left confused and disgruntled, and Bilbo was never again seen in the Shire. Before departing for his journey to Rivendell, Bilbo had a long conversation with Gandalf, who finally persuaded him to voluntarily surrender the One Ring. Bilbo left it on the fireplace mantel with a note for Frodo, who would now become the next Ring-bearer. Coming of Age and Quest Beginning Gandalf telling Frodo the story about the One Ring After the party finished, Frodo returned home and discovered that he was now the master of Bag End and the recipient of Bilbo's magic ring. Gandalf , ever more curious about the ring's origin, power, and purpose (but not yet positive it was the One Ring), advised the young hobbit against the using the ring. For the next seventeen years, Frodo complied with the wizard 's request and hid the Ring in a safe place. However, on April 12 , 3018 , Gandalf returned to Bag End and warned Frodo that the Ring was actually the One Ring, which the evil lord Sauron needed to rule over Middle-earth. Realizing that Sauron would be looking for the Ring, Gandalf advised the Hobbit to secretly follow Bilbo's journey to Rivendell. After Frodo's discussion with Gandalf, a rumor started that he was running out of money. This rumor, although not begun by Frodo, was encouraged by him. Merry helped Frodo to purchase a small house at Crickhollow . With the exception of his gardener Sam Gamgee , who had agreed to accompany him to Rivendell , Frodo told the other Hobbits of the Shire that he intended to move to Buckland . He sold his home to the Sackville-Baggins , and, on the September 23, 3018, the day after his fiftieth birthday, Frodo left from Bag End, taking with him Sam and Pippin. They left in the early morning for Bree , and just in time, as Sauron's most powerful servants, the nine Nazgûl , had entered the Shire dressed as Black riders searching for a hobbit with the name of Baggins. To Bree Frodo was unable to find much information about his pursuers from his conversations with the High Elves and Farmer Maggot , but what they were told was less than encouraging. When Frodo arrived at Buckland, where Merry was waiting, he found that Merry and Pippin already knew about Frodo's "secret" journey. Frodo was left with no alternative but to bring the two youngsters with him. They cut through the Old Forest and the Barrow-downs in hopes of losing the Black Riders, which did succeed. They met other troubles in those places though, at the hands of Old Man Willow and the Barrow-Wi</code> |
  | <code>Israel was proclaimed an independent state in 1948. Who was its prime minister from then until 1963?</code> | <code>The Declaration of the State of Israel The Declaration of the State of Israel May 14, 1948 donations Introduction As the British forces pulled out of Palestine and the mandate came to an end, the Executive Committee of the Jewish "Yishuv" (community) in Palestine met to decide whether or not to declare a state, as has been envisioned under UN Resolution 181. The Arab states had declared that if such a state was declared, they would invade it. Nonetheless, the committee decided to declare a state, armed with the promise of US President Harry S. Truman that he would recognize such a state if it was declared. The Israeli Declaration of Independence was read out on Friday, the 14th of May 1948 by David Ben Gurion, who then became the first Prime Minister of the new state. The State was quickly recognized by the United States and the USSR. The Palestinians did not declare a state immediately, and though several attempts were made to do so, they were blocked by the Jordanians and then by the Egyptians. The Egyptians later allowed the declaration of such a state in Gaza in September 1948, but it was recognized by no-one and had no resources and no real existence. Arab states had no interest in the formation of a separate state in Palestine, both because each state had territorial ambitions in Palestine, and because they feared the radical influence of Palestinian leadership under Haj Amin El-Husseini, the Grand Mufti of Jerusalem. The declaration stated that Israel "will uphold the full social and political equality of all its citizens, without distinction of race, creed or sex; will guarantee full freedom of conscience, worship, education and culture; will safeguard the sanctity and inviolability of the shrines and Holy Places of all religions; and will dedicate itself to the principles of the Charter of the United Nations. " The last sentence of the declaration refers to "the rock of Israel" (tsur Yisrael). This is one of the synonyms for God used in Hebrew. According to Tom Segev, in The First Israelis, the wording represents a compromise between the demand of Moshe Shapira representing the religious party that the declaration incorporate a reference to the Lord of Israel, and the demand of the leftist Mapam party representative that the declaration must not incorporate such a reference. The compromise formula made it possible to approve the declaration and publish it before the Sabbath and before the British left the country. May 15, 1948 was a Sabbath. David Ben Gurion, the first Prime Minister, who was a deist or possibly a polite atheist, was agreeable to this compromise. He said on other occasion that for him "the rock of Israel" was the Old Testament with its history and traditions. Ami Isseroff Notice - Copyright This introduction is Copyright 2001-2003 by MidEastWeb http://www.mideastweb.org and the author. Please tell your friends about MidEastWeb and link to this page. Please do not copy this page to your Web site. You may print this page out for classroom use provided that this notice is appended, and you may cite this material in the usual way. Other uses by permission only. The source material below is placed in the public domain and is free of copy restrictions. MidEastWeb is a non-profit organization dedicated to promoting peace and coexistence in the Middle East. We provide balanced and complete information, news and views to promote understanding and dialog. We cannot continue without your help! If peace in the Middle East is important to you, please help us by making a tax-deductible donation . If you don't help us, who will? Thank you! Declaration of Israel's Independence 1948 Issued at Tel Aviv on May 14, 1948 (5th of Iyar, 5708) The land of Israel was the birthplace of the Jewish people. Here their spiritual, religious and national identity was formed. Here they achieved independence and created a culture of national and universal significance. Here they wrote and gave the Bible to the world. Exiled from Palestine, the Jewish people remained faithful to it in all the</code> |
  | <code>What was the first artificial satellite?</code> | <code>Sputnik NASA Main Page Multimedia Interactive Feature on 50th Anniversary of the Space Age Sputnik and The Dawn of the Space Age History changed on October 4, 1957, when the Soviet Union successfully launched Sputnik I. The world's first artificial satellite was about the size of a beach ball (58 cm.or 22.8 inches in diameter), weighed only 83.6 kg. or 183.9 pounds, and took about 98 minutes to orbit the Earth on its elliptical path. That launch ushered in new political, military, technological, and scientific developments. While the Sputnik launch was a single event, it marked the start of the space age and the U.S.-U.S.S.R space race. The story begins in 1952, when the International Council of Scientific Unions decided to establish July 1, 1957, to December 31, 1958, as the International Geophysical Year (IGY) because the scientists knew that the cycles of solar activity would be at a high point then. In October 1954, the council adopted a resolution calling for artificial satellites to be launched during the IGY to map the Earth's surface. In July 1955, the White House announced plans to launch an Earth-orbiting satellite for the IGY and solicited proposals from various Government research agencies to undertake development. In September 1955, the Naval Research Laboratory's Vanguard proposal was chosen to represent the U.S. during the IGY. The Sputnik launch changed everything. As a technical achievement, Sputnik caught the world's attention and the American public off-guard. Its size was more impressive than Vanguard's intended 3.5-pound payload. In addition, the public feared that the Soviets' ability to launch satellites also translated into the capability to launch ballistic missiles that could carry nuclear weapons from Europe to the U.S. Then the Soviets struck again; on November 3, Sputnik II was launched, carrying a much heavier payload, including a dog named Laika. Immediately after the Sputnik I launch in October, the U.S. Defense Department responded to the political furor by approving funding for another U.S. satellite project. As a simultaneous alternative to Vanguard, Wernher von Braun and his Army Redstone Arsenal team began work on the Explorer project. On January 31, 1958, the tide changed, when the United States successfully launched Explorer I. This satellite carried a small scientific payload that eventually discovered the magnetic radiation belts around the Earth, named after principal investigator James Van Allen. The Explorer program continued as a successful ongoing series of lightweight, scientifically useful spacecraft. The Sputnik launch also led directly to the creation of National Aeronautics and Space Administration (NASA). In July 1958, Congress passed the National Aeronautics and Space Act (commonly called the "Space Act") , which created NASA as of October 1, 1958 from the National Advisory Committee for Aeronautics (NACA) and other government agencies. Updated October 10, 2007</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### gooaq_pairs

* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 2,500 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                              |
  | details | <ul><li>min: 8 tokens</li><li>mean: 11.46 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 56.69 tokens</li><li>max: 154 tokens</li></ul> |
* Samples:
  | sentence1                                                           | sentence2                                                                                                                                                                                                                                                                                                                      |
  |:--------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>what is the main responsibility of a registered nurse?</code> | <code>Registered Nurse Responsibilities: Maintaining accurate, complete health care records and reports. Administering medications to patients and monitoring them for side effects and reactions. Prescribing assistive medical devices and related treatments. Recording patient vital signs and medical information.</code> |
  | <code>how to calculate your salary increase percentage?</code>      | <code>['First, determine the difference between their old and new salary: $52,000 – $50,000 = $2,000.', 'Next, divide the raise amount by their old salary: $2,000 / $50,000 = . ... ', 'To turn the decimal into a percentage, multiply by 100: 100 X . 04 = 4%']</code>                                                      |
  | <code>how does hodgkin lymphoma affect the body?</code>             | <code>Hodgkin lymphoma most often spreads through the lymph vessels from lymph node to lymph node. Rarely, late in the disease, it can invade the bloodstream and spread to other parts of the body, such as the liver, lungs, and/or bone marrow.</code>                                                                      |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### paws-pos

* Dataset: [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) at [161ece9](https://huggingface.co/datasets/google-research-datasets/paws/tree/161ece9501cf0a11f3e48bd356eaa82de46d6a09)
* Size: 3,250 training samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                         |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 9 tokens</li><li>mean: 25.67 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 25.62 tokens</li><li>max: 44 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                      | sentence2                                                                                                                 |
  |:-------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------|
  | <code>He was drafted by the Chicago Cardinals and also played for the Philadelphia Eagles and the Washington Redskins .</code> | <code>He was drafted by the Chicago Cardinals and played for the Washington Redskins and the Philadelphia Eagles .</code> |
  | <code>A Jewish full fast takes the following night from sunset to darkness : there are two Jewish full days :</code>           | <code>A Jewish full fast lasts from sunset to darkness the following night . There are two Jewish full days :</code>      |
  | <code>Chad Ochocinco ( born 1978 ; formerly Chad Johnson ) is an American football wide receiver .</code>                      | <code>Chad Ochocinco ( born 1978 ; formerly Chad Johnson ) is an American - American - football receiver .</code>         |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
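
  Since every training dataset above lists the same `CachedGISTEmbedLoss`, the datasets were presumably trained jointly. A minimal sketch of such a multi-dataset setup, assuming the `SentenceTransformerTrainer` dict API; checkpoint and guide names are placeholders, and only two of the datasets are shown (the rest are analogous):

  ```python
  from datasets import load_dataset
  from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
  from sentence_transformers.losses import CachedGISTEmbedLoss

  model = SentenceTransformer("your-base-model")        # placeholder
  guide = SentenceTransformer("BAAI/bge-base-en-v1.5")  # assumed guide
  loss = CachedGISTEmbedLoss(model, guide=guide, temperature=0.05)

  # A dict of named datasets trains them jointly; the names follow the
  # section headings above. Sizes match the "training samples" counts.
  train_dataset = {
      "nq_pairs": load_dataset("sentence-transformers/natural-questions", split="train").select(range(2750)),
      "gooaq_pairs": load_dataset("sentence-transformers/gooaq", split="train").select(range(2500)),
      # ... remaining pair datasets listed above ...
  }

  trainer = SentenceTransformerTrainer(
      model=model,
      train_dataset=train_dataset,
      loss=loss,  # one loss applied to every dataset, as in this card
  )
  trainer.train()
  ```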

### Evaluation Datasets

#### vitaminc-pairs

* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 108 evaluation samples
* Columns: <code>claim</code> and <code>evidence</code>
* Approximate statistics based on the first 1000 samples:
  |         | claim                                                                             | evidence                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 9 tokens</li><li>mean: 21.36 tokens</li><li>max: 41 tokens</li></ul> | <ul><li>min: 11 tokens</li><li>mean: 36.11 tokens</li><li>max: 79 tokens</li></ul> |
* Samples:
  | claim                                                                               | evidence                                                                                                                                                                                                                                                                                                                                               |
  |:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Dragon Con had over 5000 guests .</code>                                      | <code>Among the more than 6000 guests and musical performers at the 2009 convention were such notables as Patrick Stewart , William Shatner , Leonard Nimoy , Terry Gilliam , Bruce Boxleitner , James Marsters , and Mary McDonnell .</code>                                                                                                          |
  | <code>COVID-19 has reached more than 185 countries .</code>                         | <code>As of , more than cases of COVID-19 have been reported in more than 190 countries and 200 territories , resulting in more than deaths .</code>                                                                                                                                                                                                   |
  | <code>In March , Italy had 3.6x times more cases of coronavirus than China .</code> | <code>As of 12 March , among nations with at least one million citizens , Italy has the world 's highest per capita rate of positive coronavirus cases at 206.1 cases per million people ( 3.6x times the rate of China ) and is the country with the second-highest number of positive cases as well as of deaths in the world , after China .</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

#### negation-triplets

* Dataset: [negation-triplets](https://huggingface.co/datasets/jinaai/negation-dataset-v2)
* Size: 64 evaluation samples
* Columns: <code>anchor</code>, <code>entailment</code>, and <code>negative</code>
* Approximate statistics based on all 64 evaluation samples:
  |         | anchor                                                                             | entailment                                                                        | negative                                                                           |
  |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                            | string                                                                             |
  | details | <ul><li>min: 10 tokens</li><li>mean: 13.67 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 9 tokens</li><li>mean: 13.08 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 13.33 tokens</li><li>max: 21 tokens</li></ul> |
* Samples:
  | anchor                                                                              | entailment                                                                    | negative                                                                          |
  |:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | <code>A clean white bathroom with a simple mirror above the vanity.</code>          | <code>A bathroom with a white sink and mirror.</code>                         | <code>A bathroom with a black sink and mirror.</code>                             |
  | <code>Many sheep grazing in a large, green pasture.</code>                          | <code>Many sheep graze in a grassy pasture in a valley. </code>               | <code>Few sheep graze in a grassy pasture in a valley.</code>                     |
  | <code>A group of older people sitting on a park bench with a dog on a leash.</code> | <code>Three elderly people on a bench gazing into the middle distance.</code> | <code>Three elderly people not on a bench gazing into the middle distance.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
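
A minimal sketch of materializing this evaluation split from the linked dataset; the split name, and the assumption that the source already exposes anchor/entailment/negative columns, are not recorded in the card:

```python
from datasets import load_dataset

# Split name and column layout are assumptions; the card links only the repo.
negation = load_dataset("jinaai/negation-dataset-v2", split="train")
eval_negation = negation.select(range(64))  # 64 evaluation samples, as above
print(eval_negation[0])  # expected keys: anchor, entailment, negative
```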

#### scitail-pairs-pos

* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 54 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on all 54 evaluation samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 9 tokens</li><li>mean: 20.81 tokens</li><li>max: 45 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 15.48 tokens</li><li>max: 23 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                                                     | sentence2                                                                              |
  |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------|
  | <code>humans normally have 23 pairs of chromosomes.</code>                                                                                                                                    | <code>Humans typically have 23 pairs pairs of chromosomes.</code>                      |
  | <code>A solution is a homogenous mixture of two or more substances that exist in a single phase.</code>                                                                                       | <code>Solution is the term for a homogeneous mixture of two or more substances.</code> |
  | <code>Upwelling The physical process in near-shore ocean systems of rising of nutrients and colder bottom waters to the surface because of constant wind patterns along the shoreline.</code> | <code>Upwelling is the term for when deep ocean water rises to the surface.</code>     |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
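
Positive pairs like those above can be derived from SciTail by keeping only the entailed premise/hypothesis pairs; a sketch assuming the public tsv_format configuration with its premise, hypothesis, and label columns:

```python
from datasets import load_dataset

scitail = load_dataset("allenai/scitail", "tsv_format", split="train")

# Keep entailed pairs only and map them onto the card's column names.
pairs = (
    scitail.filter(lambda row: row["label"] == "entails")
    .rename_columns({"premise": "sentence1", "hypothesis": "sentence2"})
    .remove_columns("label")
)
```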

#### xsum-pairs

* Dataset: xsum-pairs
* Size: 128 evaluation samples
* Columns: <code>document</code> and <code>summary</code>
* Approximate statistics based on all 128 evaluation samples:
  |         | document                                                                             | summary                                                                            |
  |:--------|:-------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                               | string                                                                             |
  | details | <ul><li>min: 70 tokens</li><li>mean: 263.54 tokens</li><li>max: 419 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 25.77 tokens</li><li>max: 38 tokens</li></ul> |
* Samples:
  | document                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            | summary                                                                                                                                                              |
  |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>Wallace made 49 appearances for the Owls last season, having joined the club on a one-year deal in July 2015 following his release by Burnley.<br>The 31-year-old old, who also had spells with Preston, Sunderland and Celtic, scored six goals last term.<br>Dundee-born Wallace won his only senior Scotland cap in a 2-0 defeat by Japan in October 2009.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                          | <code>Sheffield Wednesday winger Ross Wallace has signed a new contract to stay with the Championship club until 2018.</code>                                        |
  | <code>Media playback is not supported on this device<br>The Swans were bottom of the table before beating fellow strugglers Sunderland 3-0 on Saturday.<br>Bradley had been under pressure having won only one of his first seven games in charge - but insisted he was not worried about his own future.<br>"It's not about me, it's about the work," Bradley said.<br>"I don't spend all week worrying about myself. I only know one way to work, and that's to think about the team, engage the staff, engage the players.<br>"Criticism is part of the job for a manager in the Premier League. I don't think I was the only one to be criticised in the last week."<br>The win against Sunderland was a fine response from Bradley and his players after they were humiliated 5-0 at Tottenham a week earlier.<br>Victory over the Black Cats means Swansea are now above the Premier League's bottom three by virtue of goal difference.<br>"We did get a good response. The players deserve full credit. That's the part of the job, a result gets a little bit out of hand, you can cry about it but you have to look at it in a strong way," said Bradley.<br>"This is a step but we have to build upon it, there's still plenty of work to do. It's a nice bonus to be out of the bottom three, but the work is still there and we can't get ahead of ourselves.<br>"The word many players used when we talked this week was 'pride' and the only thing I did was I tried to get back at them and say: 'What does pride look like actually on the pitch?'<br>"Pride has to turn into intensity, pride has to turn into clean sheets. Don't just talk about pride - put it into something more. At the end of that, for a few seconds you can look at the table and say you're not there yet, but it looks better than last week and we can continue move forward."</code> | <code>Swansea City manager Bob Bradley has warned his side they still have "plenty of work to do" despite climbing out of the Premier League relegation zone.</code> |
  | <code>The level indicates that Americans expect the economy to remain strong through the second half of the year.<br>According to the Conference Board, which tracks consumer sentiment, the index reading for August was 101.1, up from 96.7 in July.<br>The index has not reached such a high point since September 2015.<br>"Consumers' assessment of both current business and labour market conditions was considerably more favourable than last month," Lynn Franco, the Conference Board's head of economic indicators.<br>"Short-term expectations regarding business and employment conditions, as well as personal income prospects, also improved, suggesting the possibility of a moderate pick-up in growth in the coming months."<br>Increases in consumer confidence typically indicate more people are willing to spend money. As more than two-thirds of the US economy is generated by consumer spending, the increase signals likely economic growth.<br>The percentage of Americans who expect business conditions to continue to improve over the in the next six months rose from 15.7% to 17.3%, while the number who expected worsening conditions fell from 12.4% to 11.1%.<br>The figure beat expectations by analysts, who predicted consumer confidence would stand at 97 on the index.<br>Earlier on Tuesday, the Federal Reserve's vice-chairman, Stanley Fischer, said in an interview with Bloomberg News that the US labour market was close to full strength.<br>Mr Fischer did not say whether the improvements in the labour market meant the Fed would increase interest rates at its upcoming meeting in September.</code>                                                                                                                                                                                                                   | <code>US consumer confidence reached its highest point in nearly a year in August as economic conditions continue to improve.</code>                                 |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
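
xsum-pairs is unlinked, but its document/summary layout matches the public EdinburghNLP/xsum schema; assuming that source, the pairs could be drawn like this:

```python
from datasets import load_dataset

# Source repo and split are assumptions; the card does not link this dataset.
xsum = load_dataset("EdinburghNLP/xsum", split="validation")
pairs = xsum.remove_columns("id")  # leaves the document and summary columns
```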

#### sciq_pairs

* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on all 128 evaluation samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 7 tokens</li><li>mean: 16.91 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 2 tokens</li><li>mean: 79.91 tokens</li><li>max: 433 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                            | sentence2                                                                                                                                                                                                                                                                                                                                                                                                          |
  |:---------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>How many forces do objects on earth have acting on them at all times?</code>                                   | <code>More than one force may act on an object at the same time. In fact, just about all objects on Earth have at least two forces acting on them at all times. One force is gravity, which pulls objects down toward the center of Earth. The other force is an upward force that may be provided by the ground or other surface.</code>                                                                          |
  | <code>A rusty bike has been left outside in damp weather too many times, so the iron in the metal parts have?</code> | <code>Look at this rusty bike. It has been left outside in damp weather too many times, so the iron in the metal parts has rusted. Iron rusts when it combines with oxygen in the air. Iron rusting is an example of a chemical reaction. In a chemical reaction, substances change into entirely different substances. For example, the iron in the bike and the oxygen in the air have changed into rust.</code> |
  | <code>What are the smallest type of blood vessel?</code>                                                             | <code>Further away from the heart, the aorta branches into smaller arteries, which eventually branch into capillaries. Capillaries are the smallest type of blood vessel; they connect very small arteries and veins. Gases and other substances are exchanged between cells and the blood across the very thin walls of capillaries.</code>                                                                       |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
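
The question/support pairs above correspond to the question and support fields of the linked dataset; a sketch that drops questions with an empty support passage:

```python
from datasets import load_dataset

sciq = load_dataset("allenai/sciq", split="validation")

# Pair each question with its supporting passage, skipping empty supports.
pairs = (
    sciq.filter(lambda row: len(row["support"]) > 0)
    .rename_columns({"question": "sentence1", "support": "sentence2"})
)
```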

#### qasc_pairs

* Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on all 128 evaluation samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 5 tokens</li><li>mean: 10.95 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 19 tokens</li><li>mean: 34.16 tokens</li><li>max: 56 tokens</li></ul> |
* Samples:
  | sentence1                                                       | sentence2                                                                                                                                                  |
  |:----------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>what have a circulatory system?</code>                    | <code>Mollusks have a circulatory system with one or two hearts that pump blood.. Mussels are bivalve mollusks.. mussels have a circulatory system</code>  |
  | <code>what can the eye sense?</code>                            | <code>Sight is the ability to sense light, and the eye is the organ that senses light.. Colors use the sense of sight.. eyes can sense colors</code>       |
  | <code>If a person does what it may be due to a pathogen?</code> | <code>bacteria can cause people to become ill. Bacteria that cause disease are called pathogens.. If a person falls ill it may be due to a pathogen</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
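
In the samples above, sentence2 reads like the two retrieved facts followed by the combined fact. A sketch of that construction from the linked dataset; the ". " separator is inferred from the samples, not recorded in the card:

```python
from datasets import load_dataset

qasc = load_dataset("allenai/qasc", split="validation")

def to_pair(row):
    # Join fact1, fact2, and the combined fact (separator is an assumption).
    facts = ". ".join([row["fact1"], row["fact2"], row["combinedfact"]])
    return {"sentence1": row["question"], "sentence2": facts}

pairs = qasc.map(to_pair, remove_columns=qasc.column_names)
```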

#### openbookqa_pairs

* Dataset: openbookqa_pairs
* Size: 128 evaluation samples
* Columns: <code>question</code> and <code>fact</code>
* Approximate statistics based on all 128 evaluation samples:
  |         | question                                                                          | fact                                                                              |
  |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                            |
  | details | <ul><li>min: 3 tokens</li><li>mean: 13.98 tokens</li><li>max: 47 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 11.78 tokens</li><li>max: 28 tokens</li></ul> |
* Samples:
  | question                                                               | fact                                                                         |
  |:-----------------------------------------------------------------------|:-----------------------------------------------------------------------------|
  | <code>The thermal production of a stove is generically used for</code> | <code>a stove generates heat for cooking usually</code>                      |
  | <code>What creates a valley?</code>                                    | <code>a valley is formed by a river flowing</code>                           |
  | <code>when it turns day and night on a planet, what cause this?</code> | <code>a planet rotating causes cycles of day and night on that planet</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
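
openbookqa_pairs is unlinked; one plausible source for question/fact pairs of this shape is the additional configuration of allenai/openbookqa, which carries the gold fact1 for each question. The repo, configuration, and field names below are assumptions:

```python
from datasets import load_dataset

# Repo, configuration, and field names are assumptions; the card names none.
obqa = load_dataset("allenai/openbookqa", "additional", split="validation")
pairs = obqa.map(
    lambda row: {"question": row["question_stem"], "fact": row["fact1"]},
    remove_columns=obqa.column_names,
)
```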

#### msmarco_pairs

* Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on all 128 evaluation samples:
  |         | sentence1                                                                        | sentence2                                                                           |
  |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
  | type    | string                                                                           | string                                                                              |
  | details | <ul><li>min: 4 tokens</li><li>mean: 8.64 tokens</li><li>max: 18 tokens</li></ul> | <ul><li>min: 27 tokens</li><li>mean: 72.08 tokens</li><li>max: 184 tokens</li></ul> |
* Samples:
  | sentence1                                     | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            |
  |:----------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>where is the rib king located</code>    | <code>Carolina Rib King is located in Georgetown, South Carolina. This organization primarily operates in the Eating Places business / industry within the Eating and Drinking Places sector. This organization has been operating for approximately 8 years.</code>                                                                                                                                                                                                                                                                                                 |
  | <code>what dosage does vyvanse come in</code> | <code>Vyvanse (Lisdexamfetamine) Dosage. Vyvanse comes in capsules of 10 milligrams (mg), 20 mg, 30 mg, 40 mg, 50 mg, 60 mg, and 70 mg. A typical starting dose for adults is 30 mg every morning. Your doctor will monitor your results every week or so and may adjust your dose by 10 to 20 mg, depending on your response to it.yvanse is the brand name of the prescription drug lisdexamfetamine dimesylate. Vyvanse is used to treat attention deficit hyperactivity disorder (ADHD) in children and adults and binge-eating disorder (BED) in adults.</code> |
  | <code>what is a rose engine?</code>           | <code>g{x `ÉwxÜÇ eÉáx. Definition: “A rose engine lathe is a specialized kind of ornamental lathe. The headstock rocks back. and forth, controlled by a rubber moving against a rosette or cam-like pattern mounted on. the spindle, while the lathe spindle rotates. Rose engine work can make flower patterns, as.</code>                                                                                                                                                                                                                                   |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
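
A sketch of pulling query/passage pairs from the linked MS MARCO export; the configuration, split, and column names below are assumptions, since the card records only the repo id:

```python
from datasets import load_dataset

# Configuration, split, and column names are all assumptions.
msmarco = load_dataset(
    "sentence-transformers/msmarco-msmarco-distilbert-base-v3",
    "triplet",
    split="train",
)
pairs = msmarco.rename_columns({"query": "sentence1", "positive": "sentence2"})
```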

#### nq_pairs

* Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on all 128 evaluation samples:
  |         | sentence1                                                                          | sentence2                                                                            |
  |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                               |
  | details | <ul><li>min: 10 tokens</li><li>mean: 11.94 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 25 tokens</li><li>mean: 126.44 tokens</li><li>max: 329 tokens</li></ul> |
* Samples:
  | sentence1                                                                                          | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            |
  |:---------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>how many episodes does the last ship have</code>                                             | <code>List of The Last Ship episodes On August 11, 2015, The Last Ship was renewed for a 13-episode third season,[4] which was scheduled to premiere on June 12, 2016, but postponed following the 2016 Orlando nightclub shooting due to the plot of the episode also containing a mass shooting in a nightclub.[5][6] As of October 8, 2017,[update] 46 episodes of The Last Ship have aired. In September 2016, TNT renewed the series for a 10-episode fifth season.[7]</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                  |
  | <code>who was given the task of illuminating the original document of constitution of india</code> | <code>Constitution of India The assembly met in sessions open to the public, for 166 days, spread over a period of 2 years, 11 months and 18 days before adopting the Constitution, the 308 members of the assembly signed two copies of the document (one each in Hindi and English) on 24 January 1950. The original Constitution of India is hand-written with beautiful calligraphy, each page beautified and decorated by artists from Shantiniketan including Beohar Rammanohar Sinha and Nandalal Bose. The illustrations on the cover and pages represent styles from the different civilisations of the subcontinent, ranging from the prehistoric Mohenjodaro civilisation, in the Indus Valley, to the present. The calligraphy in the book was done by Prem Behari Narain Raizda. It was published in Dehra Dun, and photolithographed at the offices of Survey of India. The entire exercise to produce the original took nearly five years. Two days later, on 26 January 1950, the Constitution of India became the law of all the States and territories of India.[17] Rs.1,00,00,000 was official estimate of expenditure on constituent assembly. It has undergone many amendments since its enactment.[18]</code> |
  | <code>what amendments were added after the bill of rights</code>                                   | <code>List of amendments to the United States Constitution Thirty-three amendments to the United States Constitution have been proposed by the United States Congress and sent to the states for ratification since the Constitution was put into operation on March 4, 1789. Twenty-seven of these, having been ratified by the requisite number of states, are part of the Constitution. The first ten amendments were adopted and ratified simultaneously and are known collectively as the Bill of Rights. Six amendments adopted by Congress and sent to the states have not been ratified by the required number of states. Four of these amendments are still technically open and pending, one is closed and has failed by its own terms, and one is closed and has failed by the terms of the resolution proposing it.</code>                                                                                                                                                                                                                                                                                                                                                                                               |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
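
The linked Natural Questions export exposes query/answer columns that map one-to-one onto the pair layout above (a sketch; the split name is an assumption):

```python
from datasets import load_dataset

nq = load_dataset("sentence-transformers/natural-questions", split="train")
pairs = nq.rename_columns({"query": "sentence1", "answer": "sentence2"})
```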

#### trivia_pairs

* Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on all 128 evaluation samples:
  |         | sentence1                                                                         | sentence2                                                                            |
  |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                               |
  | details | <ul><li>min: 8 tokens</li><li>mean: 16.59 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 77 tokens</li><li>mean: 441.18 tokens</li><li>max: 512 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                | sentence2                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                           
                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                          |
  |:---------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>In which country did Argentina first win soccer's World Cup?</code>                                | <code>World Cup winners list: A complete history - SBNation.com World Cup winners list: A complete history Rec Dean Mouhtaropoulos In 1930, thirteen teams participated in the first World Cup held in Uruguay. Since then, the countries of the world have come together every four years (except in the 1940's-yes Germany, looking at you here) to play in the tournament, with 77 countries having participated in 20 tournaments as of 2014. Despite, the large number of countries to participate, only eight of them have enjoyed the glory of actually winning it. Brazil are on the top with five (don't mention this to Brazilians right now, though), and Germany are next on the list with four, their most recent having been secured against Argentina on Sunday. Here's a quick tour of each winning nation. Brazil 2014: Germany Germany became the first ever European team to win a World Cup in South America, and lifted the trophy for the first time since reunification. Fittingly, in a tournament in which nothing was predictable, Germany didn't look completely convincing en route to their final against Argentina, and notably needed extra time to get past the unfavoured Algeria in the first knockout round. However, Die Mannschaft grew into the tournament, and inflicted a historic 7-1 thrashing on tournament hosts Brazil in the semis before Mario Götze's last-gasp extra time strike settled a close final. Argentina captain Lionel Messi earned the Golden Ball as a consolation which was really none at all. South Africa 2010: Spain The Spanish team in 2010 was special, which makes its early exit in Brazil even more of a mystery. In South Africa, Andrés Iniesta scored in the 116th minute agaist the Netherlands to give Spain their first World Cup. Six members of the team, along with their coach Vincente del Bosque, were voted onto the team of the tournament. Iker Casillas, the goalkeeper, won the Golden Glove award (previously the Yashin Award), shutting out his opponents in five of the seven matches. The team also won the FIFA Fair Play Trophy. Germany 2006: Italy Italy's victory over France in the final was one for the memories. Not only did Italy win 5-3 on penalty kicks, but France's captain Zinedine Zidane was red-carded for head-butting Marco Materazzi in extra-time.  Italy's goalkeeper, Gianluigi Buffon won the Yashin Award given to the best goalkeeper, and was one of seven Italian players voted to the All-Star team. The victory gave Italy their fourth World Cup title, then second only to Brazil's five, but matched by Germany this year. Korea-Japan 2002: Brazil This World Cup was Ronaldo's World Cup. The old one. The Brazilian striker won the Golden Boot award (highest scoring player), scoring eight goals in the tournament. Two of those came in the final, as Brazil shut out Germany 2-0 and won their record fifth World Cup. Ronaldo was voted to the team of the tournament along with teammates Rivaldo, Ronaldinho, and Roberto Carlos finished with a 7-0-0 record and a plus-14 goal differential. France 1998: France If you think the header is a typo, you are mistaken! When France won the tournament in France they became the sixth country to win the tournament on home soil. France's goalkeeper won the inaugural Yashin Award, letting in only two goals, and eight French players scored in the tournament. Zinedine Zidane headlined the French attack, as France ended with a plus-13 goal differential. 
They were also given the FIFA Fair Play Trophy and voted the Most Entertaining Team. USA 1994: Brazil When Brazil faced Italy in the '94 final both teams were looking for their record fourth title. Brazil defeated Italy 3-2 on penalty kicks, becoming the first country to win the final via a shootout. Romário scored five goals and won the Golden Ball award (best player), and Brazil won the FIFA Fair Play Trophy and was voted the Most Entertaining Team. On a side note, the US chose this as the the mascot for the tournament. #Fifa #WorldCup World Cup In honor of this amazing month of soccer, #tbt World Cup '94 with the mascot Striker #b ... pic.twitter.com/ri1nVPC4iT — FI</code> |
  | <code>When Elisha Graves Otis invented it, he called it the safety hoist. What do we call it now?</code> | <code>Inventor Elisha Otis Biography Inventor: Elisha Graves Otis Criteria: First to invent. First to patent. First practical. Entrepreneur. Birth: August 3, 1811 in Halifax, Vermont Death: April 8, 1861 in Yonkers, New York Nationality: American Invention: elevator, safety brake in 1852 Function: noun / el·e·va·tor Definition: A platform or an enclosure raised and lowered in a vertical shaft to transport people or freight. The shaft contains the operating equipment, motor, cables, and accessories. Patent: 31,128 (US) issued January 15, 1861 Milestones: 1852 invents a safety latch for hoisting equipment 1853 starts a company to manufacture safe elevators. Sells elevator to hoist freight 1854 Otis demonstrates the elevator at the World's Fair, Crystal Palace exposition in New York City 1857 Installs the first passenger safe elevator in a New York department store 1861 receives patent for improvements to hoisting apparatus, safety brake 1861 after his death his sons form Otis Brothers & Company 1873 over 2,000 Otis elevators were in use in office buildings, hotels and department stores 1898 Otis Brothers merged with 14 other elevator entities to form the Otis Elevator Company 1903 introduced the gearless traction electric elevator 1931 first Otis double-deck elevator was installed elevator, safety elevator, safety brake for elevators, elisha graves otis, otis elevatorm UTC, patent 31128, invention, history, inventor of, history of, who invented, invention of, fascinating facts. The Story: Imagine the skyline of a modern city if the elevator did not exist. Buildings would be limited to five or six stories. Most of the architecture of the 20th and 21st century would be impossible. Office towers, hotels and high-rise apartments would hardly stand in their present form. The need for vertical transport is as old as civilization. Over the centuries, mankind has employed ingenious forms of lifting. The earliest lifts used man, animal and water power to raise the load. Lifting devices relied on these basic forms of power from the early agricultural societies until the dawn of the Industrial Revolution. From ancient times through the Middle Ages, and into the 13th century, man or animal power was the driving force behind hoisting devices. In ancient Greece, Archimedes developed an improved lifting device operated by ropes and pulleys, in which the hoisting ropes were coiled around a winding drum by a capstan and levers. By A.D. 80, gladiators and wild animals rode crude elevators up to the arena level of the Roman Coliseum. Medieval records contain numerous drawings of hoists lifting men and supplies to isolated locations. Among the most famous is the hoist at the monastery of St. Barlaam in Greece. The monastery stood on a pinnacle approximately 200 ft above the ground. Its hoist, which employed a basket or cargo net, was the only means up or down. The first elevator designed for a passenger was built in 1743 for King Louis XV at his palace in France. The one-person contraption went up only one floor, from the first to the second. Known as the "Flying Chair," it was on the outside of the building, and was entered by the king via his balcony. The mechanism consisted of a carefully balanced arrangement of weights and pulleys hanging inside a chimney. Men stationed inside the chimney then raised or lowered the Flying Chair at the king's command. By 1850 steam and hydraulic elevators had been introduced, but it was in 1852 that the landmark event in elevator history occurred: the invention of the world's first safety elevator by Elisha Graves Otis. The first passenger elevator was installed by Otis in New York in 1857. After Otis' death in 1861, his sons, Charles and</code> |
  | <code>What colour of flag should a ship fly to show it is in quarantine?</code> | <code>Quarantine Flag Quarantine flag Posted to Maritime Musings (by Dennis Bryant) on January 6, 2012 A visible warning to stay clear. The quarantine flag, also called the “Yellow Jack”, is the international signal flag LIMA. It is square in shape. Its display is divided into four smaller squares, with two on top and two on the bottom. The smaller squares are alternately yellow and black in color. The flag is flown from a ship that is either arriving in port with known serious health problems or that has been placed under quarantine by the local port authorities. Once the local authorities have determined that the ship’s health problems have been resolved and removed the quarantine order, the ship may fly the free pratique flag (e.g., the international signal flag QUEBEC), which is solid yellow. The concept of quarantine is ancient and is mentioned in the Old Testament. The term itself is derived from the practice of the city-state of Venice during the Middle Ages of requiring ships arriving from locations known to be experiencing diseases such as the plague to anchor or moor off the port for 40 days (quaranta giorni) so that any disease on board might run its course. The practice of quarantine has varied over the centuries, but the concept of protecting the public health by restricting the movements of individuals who are suspected of possibly harboring serious disease has remained constant. The World Health Organization (WHO) provides guidelines on how and when quarantine should be used, but its actual implementation is left to the discretion of individual nations. In the United States, the Centers for Disease Control and Prevention (CDC) administers the federal quarantine program, but the separate states and local communities also have broad powers. Ships arriving in a US port with serious disease on board are required to provide advance notification. The ship may be required to undertake certain sanitary measures and to exercise various controls over all persons on board to prevent them from serving as disease vectors potentially infecting the local populace. The closest we have come recently to a general quarantine affecting the maritime industry was during the 2002 SARS epidemic, which heavily impacted southeast Asia. A future pandemic, whether the result of avian flu or otherwise, may see widespread implementation of quarantine measures and flying of the quarantine flag.</code> |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
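
This same loss configuration is repeated for every training and evaluation set in this card. As a rough sketch of how such a loss is constructed with sentence-transformers (the checkpoints below are placeholders; the dump above records only a BERT encoder with CLS pooling as the guide, not which checkpoint it was):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

# The model being fine-tuned (placeholder: any SentenceTransformer-compatible
# checkpoint; raw transformer models get a mean-pooling head automatically).
model = SentenceTransformer("microsoft/deberta-v3-small")

# The guide model scores in-batch pairs so false negatives can be discarded.
# The dump above shows a BERT encoder with CLS pooling and normalization;
# the exact checkpoint is not recorded, so this name is an assumption.
guide = SentenceTransformer("BAAI/bge-base-en-v1.5")

# temperature matches the value reported in the parameter dump above.
loss = CachedGISTEmbedLoss(model, guide, temperature=0.05)
```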

#### gooaq_pairs

* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                         | sentence2                                                                          |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             |
  | details | <ul><li>min: 8 tokens</li><li>mean: 11.25 tokens</li><li>max: 19 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 56.9 tokens</li><li>max: 127 tokens</li></ul> |
* Samples:
  | sentence1                                                                            | sentence2                                                                                                                                                                                                                                                                                                                                    |
  |:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>what is the difference between lemon eucalyptus oil and eucalyptus oil?</code> | <code>What Is the Difference between Eucalyptus Oil and Lemon Eucalyptus Oil? Lemon Eucalyptus oil comes from a different kind of tree than Eucalyptus oil. Lemon eucalyptus is a common nickname for the tree, but it is also called the lemon-scented gum and blue spotted gum. Despite its name, Lemon Eucalyptus is not a citrus.</code> |
  | <code>pokemon sword and shield will pokemon follow you?</code>                       | <code>Any Pokémon that can be used in Sword and Shield is eligible to follow you around, even Legendaries. This feature only works while you're inside the Isle of Armor area, however. Once you return to Galar proper, the Pokémon will return to its ball.</code>                                                                         |
  | <code>how long does it take to get a naturalization certificate replacement?</code>  | <code>After filing Form N-565, Application for Replacement Naturalization/Citizenship Document, the N-565 processing time will take 5-12 months in most cases.</code>                                                                                                                                                                        |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
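
For reference, the full GooAQ source that this 128-pair evaluation subset was drawn from can be loaded with `datasets` (a sketch; how the subset itself was selected is not recorded in the card):

```python
from datasets import load_dataset

# Full GooAQ question/answer pairs; the card evaluates on a 128-pair subset.
gooaq = load_dataset("sentence-transformers/gooaq", split="train")
print(gooaq[0])  # {'question': '...', 'answer': '...'}
```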

#### paws-pos

* Dataset: [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) at [161ece9](https://huggingface.co/datasets/google-research-datasets/paws/tree/161ece9501cf0a11f3e48bd356eaa82de46d6a09)
* Size: 128 evaluation samples
* Columns: <code>sentence1</code> and <code>sentence2</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence1                                                                          | sentence2                                                                          |
  |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                             | string                                                                             |
  | details | <ul><li>min: 10 tokens</li><li>mean: 25.72 tokens</li><li>max: 42 tokens</li></ul> | <ul><li>min: 10 tokens</li><li>mean: 25.55 tokens</li><li>max: 41 tokens</li></ul> |
* Samples:
  | sentence1                                                                                                                                                      | sentence2                                                                                                                                                      |
  |:---------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>They were there to enjoy us and they were there to pray for us .</code>                                                                                  | <code>They were there for us to enjoy and they were there for us to pray .</code>                                                                              |
  | <code>After the end of the war in June 1902 , Higgins left Southampton in the `` SSBavarian '' in August , returning to Cape Town the following month .</code> | <code>In August , after the end of the war in June 1902 , Higgins Southampton left the `` SSBavarian '' and returned to Cape Town the following month .</code> |
  | <code>From the merger of the Four Rivers Council and the Audubon Council , the Shawnee Trails Council was born .</code>                                        | <code>Shawnee Trails Council was formed from the merger of the Four Rivers Council and the Audubon Council .</code>                                            |
* Loss: [<code>CachedGISTEmbedLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```
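
Similarly, the PAWS source can be loaded as below. The `labeled_final` config and the `label == 1` filter are assumptions, inferred from the `paws-pos` name, which suggests only positive (paraphrase) pairs were kept:

```python
from datasets import load_dataset

# PAWS paraphrase pairs; "labeled_final" is the standard labeled config.
paws = load_dataset(
    "google-research-datasets/paws", "labeled_final", split="validation"
)

# Assumption: the "-pos" suffix means only true paraphrases are used.
paws_pos = paws.filter(lambda ex: ex["label"] == 1)
print(paws_pos[0])  # {'id': ..., 'sentence1': ..., 'sentence2': ..., 'label': 1}
```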

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 224
- `per_device_eval_batch_size`: 64
- `gradient_accumulation_steps`: 5
- `learning_rate`: 4e-05
- `weight_decay`: 0.0001
- `num_train_epochs`: 1
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 1e-05}
- `warmup_ratio`: 0.33
- `save_safetensors`: False
- `fp16`: True
- `push_to_hub`: True
- `hub_model_id`: bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 224
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 5
- `eval_accumulation_steps`: None
- `learning_rate`: 4e-05
- `weight_decay`: 0.0001
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 1e-05}
- `warmup_ratio`: 0.33
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
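
For orientation, the non-default values listed above correspond roughly to the following `SentenceTransformerTrainingArguments`; `output_dir` is a placeholder, as it is not recorded in this card:

```python
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder, not recorded in the card
    eval_strategy="steps",
    per_device_train_batch_size=224,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=5,
    learning_rate=4e-5,
    weight_decay=1e-4,
    num_train_epochs=1,
    lr_scheduler_type="cosine_with_min_lr",
    lr_scheduler_kwargs={"num_cycles": 0.5, "min_lr": 1e-5},
    warmup_ratio=0.33,
    save_safetensors=False,
    fp16=True,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```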

### Training Logs
| Epoch  | Step | Training Loss | openbookqa pairs loss | qasc pairs loss | xsum-pairs loss | msmarco pairs loss | trivia pairs loss | negation-triplets loss | sciq pairs loss | nq pairs loss | gooaq pairs loss | scitail-pairs-pos loss | vitaminc-pairs loss | paws-pos loss | NLI-v2_max_accuracy | VitaminC_max_ap | sts-test_spearman_cosine |
|:------:|:----:|:-------------:|:---------------------:|:---------------:|:---------------:|:------------------:|:-----------------:|:----------------------:|:---------------:|:-------------:|:----------------:|:----------------------:|:-------------------:|:-------------:|:-------------------:|:---------------:|:------------------------:|
| 0.0291 | 1    | 6.7536        | -                     | -               | -               | -                  | -                 | -                      | -               | -             | -                | -                      | -                   | -             | -                   | -               | -                        |
| 0.0581 | 2    | 6.6203        | 4.7439                | 3.9689          | 6.3278          | 10.5136            | 3.8610            | 5.0942                 | 0.3654          | 4.9690        | 8.0411           | 1.9184                 | 2.7266              | 2.2190        | 1.0                 | 0.5178          | 0.0712                   |
| 0.0872 | 3    | 6.7963        | -                     | -               | -               | -                  | -                 | -                      | -               | -             | -                | -                      | -                   | -             | -                   | -               | -                        |
| 0.1163 | 4    | 6.4488        | 4.6508                | 3.6622          | 6.1990          | 9.4879             | 3.5246            | 5.0816                 | 0.3414          | 4.4714        | 7.3951           | 1.9187                 | 2.7045              | 2.2332        | 1.0                 | 0.5220          | 0.0777                   |
| 0.1453 | 5    | 6.5567        | -                     | -               | -               | -                  | -                 | -                      | -               | -             | -                | -                      | -                   | -             | -                   | -               | -                        |
| 0.1744 | 6    | 7.994         | 4.4811                | 3.3633          | 6.0795          | 8.0488             | 3.2845            | 5.0681                 | 0.3208          | 3.7927        | 6.6778           | 1.9320                 | 2.6922              | 2.2626        | 1.0                 | 0.5220          | 0.0909                   |
| 0.2035 | 7    | 7.1037        | -                     | -               | -               | -                  | -                 | -                      | -               | -             | -                | -                      | -                   | -             | -                   | -               | -                        |
| 0.2326 | 8    | 6.6239        | 4.3260                | 3.1746          | 6.0509          | 6.9898             | 3.2417            | 5.0856                 | 0.3155          | 3.3527        | 6.2884           | 1.9701                 | 2.7007              | 2.3511        | 1.0                 | 0.5262          | 0.1031                   |
| 0.2616 | 9    | 6.7359        | -                     | -               | -               | -                  | -                 | -                      | -               | -             | -                | -                      | -                   | -             | -                   | -               | -                        |
| 0.2907 | 10   | 7.0187        | 4.2138                | 3.0288          | 5.9589          | 6.4430             | 3.1168            | 5.1371                 | 0.3123          | 3.1352        | 6.0863           | 2.0432                 | 2.7152              | 2.5095        | 1.0                 | 0.5267          | 0.1129                   |
| 0.3198 | 11   | 6.4394        | -                     | -               | -               | -                  | -                 | -                      | -               | -             | -                | -                      | -                   | -             | -                   | -               | -                        |
| 0.3488 | 12   | 6.2215        | 4.1649                | 2.9194          | 5.7986          | 6.2098             | 2.9758            | 5.1811                 | 0.3086          | 3.0086        | 5.9661           | 2.0938                 | 2.7210              | 2.6383        | 1.0                 | 0.5256          | 0.1307                   |
| 0.3779 | 13   | 6.2269        | -                     | -               | -               | -                  | -                 | -                      | -               | -             | -                | -                      | -                   | -             | -                   | -               | -                        |
| 0.4070 | 14   | 6.3233        | 4.1147                | 2.8540          | 5.7386          | 6.0936             | 2.9295            | 5.1623                 | 0.3054          | 2.9382        | 5.9160           | 2.0478                 | 2.7146              | 2.6020        | 1.0                 | 0.5312          | 0.1577                   |
| 0.4360 | 15   | 6.2096        | -                     | -               | -               | -                  | -                 | -                      | -               | -             | -                | -                      | -                   | -             | -                   | -               | -                        |
| 0.4651 | 16   | 6.0716        | 4.0301                | 2.7810          | 5.6698          | 5.9039             | 2.8899            | 5.0852                 | 0.2981          | 2.8589        | 5.8009           | 1.9004                 | 2.6984              | 2.4280        | 1.0                 | 0.5349          | 0.1952                   |
| 0.4942 | 17   | 5.9185        | -                     | -               | -               | -                  | -                 | -                      | -               | -             | -                | -                      | -                   | -             | -                   | -               | -                        |
| 0.5233 | 18   | 5.7074        | 3.8950                | 2.6600          | 5.5475          | 5.6801             | 2.8672            | 4.9267                 | 0.2845          | 2.7670        | 5.6335           | 1.5994                 | 2.6848              | 2.0202        | 1.0                 | 0.5331          | 0.2225                   |
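
The `sts-test_spearman_cosine` column reports the Spearman correlation between cosine similarities and gold scores on an STS test set. A minimal sketch of how such a number can be reproduced, assuming the `sentence-transformers/stsb` test split (the evaluator actually used is defined earlier in this card):

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp")

# Assumption: STS-B style pairs with similarity scores normalized to [0, 1].
stsb = load_dataset("sentence-transformers/stsb", split="test")

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=stsb["sentence1"],
    sentences2=stsb["sentence2"],
    scores=stsb["score"],
    name="sts-test",
)
print(evaluator(model))  # returns a dict including 'sts-test_spearman_cosine'
```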


### Framework Versions
- Python: 3.10.13
- Sentence Transformers: 3.0.1
- Transformers: 4.42.3
- PyTorch: 2.1.2
- Accelerate: 0.32.1
- Datasets: 2.20.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->