---
base_model: microsoft/deberta-v3-small
datasets:
- jinaai/negation-dataset-v2
- tals/vitaminc
- allenai/scitail
- allenai/sciq
- allenai/qasc
- sentence-transformers/msmarco-msmarco-distilbert-base-v3
- sentence-transformers/natural-questions
- sentence-transformers/trivia-qa
- sentence-transformers/gooaq
- google-research-datasets/paws
language:
- en
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
- cosine_accuracy
- dot_accuracy
- manhattan_accuracy
- euclidean_accuracy
- max_accuracy
- cosine_accuracy_threshold
- cosine_f1
- cosine_f1_threshold
- cosine_precision
- cosine_recall
- cosine_ap
- dot_accuracy_threshold
- dot_f1
- dot_f1_threshold
- dot_precision
- dot_recall
- dot_ap
- manhattan_accuracy_threshold
- manhattan_f1
- manhattan_f1_threshold
- manhattan_precision
- manhattan_recall
- manhattan_ap
- euclidean_accuracy_threshold
- euclidean_f1
- euclidean_f1_threshold
- euclidean_precision
- euclidean_recall
- euclidean_ap
- max_accuracy_threshold
- max_f1
- max_f1_threshold
- max_precision
- max_recall
- max_ap
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:36500
- loss:CachedGISTEmbedLoss
widget:
- source_sentence: what are brake drums made of
  sentences:
  - Stereotactic radiosurgery (SRS) is a non-surgical radiation therapy used to treat functional abnormalities and small tumors of the brain.It can deliver precisely-targeted radiation in fewer high-dose treatments than traditional therapy, which can help preserve healthy tissue.tereotactic radiosurgery (SRS) is a non-surgical radiation therapy used to treat functional abnormalities and small tumors of the brain.
  - Human rights in Germany - Law. 1 The constitution of Germany, the Grundgesetz, which came into effect in May 8, 1949, puts a particular emphasis on human rights. Its first sentence, Human dignity is inviolable, is being interpreted as protecting the sum of human rights. This paragraph is protected by an eternity clause and cannot be changed.
  - The brake drum mounts on the axle or wheel hub, and some drums incorporate the hub. Most brake drums are made of solid cast iron, but there are also steel and aluminum drums with cast iron liners. The machined friction surface on all drums is cast iron.
- source_sentence: More than 169 countries had reported over 212,000 COVID-19 cases before March 19 , 2020 .
  sentences:
  - As of 23 March , more than 341,000 cases of COVID-19 have been reported in 192 countries and territories , resulting in more than 14,700 deaths and 99,000 recoveries .
  - As of 21 March , more than 278,000 cases of COVID-19 have been reported in over 186 countries and territories , resulting in more than 11,500 deaths and 92,000 recoveries. virus seems to mostly spread between people via respiratory droplets .
  - As of 18 March 2020 , more than 212,000 cases of COVID-19 have been reported in at least 170 countries and territories , with major outbreaks in China , Iran and the European Union .
- source_sentence: 'The images were captured on the morning of 14 November by CCTV cameras at a French petrol station, a day after the attacks in which 130 were killed. In them, Salah Abdeslam seems relaxed, walking with his hands in his pockets. He is thought to have been in charge of logistics for the groups of gunmen who carried out the attacks. Salah Abdeslam is said to have called his two friends, Mohammed Amri and Salah Hamza Attou, from Paris early on 14 November to come and pick him up and take him to Belgium. En route from Paris to Brussels, the three men stopped at a petrol station near the Belgian border for about 15 minutes, where a CCTV camera filmed them, BFM reports. At that point, the three men had already been through three police checks, but had not been stopped as Salah Abdeslam had not yet been connected to the Paris attacks. Mohammed Amri and Salah Hamza Attou later dropped off Salah Abdeslam in the district of Laeken in Brussels. The two were arrested in Molenbeek the next day and face terror charges, while Salah Abdeslam is still on the run. Who were the Paris attackers? Paris attacks: The investigation so far Paris attacks: Who were the victims? Paris attacks: What happened on the night The Paris attacks are believed to have been at least partly planned in Brussels. Belgian police have arrested 10 people as part of their investigation. The suspected ringleader was Abdelhamid Abaaoud, a Belgian national. He and his cousin Hasna Aitboulahcen died in a fierce gun battle five days after the attacks, when police raided a flat in Paris where they were hiding, heavily armed.'
  sentences:
  - The first images of the fugitive Paris attacks suspect Salah Abdeslam are said to have emerged, according to French news channel BFM TV.
  - Excess Army food supplies should be given to the "army of the homeless", a senior MP says.
  - Head coach Philippe Montanier has said Nottingham Forest's second-half display against Derby County was poor in so many areas of the pitch.
- source_sentence: Electrical energy can be converted into kinetic energy and heat energy by an electric motor.
  sentences:
  - Solution is the term for a homogeneous mixture of two or more substances.
  - Solution is the term for a homogeneous mixture of two or more substances.
  - Electric motors transform electrical energy into kinetic energy.
- source_sentence: who plays the predator in the movie predator
  sentences:
  - Kevin Peter Hall Kevin Peter Hall (May 9, 1955 – April 10, 1991) was an American actor best known for his roles as the title character in the first two films in the Predator franchise and the title character of Harry in the film and television series, Harry and the Hendersons. He also appeared in the television series Misfits of Science and 227, along with the film Without Warning.
  - The Secret Daughter The Secret Daughter is an Australian television drama series which premiered on the Seven Network on 3 October 2016.[1] The series is written by Justin Monjo, Greg Haddrick, Louise Bowes and Keith Thompson and directed by Leah Purcell, Geoff Bennett and Paul Moloney. The drama centres around part-time country pub singer Billie Carter (Jessica Mauboy), who has a chance meeting with a wealthy city hotelier and rediscovers information about her family and history. The second season premiered on 8 November 2017.[2]
  - The Hunchback of Notre-Dame The story is set in Paris in 1482 during the reign of Louis XI. The gypsy Esmeralda (born as Agnes) captures the hearts of many men, including those of Captain Phoebus and Pierre Gringoire, but especially Quasimodo and his guardian Archdeacon Claude Frollo. Frollo is torn between his obsessive lust for Esmeralda and the rules of Notre Dame Cathedral. He orders Quasimodo to kidnap her, but Quasimodo is captured by Phoebus and his guards, who save Esmeralda. Gringoire, who attempted to help Esmeralda but was knocked out by Quasimodo, is about to be hanged by beggars when Esmeralda saves him by agreeing to marry him for four years.
model-index:
- name: SentenceTransformer based on microsoft/deberta-v3-small
  results:
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts test
      type: sts-test
    metrics:
    - type: pearson_cosine
      value: 0.05719150926706661
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.11293393987132996
      name: Spearman Cosine
    - type: pearson_manhattan
      value: 0.07947531332122215
      name: Pearson Manhattan
    - type: spearman_manhattan
      value: 0.09745056138447042
      name: Spearman Manhattan
    - type: pearson_euclidean
      value: 0.05619658108650874
      name: Pearson Euclidean
    - type: spearman_euclidean
      value: 0.07970793034154534
      name: Spearman Euclidean
    - type: pearson_dot
      value: 0.2946717285740077
      name: Pearson Dot
    - type: spearman_dot
      value: 0.3047454601059247
      name: Spearman Dot
    - type: pearson_max
      value: 0.2946717285740077
      name: Pearson Max
    - type: spearman_max
      value: 0.3047454601059247
      name: Spearman Max
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: NLI v2
      type: NLI-v2
    metrics:
    - type: cosine_accuracy
      value: 1.0
      name: Cosine Accuracy
    - type: dot_accuracy
      value: 0.140625
      name: Dot Accuracy
    - type: manhattan_accuracy
      value: 1.0
      name: Manhattan Accuracy
    - type: euclidean_accuracy
      value: 1.0
      name: Euclidean Accuracy
    - type: max_accuracy
      value: 1.0
      name: Max Accuracy
  - task:
      type: binary-classification
      name: Binary Classification
    dataset:
      name: VitaminC
      type: VitaminC
    metrics:
    - type: cosine_accuracy
      value: 0.55078125
      name: Cosine Accuracy
    - type: cosine_accuracy_threshold
      value: 0.9581937789916992
      name: Cosine Accuracy Threshold
    - type: cosine_f1
      value: 0.6507936507936508
      name: Cosine F1
    - type: cosine_f1_threshold
      value: 0.7856193780899048
      name: Cosine F1 Threshold
    - type: cosine_precision
      value: 0.4823529411764706
      name: Cosine Precision
    - type: cosine_recall
      value: 1.0
      name: Cosine Recall
    - type: cosine_ap
      value: 0.5266754197706615
      name: Cosine Ap
    - type: dot_accuracy
      value: 0.54296875
      name: Dot Accuracy
    - type: dot_accuracy_threshold
      value: 461.7385559082031
      name: Dot Accuracy Threshold
    - type: dot_f1
      value: 0.6542553191489362
      name: Dot F1
    - type: dot_f1_threshold
      value: 349.2696838378906
      name: Dot F1 Threshold
    - type: dot_precision
      value: 0.48616600790513836
      name: Dot Precision
    - type: dot_recall
      value: 1.0
      name: Dot Recall
    - type: dot_ap
      value: 0.5148613370532991
      name: Dot Ap
    - type: manhattan_accuracy
      value: 0.546875
      name: Manhattan Accuracy
    - type: manhattan_accuracy_threshold
      value: 111.04672241210938
      name: Manhattan Accuracy Threshold
    - type: manhattan_f1
      value: 0.6542553191489362
      name: Manhattan F1
    - type: manhattan_f1_threshold
      value: 232.8947296142578
      name: Manhattan F1 Threshold
    - type: manhattan_precision
      value: 0.48616600790513836
      name: Manhattan Precision
    - type: manhattan_recall
      value: 1.0
      name: Manhattan Recall
    - type: manhattan_ap
      value: 0.5200203121459024
      name: Manhattan Ap
    - type: euclidean_accuracy
      value: 0.55078125
      name: Euclidean Accuracy
    - type: euclidean_accuracy_threshold
      value: 6.300814628601074
      name: Euclidean Accuracy Threshold
    - type: euclidean_f1
      value: 0.6472148541114058
      name: Euclidean F1
    - type: euclidean_f1_threshold
      value: 14.141785621643066
      name: Euclidean F1 Threshold
    - type: euclidean_precision
      value: 0.48031496062992124
      name: Euclidean Precision
    - type: euclidean_recall
      value: 0.991869918699187
      name: Euclidean Recall
    - type: euclidean_ap
      value: 0.5255273000837065
      name: Euclidean Ap
    - type: max_accuracy
      value: 0.55078125
      name: Max Accuracy
    - type: max_accuracy_threshold
      value: 461.7385559082031
      name: Max Accuracy Threshold
    - type: max_f1
      value: 0.6542553191489362
      name: Max F1
    - type: max_f1_threshold
      value: 349.2696838378906
      name: Max F1 Threshold
    - type: max_precision
      value: 0.48616600790513836
      name: Max Precision
    - type: max_recall
      value: 1.0
      name: Max Recall
    - type: max_ap
      value: 0.5266754197706615
      name: Max Ap
---

# SentenceTransformer based on microsoft/deberta-v3-small

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from
[microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on the [negation-triplets](https://huggingface.co/datasets/jinaai/negation-dataset-v2), [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc), [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail), [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail), xsum-pairs, [sciq_pairs](https://huggingface.co/datasets/allenai/sciq), [qasc_pairs](https://huggingface.co/datasets/allenai/qasc), openbookqa_pairs, [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3), [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions), [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa), [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) and [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
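As a quick, self-contained illustration of what mapping text into a 768-dimensional dense vector space buys you, the sketch below computes pairwise cosine similarities over stand-in embedding vectors. The vectors here are random placeholders; in practice each row would come from this model's `encode` method.

```python
import numpy as np

def cosine_similarity_matrix(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarities between the rows of a and b."""
    # Normalize each row to unit length, then a dot product gives cosines.
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

# Stand-in 768-dimensional embeddings for three sentences (random here,
# NOT real model outputs).
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(3, 768))

similarities = cosine_similarity_matrix(embeddings, embeddings)
print(similarities.shape)  # (3, 3); each diagonal entry is 1.0
```

This mirrors the shape of the score matrix returned by the model's `similarity` method in the usage example below, since this model's configured similarity function is cosine similarity.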
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
  - [negation-triplets](https://huggingface.co/datasets/jinaai/negation-dataset-v2)
  - [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc)
  - [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail)
  - [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail)
  - xsum-pairs
  - [sciq_pairs](https://huggingface.co/datasets/allenai/sciq)
  - [qasc_pairs](https://huggingface.co/datasets/allenai/qasc)
  - openbookqa_pairs
  - [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3)
  - [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions)
  - [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa)
  - [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq)
  - [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws)
- **Language:** en

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp")
# Run inference
sentences = [
    'who plays the predator in the movie predator',
    'Kevin Peter Hall Kevin Peter Hall (May 9, 1955\xa0– April 10, 1991) was an American actor best known for his roles as the title character in the first two films in the Predator franchise and the title character of Harry in the film and television series, Harry and the Hendersons. He also appeared in the television series Misfits of Science and 227, along with the film Without Warning.',
    'The Hunchback of Notre-Dame The story is set in Paris in 1482 during the reign of Louis XI. The gypsy Esmeralda (born as Agnes) captures the hearts of many men, including those of Captain Phoebus and Pierre Gringoire, but especially Quasimodo and his guardian Archdeacon Claude Frollo. Frollo is torn between his obsessive lust for Esmeralda and the rules of Notre Dame Cathedral. He orders Quasimodo to kidnap her, but Quasimodo is captured by Phoebus and his guards, who save Esmeralda. Gringoire, who attempted to help Esmeralda but was knocked out by Quasimodo, is about to be hanged by beggars when Esmeralda saves him by agreeing to marry him for four years.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

## Evaluation

### Metrics

#### Semantic Similarity
* Dataset: `sts-test`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| pearson_cosine      | 0.0572     |
| **spearman_cosine** | **0.1129** |
| pearson_manhattan   | 0.0795     |
| spearman_manhattan  | 0.0975     |
| pearson_euclidean   | 0.0562     |
| spearman_euclidean  | 0.0797     |
| pearson_dot         | 0.2947     |
| spearman_dot        | 0.3047     |
| pearson_max         | 0.2947     |
| spearman_max        | 0.3047     |

#### Triplet
* Dataset: `NLI-v2`
* Evaluated with [TripletEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric             | Value   |
|:-------------------|:--------|
| cosine_accuracy    | 1.0     |
| dot_accuracy       | 0.1406  |
| manhattan_accuracy | 1.0     |
| euclidean_accuracy | 1.0     |
| **max_accuracy**   | **1.0** |

#### Binary Classification
* Dataset: `VitaminC`
* Evaluated with [BinaryClassificationEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.BinaryClassificationEvaluator)

| Metric                       | Value      |
|:-----------------------------|:-----------|
| cosine_accuracy              | 0.5508     |
| cosine_accuracy_threshold    | 0.9582     |
| cosine_f1                    | 0.6508     |
| cosine_f1_threshold          | 0.7856     |
| cosine_precision             | 0.4824     |
| cosine_recall                | 1.0        |
| cosine_ap                    | 0.5267     |
| dot_accuracy                 | 0.543      |
| dot_accuracy_threshold       | 461.7386   |
| dot_f1                       | 0.6543     |
| dot_f1_threshold             | 349.2697   |
| dot_precision                | 0.4862     |
| dot_recall                   | 1.0        |
| dot_ap                       | 0.5149     |
| manhattan_accuracy           | 0.5469     |
| manhattan_accuracy_threshold | 111.0467   |
| manhattan_f1                 | 0.6543     |
| manhattan_f1_threshold       | 232.8947   |
| manhattan_precision          | 0.4862     |
| manhattan_recall             | 1.0        |
| manhattan_ap                 | 0.52       |
| euclidean_accuracy           | 0.5508     |
| euclidean_accuracy_threshold | 6.3008     |
| euclidean_f1                 | 0.6472     |
| euclidean_f1_threshold       | 14.1418    |
| euclidean_precision          | 0.4803     |
| euclidean_recall             | 0.9919     |
| euclidean_ap                 | 0.5255     |
| max_accuracy                 | 0.5508     |
| max_accuracy_threshold       | 461.7386   |
| max_f1                       | 0.6543     |
| max_f1_threshold             | 349.2697   |
| max_precision                | 0.4862     |
| max_recall                   | 1.0        |
| **max_ap**                   | **0.5267** |

## Training Details

### Training Datasets

#### negation-triplets

* Dataset: [negation-triplets](https://huggingface.co/datasets/jinaai/negation-dataset-v2)
* Size: 3,250 training samples
* Columns: anchor, entailment, and negative
* Approximate statistics based on the first 1000 samples:

| | anchor | entailment | negative |
|:--------|:-------|:-----------|:---------|
| type | string | string | string |
| details | | | |

* Samples:

| anchor | entailment | negative |
|:-------|:-----------|:---------|
| While some organizations made the dramatic change look effortless, for others, it did not come easy. | Dramatic changes within organizations seldom come simply. | Dramatic changes within organizations often come simply. |
| A cook mixing a meal at a restaurant. | A chef preparing food in a metal bowl | A chef throwing away food in a metal bowl |
| In addition, the women wear various heavy rings. | The women wear heavy jewelry. | The women do not wear heavy jewelry. |

* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
), 'temperature': 0.05}
```

#### vitaminc-pairs

* Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0)
* Size: 3,000 training samples
* Columns: claim and evidence
* Approximate statistics based on the first 1000 samples:

| | claim | evidence |
|:--------|:------|:---------|
| type | string | string |
| details | | |

* Samples:

| claim | evidence |
|:------|:---------|
| Karl Marginson was the manager until October 2017 . | The team was managed by Karl Marginson since its formation in 2005 until October 2017 ; the current manager is Tom Greaves . |
| Jerry Lee Lewis married his 13-year-old first cousin . | However , Lewis 's rock and roll career faltered in the wake of his marriage to his 13-year-old first cousin once removed when he was 23 years old . |
| Estádio do Morumbi is also known as Panetone . | The Estádio Cícero Pompeu de Toledo , widely known as Morumbi ( ) or Panetone , is a football stadium located in the Morumbi district in São Paulo , Brazil . |

* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
), 'temperature': 0.05}
```

#### scitail-pairs-qa

* Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 2,750 training samples
* Columns: sentence2 and sentence1
* Approximate statistics based on the first 1000 samples:

| | sentence2 | sentence1 |
|:--------|:----------|:----------|
| type | string | string |
| details | | |

* Samples:

| sentence2 | sentence1 |
|:----------|:----------|
| All matter in the universe is composed of one or more unique pure substances called elements. | All matter in the universe is composed of one or more unique pure substances called what? |
| Corals build hard exoskeletons that grow to become coral reefs. | Corals build hard exoskeletons that grow to become what? |
| Insulin is made up of two polypeptide chains. | Insulin is made up of how many polypeptide chains? |

* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
), 'temperature': 0.05}
```

#### scitail-pairs-pos

* Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44)
* Size: 2,750 training samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:

| | sentence1 | sentence2 |
|:--------|:----------|:----------|
| type | string | string |
| details | | |

* Samples:

| sentence1 | sentence2 |
|:----------|:----------|
| Prokaryotes are organisms that lack a cell nucleus and the other membrane bound organelles. | Most organelles are not found in prokaryotic cells. |
| Vitamins and minerals are needed in small quantities for the adequate functioning of the body. | Vitamins are the organic compounds that the body needs in small amounts to function properly; humans need 13 different ones. |
| Saturn has a thick atmosphere made up of mostly hydrogen and helium. | Saturn is made mostly of helium and hydrogen. |

* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
), 'temperature': 0.05}
```

#### xsum-pairs

* Dataset: xsum-pairs
* Size: 3,000 training samples
* Columns: document and summary
* Approximate statistics based on the first 1000 samples:

| | document | summary |
|:--------|:---------|:--------|
| type | string | string |
| details | | |

* Samples:

| document | summary |
|:---------|:--------|
| It is claimed the incident took place after New Zealand's 30-12 defeat by Australia in Canberra on Friday. The allegations emerged in a court case and, although Melbourne Storm's Bromwich and Gold Coast Titans' Proctor were named, neither have been charged. NZRL says it will take immediate action if the allegations are proven. The court heard that a local man was captured on CCTV preparing a white powder on his phone. He then handed it to Bromwich and Proctor, who were said to have rolled up bank notes and taken the substance. "We are working with the NRL (the Australia-based National Rugby League) while investigations into the alleged incident are ongoing and New Zealand Rugby League will not be making any comment until more information becomes available," said an NZRL statement. The news came after Damian Keogh, chairman of NRL side Cronulla Sharks, stood down after being arrested for alleged drug possession. Keogh is a former basketball player for Australia and played in three Olympic Games. He is scheduled to appear in court on 30 June. New Zealand international Shaun Kenny-Dowall was also charged over allegations of drug possession in Sydney. | New Zealand Rugby League (NZRL) are investigating allegations national captain Jesse Bromwich and team-mate Kevin Proctor bought and took cocaine. |
| Madeleine Bridle said the wall, which runs behind gardens in a cul-de-sac, was "integral" to Blandford Forum. The town council had decided to replace the section with a wooden fence due to its poor condition. Historic England said the entire wall, cemetery gateway and two chapels had been granted a Grade ll listing. The wall, which was built in the mid-1800s, has been damaged by the roots of several lime and sycamore trees which are subject to preservation orders, Blandford Town Council said. The authority said it had made a failed attempt to list the chapels following an arson attack in September 2013. It had planned to replace the wall with a wooden fence at a cost of £13,525, some of which would be offset by the sale of the bricks. Ms Bridle, who lodged the application, said: "Why should we sell town property?" "We don't want a fence replacing this beautiful 19th Century brick wall. "It is integral to the character of the town." Blandford town clerk Linda Scott-Giles said the wall would now be preserved. She said a builder originally contracted to repair one section had given an estimate of £150,000 to rebuild the entire wall. She said: "This is money we do not have. I don't know what we're going to do. "We may have to put buttresses in people's gardens, but that's not something residents will want." | A Victorian cemetery wall has been saved from demolition after a Dorset resident succeeded in having it listed by Historic England. |
| The show, with singer Adam Lambert, will be the band's debut performance at a UK music festival and their only UK show in 2016, organisers said. Guitarist Brian May said former frontman Freddie Mercury "would have loved it". The rock legends will close the four-day festival at Seaclose Park, Newport, on 12 June. Queen drummer Roger Taylor said: "When I think of The Isle of Wight Festival I think of Hendrix, Dylan and The Who. What immortal company to be in. "Queen are thrilled to be there and can promise a special night." The band recently celebrated the 40th anniversary of their record-breaking worldwide hit single Bohemian Rhapsody. They are the first headliners to be announced for the festival which will be marking its 15th year since it relaunched in 2002. | Queen have been revealed as the Sunday night headliners for The Isle of Wight Festival next year. |

* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
```json
{'guide': SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
), 'temperature': 0.05}
```

#### sciq_pairs

* Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815)
* Size: 2,750 training samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:

| | sentence1 | sentence2 |
|:--------|:----------|:----------|
| type | string | string |
| details | | |

* Samples:

| sentence1 | sentence2 |
|:----------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | In experiments with garden peas, austrian monk gregor mendel described the basic patterns of what? | For thousands of years, humans have understood that characteristics such as eye color, hair color, or even flower color are passed from one generation to the next. 
The passing of characteristics from parent to offspring is called heredity . Humans have long been interested in understanding heredity. Many hereditary mechanisms were developed by scholars but were not properly tested or quantified. The scientific study of genetics did not begin until the late 19 th century. In experiments with garden peas, Austrian monk Gregor Mendel described the basic patterns of inheritance. Keep in mind that while we know about DNA and its role as the genetic material, Mendel did not know of the existence of DNA. Nor did he understand the concept of the chromosome or the process of meiosis, and yet, he was still able to correctly describe basic inheritance patterns. | | What is the most effective color in interrupting the nighttime portion of the photoperiod? | | | This process of combining the wave functions for atomic orbitals is called what? | Quantum-mechanical calculations suggest why the observed bond angles in H2O differ from those predicted by the overlap of the 1s orbital of the hydrogen atoms with the 2p orbitals of the oxygen atom. The mathematical expression known as the wave function, ψ, contains information about each orbital and the wavelike properties of electrons in an isolated atom. When atoms are bound together in a molecule, the wave functions combine to produce new mathematical descriptions that have different shapes. This process of combining the wave functions for atomic orbitals is called hybridization and is mathematically accomplished by the linear combination of atomic orbitals, LCAO, (a technique that we will encounter again later). The new orbitals that result are called hybrid orbitals. The valence orbitals in an isolated oxygen atom are a 2s orbital and three 2p orbitals. The valence orbitals in an oxygen atom in a water molecule differ; they consist of four equivalent hybrid orbitals that point approximately toward the corners of a tetrahedron (Figure 8.7). 
Consequently, the overlap of the O and H orbitals should result in a tetrahedral bond angle (109.5°). The observed angle of 104.5° is experimental evidence for which quantummechanical calculations give a useful explanation: Valence bond theory must include a hybridization component to give accurate predictions. Note that orbitals may sometimes be drawn in an elongated “balloon” shape rather than in a more realistic “plump” shape in order to make the geometry easier to visualize. | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### qasc_pairs * Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070) * Size: 2,750 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:---------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | 
|:-----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | What can HIV infect and destroy part of? | HIV infects and destroys helper T cells.. Helper T Cells Helper T cells are the brains behind immune response.. HIV infects and destroys part of the immune response. | | what does a renewable, economical source of electricity require? | hydropower requires damming a river. Hydropower is a renewable, economical source of electricity.. a renewable, economical source of electricity requires damming a river | | What may cause animals to fight towards members of their own species? | competition may cause animals to fight towards members of their own species. Competition Animals compete for food and shelter.. food and shelter may cause animals to fight towards members of their own species | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### openbookqa_pairs * Dataset: openbookqa_pairs * Size: 2,500 training samples * Columns: question and fact * Approximate statistics based on the first 1000 samples: | | question | fact | 
|:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | question | fact | |:-----------------------------------------------------------------------------|:--------------------------------------------------------------------------------------| | What is animal competition? | if two animals eat the same prey then those animals compete for that pey | | If you wanted to make a metal bed frame, where would you start? | alloys are made of two or more metals | | Places lacking warmth have few what | cold environments contain few organisms | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### msmarco_pairs * Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9) * Size: 2,750 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | | details | 
| | * Samples: | sentence1 | sentence2 | |:-----------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | what is bunco | 'Bunco, also known as Bonko or Bunko, is a popular game played with nine dice and a whole lot of luck. Play bunco at parties, with family, or with your 11 other friends that you got stranded on an island with. Follow these steps to learn how to play. | | what is the tropical zone | Report Abuse. The tropical zone is characterised by strongly monsoonal weather patterns, distinct wet and dry periods which are highly reliable, large quantities of rain and high intensity rainfall, high temperatures, and high rates of energy transformation. 
| | what is a potential drawback for having a student council | The advantages of having a student council are: 1 To create a positive school atmosphere; 2 To create a caring school environment, which is supportive and inclusive; 3 To act as a vehicle for student participation; 4 To have a beneficial impact on issues such as discipline, bullying and staff-student relations; | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### nq_pairs * Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17) * Size: 2,750 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | 
|:-----------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | link between get him to the greek and forgetting sarah marshall | Get Him to the Greek Get Him to the Greek is a 2010 American black comedy film written, produced and directed by Nicholas Stoller and starring Russell Brand and Jonah Hill. Released on June 4, 2010, the film is a spin-off sequel of Stoller's 2008 film Forgetting Sarah Marshall, reuniting director Stoller with stars Hill and Brand and producer Judd Apatow. 
Brand reprises his role as character Aldous Snow from Forgetting Sarah Marshall, while Hill plays an entirely new character referred to as Aaron Green instead of Matthew Van Der Wyk. The film also stars Elisabeth Moss, Rose Byrne, Sean "Diddy" Combs, and Colm Meaney. | | who said nothing short of state is the actualization of freedom | Fascism and ideology During the Enlightenment, a number of ideological influences arose that would shape the development of fascism. The development of the study of universal histories by Johann Gottfried Herder resulted in Herder's analysis of the development of nations, Herder developed the term Nationalismus ("nationalism") to describe this cultural phenomenon. At this time nationalism did not refer to the political ideology of nationalism that was later developed during the French Revolution.[24] Herder also developed the theory that Europeans are the descendants of Indo-Aryan people based on language studies. Herder argued that the Germanic peoples held close racial connections with the ancient Indians and ancient Persians, who he claimed were advanced peoples possessing a great capacity for wisdom, nobility, restraint and science.[25] Contemporaries of Herder utilized the concept of the Aryan race to draw a distinction between what they deemed "high and noble" Aryan culture versus that of "parasitic" Semitic culture and this anti-Semitic variant view of Europeans' Aryan roots formed the basis of Nazi racial views.[25][25] Another major influence on fascism came from the political theories of Georg Wilhelm Friedrich Hegel.[7] Hegel promoted the absolute authority of the state[7] and said "nothing short of the state is the actualization of freedom" and that the "state is the march of God on earth".[17] | | where does the mass number go in isotopic notation | Mass number The mass number is written either after the element name or as a superscript to the left of an element's symbol. 
For example, the most common isotope of carbon is carbon-12, or 12C, which has 6 protons and 6 neutrons. The full isotope symbol would also have the atomic number (Z) as a subscript to the left of the element symbol directly below the mass number: 12 6C.[2] This is technically redundant, as each element is defined by its atomic number, so it is often omitted. | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### trivia_pairs * Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0) * Size: 2,500 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | 
|:---|:---| | What is Frodo's second name? | Frodo Baggins | The One Wiki to Rule Them All | Fandom powered by Wikia Childhood Bilbo talking to Frodo before he goes off to meet with Gandalf the Grey Much of Frodo's youth was spent at Brandy Hall in Buckland , the ancestral home of the Brandybuck family, including his mother ( Primula Brandybuck ). Frodo was known as something of a rascal, befriending Meriadoc (Merry) Brandybuck and Peregrin (Pippin) Took and causing trouble wherever they went. They would often steal mushrooms from Farmer Maggot 's farm Bamfurlong . In TA 2980 , when Frodo was only 12 years old, his parents drowned in a boating accident on the Brandywine River . An only child, Frodo stayed in Brandy Hall until his 99-year-old "uncle" Bilbo Baggins adopted him in TA 2989 . Bilbo took Frodo to live with him in his home at Bag End and made him his heir. Frodo with Bilbo during his 111th birthday The two grew very close in the following years; Frodo learned much of the Elvish language during his time with Bilbo, as well as much of the lore of Middle-earth. The two shared the same birthday, September 22 by Shire Reckoning (around September 12–14 of our calendar), [1] and a party of special magnificence was held at the beginning of The Fellowship of the Ring when Frodo came of age of thirty-three and Bilbo hit the peculiar year of 111. Bilbo gave a memorable Birthday Speech before playing a joke on his fellow hobbits by using the One Ring to disappear, at which Gandalf quickly reacted and used his staff to create a blinding flash where Bilbo had been standing. The hobbits at the Party were left confused and disgruntled, and Bilbo was never again seen in the Shire.
Before departing for his journey to Rivendell, Bilbo had a long conversation with Gandalf, who finally persuaded him to voluntarily surrender the One Ring. Bilbo left it on the fireplace mantel with a note for Frodo, who would now become the next Ring-bearer. Coming of Age and Quest Beginning Gandalf telling Frodo the story about the One Ring After the party finished, Frodo returned home and discovered that he was now the master of Bag End and the recipient of Bilbo's magic ring. Gandalf , ever more curious about the ring's origin, power, and purpose (but not yet positive it was the One Ring), advised the young hobbit against the using the ring. For the next seventeen years, Frodo complied with the wizard 's request and hid the Ring in a safe place. However, on April 12 , 3018 , Gandalf returned to Bag End and warned Frodo that the Ring was actually the One Ring, which the evil lord Sauron needed to rule over Middle-earth. Realizing that Sauron would be looking for the Ring, Gandalf advised the Hobbit to secretly follow Bilbo's journey to Rivendell. After Frodo's discussion with Gandalf, a rumor started that he was running out of money. This rumor, although not begun by Frodo, was encouraged by him. Merry helped Frodo to purchase a small house at Crickhollow . With the exception of his gardener Sam Gamgee , who had agreed to accompany him to Rivendell , Frodo told the other Hobbits of the Shire that he intended to move to Buckland . He sold his home to the Sackville-Baggins , and, on the September 23, 3018, the day after his fiftieth birthday, Frodo left from Bag End, taking with him Sam and Pippin. They left in the early morning for Bree , and just in time, as Sauron's most powerful servants, the nine Nazgûl , had entered the Shire dressed as Black riders searching for a hobbit with the name of Baggins. 
To Bree Frodo was unable to find much information about his pursuers from his conversations with the High Elves and Farmer Maggot , but what they were told was less than encouraging. When Frodo arrived at Buckland, where Merry was waiting, he found that Merry and Pippin already knew about Frodo's "secret" journey. Frodo was left with no alternative but to bring the two youngsters with him. They cut through the Old Forest and the Barrow-downs in hopes of losing the Black Riders, which did succeed. They met other troubles in those places though, at the hands of Old Man Willow and the Barrow-Wi | | Israel was proclaimed an independent state in 1948. Who was its prime minister from then until 1963? | The Declaration of the State of Israel The Declaration of the State of Israel May 14, 1948 donations Introduction As the British forces pulled out of Palestine and the mandate came to an end, the Executive Committee of the Jewish "Yishuv" (community) in Palestine met to decide whether or not to declare a state, as has been envisioned under UN Resolution 181. The Arab states had declared that if such a state was declared, they would invade it. Nonetheless, the committee decided to declare a state, armed with the promise of US President Harry S. Truman that he would recognize such a state if it was declared. The Israeli Declaration of Independence was read out on Friday, the 14th of May 1948 by   David Ben Gurion, who then became the first Prime Minister of the new state. The State was quickly recognized by the United States and the USSR. The Palestinians did not declare a state immediately, and though several attempts were made to do so, they were blocked by the Jordanians and then by the Egyptians. The Egyptians later allowed the declaration of such a state in Gaza in September 1948, but it was recognized by no-one and had no resources and no real existence. 
Arab states had no interest in the formation of a separate state in Palestine, both because each state had territorial ambitions in Palestine, and because they feared the radical influence of Palestinian leadership under Haj Amin El-Husseini, the Grand Mufti of Jerusalem. The declaration stated that Israel  "will uphold the full social and political equality of all its citizens, without distinction of race, creed or sex; will guarantee full freedom of conscience, worship, education and culture; will safeguard the sanctity and inviolability of the shrines and Holy Places of all religions; and will dedicate itself to the principles of the Charter of the United Nations. " The last sentence of the declaration refers to "the rock of Israel" (tsur Yisrael). This is one of the synonyms for God used in Hebrew. According to Tom Segev, in The First Israelis, the wording represents a compromise between the demand of Moshe Shapira representing the religious party that the declaration incorporate a reference to the Lord of Israel, and the demand of the leftist Mapam party representative that the declaration must not incorporate such a reference. The compromise formula made it possible to approve the declaration and publish it before the Sabbath and before the British left the country. May 15, 1948 was a Sabbath. David Ben Gurion, the first Prime Minister, who was a deist or possibly a polite atheist, was agreeable to this compromise. He said on other occasion that for him "the rock of Israel" was the Old Testament with its history and traditions.  Ami Isseroff Notice - Copyright This introduction is Copyright 2001-2003 by MidEastWeb http://www.mideastweb.org and the author. Please tell your friends about MidEastWeb and link to this page. Please do not copy this page to your Web site. You may print this page out for classroom use provided that this notice is appended, and you may cite this material in the usual way. Other uses by permission only.  
The source material below is placed in the public domain  and is free of copy restrictions. MidEastWeb is a non-profit organization dedicated to promoting peace and coexistence in the Middle East. We provide balanced and complete information, news and views to promote understanding and dialog. We cannot continue without your help! If peace in the Middle East is important to you, please help us by making a tax-deductible donation . If you don't help us, who will? Thank you!   Declaration of Israel's Independence 1948 Issued at Tel Aviv on May 14, 1948 (5th of Iyar, 5708) The land of Israel was the birthplace of the Jewish people. Here their spiritual, religious and national identity was formed. Here they achieved independence and created a culture of national and universal significance. Here they wrote and gave the Bible to the world. Exiled from Palestine, the Jewish people remained faithful to it in all the | | What was the first artificial satellite? | Sputnik NASA Main Page Multimedia Interactive Feature on 50th Anniversary of the Space Age Sputnik and The Dawn of the Space Age History changed on October 4, 1957, when the Soviet Union successfully launched Sputnik I. The world's first artificial satellite was about the size of a beach ball (58 cm.or 22.8 inches in diameter), weighed only 83.6 kg. or 183.9 pounds, and took about 98 minutes to orbit the Earth on its elliptical path. That launch ushered in new political, military, technological, and scientific developments. While the Sputnik launch was a single event, it marked the start of the space age and the U.S.-U.S.S.R space race.  The story begins in 1952, when the International Council of Scientific Unions decided to establish July 1, 1957, to December 31, 1958, as the International Geophysical Year (IGY) because the scientists knew that the cycles of solar activity would be at a high point then. 
In October 1954, the council adopted a resolution calling for artificial satellites to be launched during the IGY to map the Earth's surface.  In July 1955, the White House announced plans to launch an Earth-orbiting satellite for the IGY and solicited proposals from various Government research agencies to undertake development. In September 1955, the Naval Research Laboratory's Vanguard proposal was chosen to represent the U.S. during the IGY.  The Sputnik launch changed everything. As a technical achievement, Sputnik caught the world's attention and the American public off-guard. Its size was more impressive than Vanguard's intended 3.5-pound payload. In addition, the public feared that the Soviets' ability to launch satellites also translated into the capability to launch ballistic missiles that could carry nuclear weapons from Europe to the U.S. Then the Soviets struck again; on November 3, Sputnik II was launched, carrying a much heavier payload, including a dog named Laika.  Immediately after the Sputnik I launch in October, the U.S. Defense Department responded to the political furor by approving funding for another U.S. satellite project. As a simultaneous alternative to Vanguard, Wernher von Braun and his Army Redstone Arsenal team began work on the Explorer project.  On January 31, 1958, the tide changed, when the United States successfully launched Explorer I. This satellite carried a small scientific payload that eventually discovered the magnetic radiation belts around the Earth, named after principal investigator James Van Allen. The Explorer program continued as a successful ongoing series of lightweight, scientifically useful spacecraft.  The Sputnik launch also led directly to the creation of National Aeronautics and Space Administration (NASA). 
In July 1958, Congress passed the National Aeronautics and Space Act (commonly called the "Space Act") , which created NASA as of October 1, 1958 from the National Advisory Committee for Aeronautics (NACA) and other government agencies.  Updated October 10, 2007 | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### gooaq_pairs * Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c) * Size: 2,500 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:--------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | what is the main responsibility of a registered nurse? 
| Registered Nurse Responsibilities: Maintaining accurate, complete health care records and reports. Administering medications to patients and monitoring them for side effects and reactions. Prescribing assistive medical devices and related treatments. Recording patient vital signs and medical information. | | how to calculate your salary increase percentage? | ['First, determine the difference between their old and new salary: $52,000 – $50,000 = $2,000.', 'Next, divide the raise amount by their old salary: $2,000 / $50,000 = . ... ', 'To turn the decimal into a percentage, multiply by 100: 100 X . 04 = 4%'] | | how does hodgkin lymphoma affect the body? | Hodgkin lymphoma most often spreads through the lymph vessels from lymph node to lymph node. Rarely, late in the disease, it can invade the bloodstream and spread to other parts of the body, such as the liver, lungs, and/or bone marrow. | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### paws-pos * Dataset: [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) at [161ece9](https://huggingface.co/datasets/google-research-datasets/paws/tree/161ece9501cf0a11f3e48bd356eaa82de46d6a09) * Size: 3,250 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | 
|:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:-------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------| | He was drafted by the Chicago Cardinals and also played for the Philadelphia Eagles and the Washington Redskins . | He was drafted by the Chicago Cardinals and played for the Washington Redskins and the Philadelphia Eagles . | | A Jewish full fast takes the following night from sunset to darkness : there are two Jewish full days : | A Jewish full fast lasts from sunset to darkness the following night . There are two Jewish full days : | | Chad Ochocinco ( born 1978 ; formerly Chad Johnson ) is an American football wide receiver . | Chad Ochocinco ( born 1978 ; formerly Chad Johnson ) is an American - American - football receiver . 
| * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` ### Evaluation Datasets #### vitaminc-pairs * Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0) * Size: 108 evaluation samples * Columns: claim and evidence * Approximate statistics based on the first 1000 samples: | | claim | evidence | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | claim | evidence | |:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | Dragon Con had over 5000 guests . | Among the more than 6000 guests and musical performers at the 2009 convention were such notables as Patrick Stewart , William Shatner , Leonard Nimoy , Terry Gilliam , Bruce Boxleitner , James Marsters , and Mary McDonnell . | | COVID-19 has reached more than 185 countries . 
| As of , more than cases of COVID-19 have been reported in more than 190 countries and 200 territories , resulting in more than deaths . | | In March , Italy had 3.6x times more cases of coronavirus than China . | As of 12 March , among nations with at least one million citizens , Italy has the world 's highest per capita rate of positive coronavirus cases at 206.1 cases per million people ( 3.6x times the rate of China ) and is the country with the second-highest number of positive cases as well as of deaths in the world , after China . | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### negation-triplets * Dataset: [negation-triplets](https://huggingface.co/datasets/jinaai/negation-dataset-v2) * Size: 64 evaluation samples * Columns: anchor, entailment, and negative * Approximate statistics based on the first 1000 samples: | | anchor | entailment | negative | |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | string | | details | | | | * Samples: | anchor | entailment | negative | 
|:------------------------------------------------------------------------------------|:------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | A clean white bathroom with a simple mirror above the vanity. | A bathroom with a white sink and mirror. | A bathroom with a black sink and mirror. | | Many sheep grazing in a large, green pasture. | Many sheep graze in a grassy pasture in a valley. | Few sheep graze in a grassy pasture in a valley. | | A group of older people sitting on a park bench with a dog on a leash. | Three elderly people on a bench gazing into the middle distance. | Three elderly people not on a bench gazing into the middle distance. | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### scitail-pairs-pos * Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44) * Size: 54 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | 
sentence1 | sentence2 | |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------| | humans normally have 23 pairs of chromosomes. | Humans typically have 23 pairs pairs of chromosomes. | | A solution is a homogenous mixture of two or more substances that exist in a single phase. | Solution is the term for a homogeneous mixture of two or more substances. | | Upwelling The physical process in near-shore ocean systems of rising of nutrients and colder bottom waters to the surface because of constant wind patterns along the shoreline. | Upwelling is the term for when deep ocean water rises to the surface. | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### xsum-pairs * Dataset: xsum-pairs * Size: 128 evaluation samples * Columns: document and summary * Approximate statistics based on the first 1000 samples: | | document | summary | |:--------|:-------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | document | summary | 
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------| | Wallace made 
49 appearances for the Owls last season, having joined the club on a one-year deal in July 2015 following his release by Burnley.
The 31-year-old, who also had spells with Preston, Sunderland and Celtic, scored six goals last term.
Dundee-born Wallace won his only senior Scotland cap in a 2-0 defeat by Japan in October 2009.
| Sheffield Wednesday winger Ross Wallace has signed a new contract to stay with the Championship club until 2018. | | Media playback is not supported on this device
The Swans were bottom of the table before beating fellow strugglers Sunderland 3-0 on Saturday.
Bradley had been under pressure having won only one of his first seven games in charge - but insisted he was not worried about his own future.
"It's not about me, it's about the work," Bradley said.
"I don't spend all week worrying about myself. I only know one way to work, and that's to think about the team, engage the staff, engage the players.
"Criticism is part of the job for a manager in the Premier League. I don't think I was the only one to be criticised in the last week."
The win against Sunderland was a fine response from Bradley and his players after they were humiliated 5-0 at Tottenham a week earlier.
Victory over the Black Cats means Swansea are now above the Premier League's bottom three by virtue of goal difference.
"We did get a good response. The players deserve full credit. That's the part of the job, a result gets a little bit out of hand, you can cry about it but you have to look at it in a strong way," said Bradley.
"This is a step but we have to build upon it, there's still plenty of work to do. It's a nice bonus to be out of the bottom three, but the work is still there and we can't get ahead of ourselves.
"The word many players used when we talked this week was 'pride' and the only thing I did was I tried to get back at them and say: 'What does pride look like actually on the pitch?'
"Pride has to turn into intensity, pride has to turn into clean sheets. Don't just talk about pride - put it into something more. At the end of that, for a few seconds you can look at the table and say you're not there yet, but it looks better than last week and we can continue to move forward."
| Swansea City manager Bob Bradley has warned his side they still have "plenty of work to do" despite climbing out of the Premier League relegation zone. | | The level indicates that Americans expect the economy to remain strong through the second half of the year.
According to the Conference Board, which tracks consumer sentiment, the index reading for August was 101.1, up from 96.7 in July.
The index has not reached such a high point since September 2015.
"Consumers' assessment of both current business and labour market conditions was considerably more favourable than last month," said Lynn Franco, the Conference Board's head of economic indicators.
"Short-term expectations regarding business and employment conditions, as well as personal income prospects, also improved, suggesting the possibility of a moderate pick-up in growth in the coming months."
Increases in consumer confidence typically indicate more people are willing to spend money. As more than two-thirds of the US economy is generated by consumer spending, the increase signals likely economic growth.
The percentage of Americans who expect business conditions to continue to improve in the next six months rose from 15.7% to 17.3%, while the number who expected worsening conditions fell from 12.4% to 11.1%.
The figure beat the expectations of analysts, who predicted consumer confidence would stand at 97 on the index.
Earlier on Tuesday, the Federal Reserve's vice-chairman, Stanley Fischer, said in an interview with Bloomberg News that the US labour market was close to full strength.
Mr Fischer did not say whether the improvements in the labour market meant the Fed would increase interest rates at its upcoming meeting in September.
| US consumer confidence reached its highest point in nearly a year in August as economic conditions continue to improve. | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### sciq_pairs * Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815) * Size: 128 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:---------------------------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | How many forces do objects on earth have acting on them at all times? 
| More than one force may act on an object at the same time. In fact, just about all objects on Earth have at least two forces acting on them at all times. One force is gravity, which pulls objects down toward the center of Earth. The other force is an upward force that may be provided by the ground or other surface. | | A rusty bike has been left outside in damp weather too many times, so the iron in the metal parts have? | Look at this rusty bike. It has been left outside in damp weather too many times, so the iron in the metal parts has rusted. Iron rusts when it combines with oxygen in the air. Iron rusting is an example of a chemical reaction. In a chemical reaction, substances change into entirely different substances. For example, the iron in the bike and the oxygen in the air have changed into rust. | | What are the smallest type of blood vessel? | Further away from the heart, the aorta branches into smaller arteries, which eventually branch into capillaries. Capillaries are the smallest type of blood vessel; they connect very small arteries and veins. Gases and other substances are exchanged between cells and the blood across the very thin walls of capillaries. 
| * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### qasc_pairs * Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070) * Size: 128 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:----------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------| | what have a circulatory system? | Mollusks have a circulatory system with one or two hearts that pump blood.. Mussels are bivalve mollusks.. mussels have a circulatory system | | what can the eye sense? | Sight is the ability to sense light, and the eye is the organ that senses light.. Colors use the sense of sight.. eyes can sense colors | | If a person does what it may be due to a pathogen? | bacteria can cause people to become ill. Bacteria that cause disease are called pathogens.. 
If a person falls ill it may be due to a pathogen | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### openbookqa_pairs * Dataset: openbookqa_pairs * Size: 128 evaluation samples * Columns: question and fact * Approximate statistics based on the first 1000 samples: | | question | fact | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | question | fact | |:-----------------------------------------------------------------------|:-----------------------------------------------------------------------------| | The thermal production of a stove is generically used for | a stove generates heat for cooking usually | | What creates a valley? | a valley is formed by a river flowing | | when it turns day and night on a planet, what cause this? 
| a planet rotating causes cycles of day and night on that planet | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### msmarco_pairs * Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9) * Size: 128 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:----------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | where 
is the rib king located | Carolina Rib King is located in Georgetown, South Carolina. This organization primarily operates in the Eating Places business / industry within the Eating and Drinking Places sector. This organization has been operating for approximately 8 years. | | what dosage does vyvanse come in | Vyvanse (Lisdexamfetamine) Dosage. Vyvanse comes in capsules of 10 milligrams (mg), 20 mg, 30 mg, 40 mg, 50 mg, 60 mg, and 70 mg. A typical starting dose for adults is 30 mg every morning. Your doctor will monitor your results every week or so and may adjust your dose by 10 to 20 mg, depending on your response to it.yvanse is the brand name of the prescription drug lisdexamfetamine dimesylate. Vyvanse is used to treat attention deficit hyperactivity disorder (ADHD) in children and adults and binge-eating disorder (BED) in adults. | | what is a rose engine? | g{x `ÉwxÜÇ eÉáx. Definition: “A rose engine lathe is a specialized kind of ornamental lathe. The headstock rocks back. and forth, controlled by a rubber moving against a rosette or cam-like pattern mounted on. the spindle, while the lathe spindle rotates. Rose engine work can make flower patterns, as. 
| * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### nq_pairs * Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17) * Size: 128 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | 
|:---------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | how many episodes does the last ship have | List of The Last Ship episodes On August 11, 2015, The Last Ship was renewed for a 13-episode third season,[4] which was scheduled to premiere on June 12, 2016, but postponed following the 2016 Orlando nightclub shooting due to the plot of the episode also containing a mass shooting in a nightclub.[5][6] As of October 8, 2017,[update] 46 episodes of The Last Ship have aired. 
In September 2016, TNT renewed the series for a 10-episode fifth season.[7] | | who was given the task of illuminating the original document of constitution of india | Constitution of India The assembly met in sessions open to the public, for 166 days, spread over a period of 2 years, 11 months and 18 days before adopting the Constitution, the 308 members of the assembly signed two copies of the document (one each in Hindi and English) on 24 January 1950. The original Constitution of India is hand-written with beautiful calligraphy, each page beautified and decorated by artists from Shantiniketan including Beohar Rammanohar Sinha and Nandalal Bose. The illustrations on the cover and pages represent styles from the different civilisations of the subcontinent, ranging from the prehistoric Mohenjodaro civilisation, in the Indus Valley, to the present. The calligraphy in the book was done by Prem Behari Narain Raizda. It was published in Dehra Dun, and photolithographed at the offices of Survey of India. The entire exercise to produce the original took nearly five years. Two days later, on 26 January 1950, the Constitution of India became the law of all the States and territories of India.[17] Rs.1,00,00,000 was official estimate of expenditure on constituent assembly. It has undergone many amendments since its enactment.[18] | | what amendments were added after the bill of rights | List of amendments to the United States Constitution Thirty-three amendments to the United States Constitution have been proposed by the United States Congress and sent to the states for ratification since the Constitution was put into operation on March 4, 1789. Twenty-seven of these, having been ratified by the requisite number of states, are part of the Constitution. The first ten amendments were adopted and ratified simultaneously and are known collectively as the Bill of Rights. 
Six amendments adopted by Congress and sent to the states have not been ratified by the required number of states. Four of these amendments are still technically open and pending, one is closed and has failed by its own terms, and one is closed and has failed by the terms of the resolution proposing it. | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### trivia_pairs * Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0) * Size: 128 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | 
|:---|:---| | In which country did Argentina first win soccer's World Cup? | World Cup winners list: A complete history - SBNation.com World Cup winners list: A complete history Rec Dean Mouhtaropoulos In 1930, thirteen teams participated in the first World Cup held in Uruguay. Since then, the countries of the world have come together every four years (except in the 1940's-yes Germany, looking at you here) to play in the tournament, with 77 countries having participated in 20 tournaments as of 2014. Despite, the large number of countries to participate, only eight of them have enjoyed the glory of actually winning it. Brazil are on the top with five (don't mention this to Brazilians right now, though), and Germany are next on the list with four, their most recent having been secured against Argentina on Sunday. Here's a quick tour of each winning nation. Brazil 2014: Germany Germany became the first ever European team to win a World Cup in South America, and lifted the trophy for the first time since reunification. Fittingly, in a tournament in which nothing was predictable, Germany didn't look completely convincing en route to their final against Argentina, and notably needed extra time to get past the unfavoured Algeria in the first knockout round. However, Die Mannschaft grew into the tournament, and inflicted a historic 7-1 thrashing on tournament hosts Brazil in the semis before Mario Götze's last-gasp extra time strike settled a close final. Argentina captain Lionel Messi earned the Golden Ball as a consolation which was really none at all. South Africa 2010: Spain The Spanish team in 2010 was special, which makes its early exit in Brazil even more of a mystery. 
In South Africa, Andrés Iniesta scored in the 116th minute agaist the Netherlands to give Spain their first World Cup. Six members of the team, along with their coach Vincente del Bosque, were voted onto the team of the tournament. Iker Casillas, the goalkeeper, won the Golden Glove award (previously the Yashin Award), shutting out his opponents in five of the seven matches. The team also won the FIFA Fair Play Trophy. Germany 2006: Italy Italy's victory over France in the final was one for the memories. Not only did Italy win 5-3 on penalty kicks, but France's captain Zinedine Zidane was red-carded for head-butting Marco Materazzi in extra-time.  Italy's goalkeeper, Gianluigi Buffon won the Yashin Award given to the best goalkeeper, and was one of seven Italian players voted to the All-Star team. The victory gave Italy their fourth World Cup title, then second only to Brazil's five, but matched by Germany this year. Korea-Japan 2002: Brazil This World Cup was Ronaldo's World Cup. The old one. The Brazilian striker won the Golden Boot award (highest scoring player), scoring eight goals in the tournament. Two of those came in the final, as Brazil shut out Germany 2-0 and won their record fifth World Cup. Ronaldo was voted to the team of the tournament along with teammates Rivaldo, Ronaldinho, and Roberto Carlos finished with a 7-0-0 record and a plus-14 goal differential. France 1998: France If you think the header is a typo, you are mistaken! When France won the tournament in France they became the sixth country to win the tournament on home soil. France's goalkeeper won the inaugural Yashin Award, letting in only two goals, and eight French players scored in the tournament. Zinedine Zidane headlined the French attack, as France ended with a plus-13 goal differential. They were also given the FIFA Fair Play Trophy and voted the Most Entertaining Team. USA 1994: Brazil When Brazil faced Italy in the '94 final both teams were looking for their record fourth title. 
Brazil defeated Italy 3-2 on penalty kicks, becoming the first country to win the final via a shootout. Romário scored five goals and won the Golden Ball award (best player), and Brazil won the FIFA Fair Play Trophy and was voted the Most Entertaining Team. On a side note, the US chose this as the the mascot for the tournament. #Fifa #WorldCup World Cup In honor of this amazing month of soccer, #tbt World Cup '94 with the mascot Striker #b ... pic.twitter.com/ri1nVPC4iT — FI | | When Elisha Graves Otis invented it, he called it the safety hoist. What do we call it now? | Inventor Elisha Otis Biography Inventor: Elisha Graves Otis Criteria; First to invent. First to patent. First practical. Entrepreneur. Birth: August 3, 1811 in Halifax, Vermont Death: April 8, 1861 in Yonkers, New York Nationality: American Invention: elevator, safety brake in 1852 Function: noun / el�e�va�tor Definition: A platform or an enclosure raised and lowered in a vertical shaft to transport people or freight. The shaft contains the operating equipment, motor, cables, and accessories. Patent: 31,128 (US) issued January 15, 1861 Milestones: 1852 invents a safety latch for hoisting equipment 1853 starts a company to manufacture safe elevators. 
Sells elevator to hoist freight 1854 Otis demonstrates the elevator at the World's Fair, Crystal Palace exposition in New York City 1857 Installs the first passenger safe elevator in a New York department store 1861 receives patent for improvements to hoisting apparatus, safety brake 1861 after his death his sons form Otis Brothers & Company 1873 over 2,000 Otis elevators were in use in office buildings, hotels and department stores 1898 Otis Brothers merged with 14 other elevator entities to form the Otis Elevator Company 1903 introduced the gearless traction electric elevator 1931 first Otis double-deck elevator was installed elevator, safety elevator, safety brake for elevators, elisha graves otis, otis elevatorm UTC, patent 31128, invention, history, inventor of, history of, who invented, invention of, fascinating facts. The Story: Imagine the skyline of a modern city if the elevator did not exist. Buildings would be limited to five or six stories. Most of the architecture of the 20th and 21st century would be impossible. Office towers, hotels and high-rise apartments would hardly stand in their present form. The need for vertical transport is as old as civilization. Over the centuries, mankind has employed ingenious forms of lifting. The earliest lifts used man, animal and water power to raise the load. Lifting devices relied on these basic forms of power from the early agricultural societies until the dawn of the Industrial Revolution. From ancient times through the Middle Ages, and into the 13th century, man or animal power was the driving force behind hoisting devices. In ancient Greece, Archimedes developed an improved lifting device operated by ropes and pulleys, in which the hoisting ropes were coiled around a winding drum by a capstan and levers. By A.D. 80, gladiators and wild animals rode crude elevators up to the arena level of the Roman Coliseum. Medieval records contain numerous drawings of hoists lifting men and supplies to isolated locations. 
Among the most famous is the hoist at the monastery of St. Barlaam in Greece. The monastery stood on a pinnacle approximately (200 ft) above the ground. Its hoist, which employed a basket or cargo net, was the only means up or down. The first elevator designed for a passemger was built in 1743 for King Louis XV at his palace in France. The one-person contraption went up only one floor, from the first to the second. Known as the "Flying Chair," it was on the outside of the building, and was entered by the king via his balcony. The mechanism consisted of a carefully balanced arrangement of weights and pulleys hanging inside a chimney. Men stationed inside the chimney then raised or lowered the Flying Chair at the king's command.  By 1850 steam and hydraulic elevators had been introduced, but it was in 1852 that the landmark event in elevator history occurred: the invention of the world's first safety elevator by Elisha Graves Otis. The first passenger elevator was installed by Otis in New York in 1857. After Otis' death in 1861, his sons, Charles and | | What colour of flag should a ship fly to show it is in quarantine? | Quarantine Flag Quarantine flag Posted to Maritime Musings (by Dennis Bryant ) on January 6, 2012 A visible warning to stay clear. The quarantine flag, also called the “Yellow Jack”, is the international signal flag LIMA.  It is square in shape.  Its display is divided into four smaller squares, with two on top and two on the bottom.  The smaller squares are alternately yellow and black in color.  The flag is flown from a ship that is either arriving in port with known serious health problems or that has been placed under quarantine by the local port authorities.  Once the local authorities have determined that the ship’s health problems have been resolved and removed the quarantine order, the ship may fly the free pratique flag (e.g., the international signal flag QUEBEC), which is solid yellow.  
The concept of quarantine is ancient and is mentioned in the Old Testament.  The term itself is derived from the practice of the city-state of Venice during the Middle Ages of requiring ships arriving from locations known to being experiencing diseases such as the plague to anchor or moor off the port for 40 days (quaranta giorni) so that any disease on board might run its course.  The practice of quarantine has varied over the centuries, but the concept of protecting the public health by restricting the movements of individuals who are suspected of possibly harboring serious disease has remained constant.  The World Health Organization (WHO) provides guidelines on how and when quarantine should be used, but its actual implementation is left to the discretion of individual nations.  In the United States, the Center for Disease Control and Prevention (CDC) administers the federal quarantine program, but the separate states and local communities also have broad powers.  Ships arriving in a US port with serious disease on board are required to provide advance notification.  The ship may be required to undertake certain sanitary measures and to exercise various controls over all persons on board to prevent them from serving as disease vectors potentially infecting the local populace.  The closest we have come recently to a general quarantine affecting the maritime industry was during the 2002 SARS epidemic, which heavily impacted southeast Asia.  A future pandemic, whether the result of avian flu or otherwise, may see widespread implementation of quarantine measures and flying of the quarantine flag. 
| * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### gooaq_pairs * Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c) * Size: 128 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:-------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | what is the difference between lemon eucalyptus oil and eucalyptus oil? | What Is the Difference between Eucalyptus Oil and Lemon Eucalyptus Oil? Lemon Eucalyptus oil comes from a different kind of tree than Eucalyptus oil. 
Lemon eucalyptus is a common nickname for the tree, but it is also called the lemon-scented gum and blue spotted gum. Despite its name, Lemon Eucalyptus is not a citrus. | | pokemon sword and shield will pokemon follow you? | Any Pokémon that can be used in Sword and Shield is eligible to follow you around, even Legendaries. This feature only works while you're inside the Isle of Armor area, however. Once you return to Galar proper, the Pokémon will return to its ball. | | how long does it take to get a naturalization certificate replacement? | After filing Form N-565, Application for Replacement Naturalization/Citizenship Document, the N-565 processing time will take 5-12 months in most cases. | * Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters: ```json {'guide': SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ), 'temperature': 0.05} ``` #### paws-pos * Dataset: [paws-pos](https://huggingface.co/datasets/google-research-datasets/paws) at [161ece9](https://huggingface.co/datasets/google-research-datasets/paws/tree/161ece9501cf0a11f3e48bd356eaa82de46d6a09) * Size: 128 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | 
|:---|:---|
| They were there to enjoy us and they were there to pray for us . | They were there for us to enjoy and they were there for us to pray . |
| After the end of the war in June 1902 , Higgins left Southampton in the `` SSBavarian '' in August , returning to Cape Town the following month . | In August , after the end of the war in June 1902 , Higgins Southampton left the `` SSBavarian '' and returned to Cape Town the following month . |
| From the merger of the Four Rivers Council and the Audubon Council , the Shawnee Trails Council was born . | Shawnee Trails Council was formed from the merger of the Four Rivers Council and the Audubon Council . |
* Loss: [CachedGISTEmbedLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedgistembedloss) with these parameters:
  ```json
  {'guide': SentenceTransformer(
    (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
    (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
    (2): Normalize()
  ), 'temperature': 0.05}
  ```

### Training Hyperparameters

#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 224
- `per_device_eval_batch_size`: 64
- `gradient_accumulation_steps`: 5
- `learning_rate`: 4e-05
- `weight_decay`: 0.0001
- `num_train_epochs`: 1
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 1e-05}
- `warmup_ratio`: 0.33
- `save_safetensors`: False
- `fp16`: True
- `push_to_hub`: True
- `hub_model_id`: bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `batch_sampler`: no_duplicates

#### All Hyperparameters
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 224
- `per_device_eval_batch_size`: 64
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 5
- `eval_accumulation_steps`: None
- `learning_rate`: 4e-05
- `weight_decay`: 0.0001
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_min_lr
- `lr_scheduler_kwargs`: {'num_cycles': 0.5, 'min_lr': 1e-05}
- `warmup_ratio`: 0.33
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTa-small-ST-v1-toytest-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional
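For reference, the learning-rate schedule implied by the hyperparameters above (peak `learning_rate: 4e-05`, `warmup_ratio: 0.33`, `lr_scheduler_type: cosine_with_min_lr` with `num_cycles: 0.5` and `min_lr: 1e-05`) can be sketched in plain Python. This is an illustrative reconstruction of the shape of the schedule, not the exact `transformers` implementation:

```python
import math

def lr_at(step, total_steps, base_lr=4e-05, min_lr=1e-05,
          warmup_ratio=0.33, num_cycles=0.5):
    """Approximate the schedule used for this run: linear warmup over
    the first warmup_ratio of training, then a cosine decay from
    base_lr down to min_lr over the remaining steps."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # Linear warmup from 0 to base_lr.
        return base_lr * step / max(1, warmup_steps)
    # Cosine decay: progress goes 0 -> 1 over the post-warmup steps,
    # and with num_cycles=0.5 the cosine sweeps from 1 down to 0.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    cosine = 0.5 * (1.0 + math.cos(math.pi * num_cycles * 2.0 * progress))
    return min_lr + (base_lr - min_lr) * cosine
```

With 1 epoch of training, the rate peaks at `4e-05` a third of the way through and ends at the `1e-05` floor rather than decaying to zero.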
### Training Logs

| Epoch | Step | Training Loss | openbookqa pairs loss | qasc pairs loss | xsum-pairs loss | msmarco pairs loss | trivia pairs loss | negation-triplets loss | sciq pairs loss | nq pairs loss | gooaq pairs loss | scitail-pairs-pos loss | vitaminc-pairs loss | paws-pos loss | NLI-v2_max_accuracy | VitaminC_max_ap | sts-test_spearman_cosine |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.0291 | 1 | 6.7536 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0581 | 2 | 6.6203 | 4.7439 | 3.9689 | 6.3278 | 10.5136 | 3.8610 | 5.0942 | 0.3654 | 4.9690 | 8.0411 | 1.9184 | 2.7266 | 2.2190 | 1.0 | 0.5178 | 0.0712 |
| 0.0872 | 3 | 6.7963 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1163 | 4 | 6.4488 | 4.6508 | 3.6622 | 6.1990 | 9.4879 | 3.5246 | 5.0816 | 0.3414 | 4.4714 | 7.3951 | 1.9187 | 2.7045 | 2.2332 | 1.0 | 0.5220 | 0.0777 |
| 0.1453 | 5 | 6.5567 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1744 | 6 | 7.994 | 4.4811 | 3.3633 | 6.0795 | 8.0488 | 3.2845 | 5.0681 | 0.3208 | 3.7927 | 6.6778 | 1.9320 | 2.6922 | 2.2626 | 1.0 | 0.5220 | 0.0909 |
| 0.2035 | 7 | 7.1037 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2326 | 8 | 6.6239 | 4.3260 | 3.1746 | 6.0509 | 6.9898 | 3.2417 | 5.0856 | 0.3155 | 3.3527 | 6.2884 | 1.9701 | 2.7007 | 2.3511 | 1.0 | 0.5262 | 0.1031 |
| 0.2616 | 9 | 6.7359 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2907 | 10 | 7.0187 | 4.2138 | 3.0288 | 5.9589 | 6.4430 | 3.1168 | 5.1371 | 0.3123 | 3.1352 | 6.0863 | 2.0432 | 2.7152 | 2.5095 | 1.0 | 0.5267 | 0.1129 |

### Framework Versions
- Python: 3.10.13
- Sentence Transformers: 3.0.1
- Transformers: 4.42.3
- PyTorch: 2.1.2
- Accelerate: 0.32.1
- Datasets: 2.20.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```