Joemgu committed on
Commit d01eb27
1 Parent(s): 75bbd44

Create README.md

Files changed (1)
  1. README.md +44 -0
README.md ADDED
# Pegasus-x-sumstew

## Model description
This model is a version of Pegasus-x-large fine-tuned on a filtered subset of a mixture of the CNN-Dailymail, Samsum, Booksum and Laysum datasets. It can generate abstractive summaries of long texts.

## Intended uses & limitations
This model can be used for summarizing long texts in English, such as academic transcripts, meeting minutes, or literature. It is not intended for summarizing short texts, such as tweets, headlines, or captions. The model may produce inaccurate or biased summaries if the input text contains factual errors, slang, or offensive language.

## How to use
You can use this model with the `pipeline` function from the `transformers` library:

```python
from transformers import pipeline, AutoTokenizer

# Load the tokenizer and the summarization pipeline for this model
tokenizer = AutoTokenizer.from_pretrained("joemgu/pegasus-x-sumstew")
summarizer = pipeline("summarization", model="joemgu/pegasus-x-sumstew", tokenizer=tokenizer)

text = "Alice was beginning to get very tired of sitting by her sister on the bank, and of having nothing to do: once or twice she had peeped into the book her sister was reading, but it had no pictures or conversations in it, 'and what is the use of a book,' thought Alice 'without pictures or conversations?' So she was considering in her own mind (as well as she could, for the hot day made her feel very sleepy and stupid), whether the pleasure of making a daisy-chain would be worth the trouble of getting up and picking the daisies, when suddenly a White Rabbit with pink eyes ran close by her. There was nothing so very remarkable in that; nor did Alice think it so very much out of the way to hear the Rabbit say to itself, 'Oh dear! Oh dear! I shall be late!' (when she thought it over afterwards, it occurred to her that she ought to have wondered at this, but at the time it all seemed quite natural); but when the Rabbit actually took a watch out of its waistcoat-pocket, and looked at it, and then hurried on, Alice started to her feet, for it flashed across her mind that she had never before seen a rabbit with either a waistcoat-pocket, or a watch to take out of it, and burning with curiosity, she ran across the field after it, and fortunately was just in time to see it pop down a large rabbit-hole under the hedge. In another moment down went Alice after it, never once considering how in the world she was to get out again."

# Beam search with repetition constraints to reduce repeated phrases in the summary
summary = summarizer(
    text,
    num_beams=8,
    repetition_penalty=3.5,
    no_repeat_ngram_size=4,
    encoder_no_repeat_ngram_size=4,
)[0]["summary_text"]
print(summary)
```
Output:
```text
Alice is a bored and curious girl who follows a White Rabbit with a watch into a rabbit-hole. She enters a strange world where she has many adventures and meets many peculiar creatures.
```
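
If you prefer the lower-level API, a roughly equivalent sketch is shown below. The repository id is assumed from this card's title and the generation settings mirror the pipeline example above; treat this as an illustration rather than the card author's reference code.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "joemgu/pegasus-x-sumstew"  # assumed repo id, matching the card title
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "Your long input document goes here."  # replace with the document to summarize

# Tokenize (truncating to the model's maximum input length) and generate a summary
inputs = tokenizer(text, return_tensors="pt", truncation=True)
summary_ids = model.generate(
    **inputs,
    num_beams=8,
    repetition_penalty=3.5,
    no_repeat_ngram_size=4,
    encoder_no_repeat_ngram_size=4,
    max_new_tokens=128,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```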

## Training data
The model was fine-tuned on a filtered subset of a mixture of the CNN-Dailymail, Samsum, Booksum and Laysum datasets. These datasets contain various types of texts and their abstractive summaries. The subset was selected to include only texts that are longer than 1000 words and have summaries that are shorter than 100 words. The total size of the subset is about 150k examples.
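
The exact preprocessing script is not included in this repository, but the length-based filter described above could look roughly like the sketch below. It is shown on CNN-Dailymail only, and the word-count heuristic is an assumption about how "longer than 1000 words" and "shorter than 100 words" were measured.

```python
from datasets import load_dataset

# Load one dataset from the mixture; its columns are "article" (source) and "highlights" (summary)
dataset = load_dataset("cnn_dailymail", "3.0.0", split="train")

def keep_example(example):
    # Keep long source texts (> 1000 words) paired with short summaries (< 100 words)
    return (
        len(example["article"].split()) > 1000
        and len(example["highlights"].split()) < 100
    )

filtered = dataset.filter(keep_example)
print(f"Kept {len(filtered)} of {len(dataset)} examples")
```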

## Evaluation results
TODO

## Limitations and bias
The model may have inherited some limitations and biases from the pre-trained Pegasus-x-large model and the fine-tuning datasets. Some possible sources of bias are:

- The pre-trained Pegasus-x-large model was trained on a large corpus of English texts from various sources, which may not reflect the diversity and nuances of different languages and cultures.
- The fine-tuning datasets were collected from different domains and genres, which may have their own stylistic conventions and perspectives on certain topics and events.
- The fine-tuning datasets only contain abstractive summaries, which may not capture all the important information and nuances of the original texts.
- The fine-tuning datasets only cover texts from certain time periods and sources, which may not reflect the current state of affairs and trends.

Therefore, users should be aware of these limitations and biases when using this model and evaluate its performance and suitability for their specific use cases.