joheras committed
Commit
527e53b
1 Parent(s): 3be70e3

update model card README.md

Files changed (1)
  1. README.md +92 -0
README.md ADDED
---
license: apache-2.0
tags:
- simplification
- generated_from_trainer
metrics:
- rouge
model-index:
- name: flan-t5-base-clara-med
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# flan-t5-base-clara-med

This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2296
- Rouge1: 29.9558
- Rouge2: 16.9558
- Rougel: 28.1645
- Rougelsum: 28.1582

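Below is a minimal inference sketch using `transformers`. The repository id, the Spanish example sentence, and the generation settings are assumptions for illustration; the card does not document a prompt format.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed Hub repository id (committer "joheras" + the model name above).
model_id = "joheras/flan-t5-base-clara-med"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical clinical sentence to simplify; the expected input format
# is an assumption, since the card does not specify a prompt template.
text = "El paciente presenta hipertensión arterial refractaria al tratamiento."

inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```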
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30

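As a rough guide, the list above maps onto `Seq2SeqTrainingArguments` as sketched below. `output_dir` and the evaluation settings are placeholders, and the Adam betas/epsilon shown in the card are the library defaults, so they need no explicit arguments.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the arguments implied by the hyperparameter list above.
training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-clara-med",  # placeholder
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # assumed from the per-epoch results table
    predict_with_generate=True,   # assumed, since ROUGE is reported
)
```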
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| No log | 1.0 | 380 | 1.4310 | 27.8189 | 15.801 | 26.3299 | 26.2963 |
| No log | 2.0 | 760 | 1.3600 | 28.154 | 15.998 | 26.6032 | 26.606 |
| 1.6592 | 3.0 | 1140 | 1.3062 | 28.5836 | 16.3289 | 27.1685 | 27.1789 |
| 1.6592 | 4.0 | 1520 | 1.2747 | 29.1122 | 16.6792 | 27.5758 | 27.5733 |
| 1.3824 | 5.0 | 1900 | 1.2505 | 28.9849 | 16.4234 | 27.343 | 27.3457 |
| 1.3824 | 6.0 | 2280 | 1.2358 | 29.2208 | 16.7068 | 27.5005 | 27.4958 |
| 1.3824 | 7.0 | 2660 | 1.2317 | 29.4609 | 17.1435 | 27.7924 | 27.809 |
| 1.2296 | 8.0 | 3040 | 1.2215 | 29.9464 | 17.1933 | 28.1208 | 28.1532 |
| 1.2296 | 9.0 | 3420 | 1.2187 | 29.8723 | 17.2898 | 27.9957 | 28.0209 |
| 1.1295 | 10.0 | 3800 | 1.2143 | 29.76 | 17.2644 | 27.9598 | 27.9482 |
| 1.1295 | 11.0 | 4180 | 1.2044 | 29.5394 | 16.9554 | 27.7543 | 27.7495 |
| 1.1295 | 12.0 | 4560 | 1.2082 | 29.6155 | 17.0565 | 27.9131 | 27.9027 |
| 1.0493 | 13.0 | 4940 | 1.2047 | 30.0647 | 17.314 | 28.3498 | 28.3241 |
| 1.0493 | 14.0 | 5320 | 1.2073 | 29.8209 | 17.0308 | 27.9766 | 27.9716 |
| 0.9857 | 15.0 | 5700 | 1.2058 | 29.7392 | 17.0373 | 28.0291 | 28.029 |
| 0.9857 | 16.0 | 6080 | 1.2077 | 30.1819 | 17.298 | 28.3771 | 28.3706 |
| 0.9857 | 17.0 | 6460 | 1.2043 | 30.0708 | 17.2588 | 28.3525 | 28.3654 |
| 0.9331 | 18.0 | 6840 | 1.2103 | 29.9749 | 17.0748 | 28.1575 | 28.1827 |
| 0.9331 | 19.0 | 7220 | 1.2086 | 29.561 | 16.8513 | 27.7646 | 27.7808 |
| 0.8997 | 20.0 | 7600 | 1.2183 | 30.1109 | 17.186 | 28.3103 | 28.3078 |
| 0.8997 | 21.0 | 7980 | 1.2177 | 29.851 | 17.0093 | 28.0336 | 28.0348 |
| 0.8997 | 22.0 | 8360 | 1.2181 | 30.2841 | 17.5662 | 28.5167 | 28.526 |
| 0.8628 | 23.0 | 8740 | 1.2224 | 29.8959 | 17.0802 | 28.1386 | 28.1456 |
| 0.8628 | 24.0 | 9120 | 1.2244 | 29.9 | 17.1425 | 28.1456 | 28.1179 |
| 0.8369 | 25.0 | 9500 | 1.2234 | 30.0394 | 17.0066 | 28.2387 | 28.2345 |
| 0.8369 | 26.0 | 9880 | 1.2266 | 29.9758 | 17.1042 | 28.2635 | 28.2669 |
| 0.8369 | 27.0 | 10260 | 1.2263 | 29.893 | 16.993 | 28.0106 | 28.0009 |
| 0.8187 | 28.0 | 10640 | 1.2272 | 29.9718 | 17.0048 | 28.1821 | 28.1751 |
| 0.8187 | 29.0 | 11020 | 1.2279 | 29.973 | 17.0096 | 28.1837 | 28.1655 |
| 0.8181 | 30.0 | 11400 | 1.2296 | 29.9558 | 16.9558 | 28.1645 | 28.1582 |

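For reference, ROUGE scores on this 0-100 scale can be reproduced with the `evaluate` library, which `generated_from_trainer` pipelines typically use in the Trainer's `compute_metrics` hook; the strings below are made-up examples, not data from this model.

```python
import evaluate

# Load the ROUGE metric implementation.
rouge = evaluate.load("rouge")

# Hypothetical prediction/reference pair for illustration only.
predictions = ["the patient has high blood pressure"]
references = ["the patient presents with arterial hypertension"]

scores = rouge.compute(predictions=predictions, references=references)
# Scale the 0-1 F-measures to 0-100 to match the table above.
print({k: round(v * 100, 4) for k, v in scores.items()})
```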
### Framework versions

- Transformers 4.25.1
- Pytorch 1.13.0
- Datasets 2.8.0
- Tokenizers 0.12.1