Add multilingual to the language tag

#3
by lbourdois - opened
Files changed (1)
  1. README.md +55 -70
README.md CHANGED
@@ -2,205 +2,190 @@
 language:
 - en
 - fr
+- multilingual
+license: cc-by-4.0
 tags:
 - translation
 - opus-mt-tc
-license: cc-by-4.0
 model-index:
 - name: opus-mt-tc-big-fr-en
   results:
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: flores101-devtest
       type: flores_101
       args: fra eng devtest
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 46.0
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: multi30k_test_2016_flickr
       type: multi30k-2016_flickr
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 49.7
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: multi30k_test_2017_flickr
       type: multi30k-2017_flickr
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 52.0
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: multi30k_test_2017_mscoco
       type: multi30k-2017_mscoco
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 50.6
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: multi30k_test_2018_flickr
       type: multi30k-2018_flickr
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 44.9
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: news-test2008
       type: news-test2008
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 26.5
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: newsdiscussdev2015
       type: newsdiscussdev2015
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 34.4
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: newsdiscusstest2015
       type: newsdiscusstest2015
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 40.2
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: tatoeba-test-v2021-08-07
       type: tatoeba_mt
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 59.8
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: tico19-test
       type: tico19-test
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 41.3
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: newstest2009
       type: wmt-2009-news
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 30.4
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: newstest2010
       type: wmt-2010-news
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 33.4
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: newstest2011
       type: wmt-2011-news
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 33.8
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: newstest2012
       type: wmt-2012-news
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 33.6
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: newstest2013
       type: wmt-2013-news
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 34.8
+      name: BLEU
   - task:
-      name: Translation fra-eng
       type: translation
-      args: fra-eng
+      name: Translation fra-eng
     dataset:
       name: newstest2014
       type: wmt-2014-news
       args: fra-eng
     metrics:
-    - name: BLEU
-      type: bleu
+    - type: bleu
       value: 39.4
+      name: BLEU
 ---
 # opus-mt-tc-big-fr-en
 
@@ -208,7 +193,7 @@ Neural machine translation model for translating from French (fr) to English (en
 
 This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many languages in the world. All models are originally trained using the amazing framework of [Marian NMT](https://marian-nmt.github.io/), an efficient NMT implementation written in pure C++. The models have been converted to pyTorch using the transformers library by huggingface. Training data is taken from [OPUS](https://opus.nlpl.eu/) and training pipelines use the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train).
 
-* Publications: [OPUS-MT Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please, cite if you use this model.)
+* Publications: [OPUS-MT Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please, cite if you use this model.)
 
 ```
 @inproceedings{tiedemann-thottingal-2020-opus,
@@ -255,8 +240,8 @@ A short example code:
 from transformers import MarianMTModel, MarianTokenizer
 
 src_text = [
-    "J'ai adoré l'Angleterre.",
-    "C'était la seule chose à faire."
+    "J'ai adoré l'Angleterre.",
+    "C'était la seule chose à faire."
 ]
 
 model_name = "pytorch-models/opus-mt-tc-big-fr-en"
@@ -277,7 +262,7 @@ You can also use OPUS-MT models with the transformers pipelines, for example:
 ```python
 from transformers import pipeline
 pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-fr-en")
-print(pipe("J'ai adoré l'Angleterre."))
+print(pipe("J'ai adoré l'Angleterre."))
 
 # expected output: I loved England.
 ```
@@ -311,7 +296,7 @@ print(pipe("J'ai adoré l'Angleterre."))
 
 ## Acknowledgements
 
-The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Unions Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Unions Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
+The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Unions Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Unions Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
 
 ## Model conversion info
 
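
Since the change only reorders and extends the YAML front matter (key order inside a YAML mapping has no effect on the parsed data), the merged card can be sanity-checked by reading the metadata back from the Hub. A minimal sketch, assuming the `huggingface_hub` and `PyYAML` packages are available; the download call and the simple `---` split are illustrative rather than part of this repository:

```python
# Sketch: read back the model-card front matter and check the fields this PR touches.
import yaml
from huggingface_hub import hf_hub_download

# Fetch README.md for the model this PR targets (assumes network access).
readme_path = hf_hub_download(repo_id="Helsinki-NLP/opus-mt-tc-big-fr-en", filename="README.md")

with open(readme_path, encoding="utf-8") as f:
    text = f.read()

# The metadata block sits between the first two "---" markers.
metadata = yaml.safe_load(text.split("---", 2)[1])

print(metadata["language"])  # should now include "multilingual" alongside "en" and "fr"
print(metadata["license"])   # cc-by-4.0

# The 16 BLEU entries from the model-index, e.g. "flores101-devtest: 46.0".
for result in metadata["model-index"][0]["results"]:
    print(f'{result["dataset"]["name"]}: BLEU {result["metrics"][0]["value"]}')
```

The same check works on a local checkout of the PR branch by pointing `open()` at the edited README.md instead of downloading it.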