wall-e-zz committed
Commit cd904a5
1 Parent(s): b65bee4

Update README.md

Files changed (1)
  1. README.md +13 -0
README.md CHANGED
@@ -18,3 +18,16 @@ mBART-50 is a multilingual Sequence-to-Sequence model pre-trained using the "Mul
  mBART-50 is a multilingual Sequence-to-Sequence model. It was introduced to show that multilingual translation models can be created through multilingual fine-tuning.
  Instead of fine-tuning on a single translation direction, the pre-trained model is fine-tuned on many directions simultaneously. mBART-50 is created by extending the original mBART model with an extra 25 languages, so that it supports multilingual machine translation across 50 languages. The pre-training objective is explained below.
 
+ ## Docker with GPU
+
+ ```
+ docker run -it --gpus all -p 7860:7860 --platform=linux/amd64 \
+ registry.hf.space/wall-e-zz-mbarttranslator:latest python app.py
+ ```
+
+ ## Docker with CPU
+
+ ```
+ docker run -it -p 7860:7860 --platform=linux/amd64 \
+ registry.hf.space/wall-e-zz-mbarttranslator:latest python app.py
+ ```
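
For context on the model the README describes, below is a minimal sketch of how an mBART-50 checkpoint can be used for translation with the Hugging Face `transformers` API. The checkpoint name `facebook/mbart-large-50-many-to-many-mmt` and the example sentence are assumptions for illustration; the Space's `app.py` may load a different checkpoint or wrap this in a web interface.

```python
# Minimal sketch: translate Hindi -> French with an mBART-50 checkpoint.
# Assumption: the facebook/mbart-large-50-many-to-many-mmt checkpoint;
# the Space's app.py may use a different one.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50-many-to-many-mmt"
model = MBartForConditionalGeneration.from_pretrained(model_name)
tokenizer = MBart50TokenizerFast.from_pretrained(model_name)

# mBART-50 needs the source language set on the tokenizer, and the target
# language is selected by forcing it as the first generated token.
tokenizer.src_lang = "hi_IN"
text_hi = "संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है"
encoded = tokenizer(text_hi, return_tensors="pt")
generated = model.generate(
    **encoded,
    forced_bos_token_id=tokenizer.lang_code_to_id["fr_XX"],
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```

Because the target direction is chosen only through the forced first decoder token, a single many-to-many checkpoint can serve all supported language pairs without per-direction fine-tuned weights.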