birgermoell committed
Commit cb26e38
1 Parent(s): 9b02708

Update README.md

Files changed (1):
  1. README.md +65 -1

README.md:

# Nordic Roberta Wikipedia
## Description
This is a Nordic RoBERTa model trained on the Swedish, Danish, and Norwegian Wikipedia.
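
The model card does not ship a usage snippet, so the following is only a rough sketch (assuming the checkpoint loads through the standard `transformers` auto classes and that PyTorch weights are hosted alongside the Flax ones); the example sentence is illustrative:

```python
from transformers import pipeline

# Query the pretrained masked-language model via the fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="flax-community/nordic-roberta-wiki")

# RoBERTa-style tokenizers use <mask> as the mask token.
for prediction in fill_mask("Stockholm är Sveriges <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```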

## Evaluation

### Named entity recognition in Danish
I fine-tuned each model for 3 epochs on DaNE, repeated this 5 times for each model, and calculated 95% confidence intervals for the means (a sketch of the interval computation follows the table). Here are the results:

| Model | Score (95% CI) |
|---|---|
| xlm-roberta-base | 88.01 ± 0.43 |
| flax-community/nordic-roberta-wiki (this model) | 85.75 ± 0.69 |
| Maltehb/danish-bert-botxo | 85.38 ± 0.55 |
| flax-community/roberta-base-danish | 80.14 ± 1.47 |
| flax-community/roberta-base-scandinavian | 78.03 ± 3.02 |
| Maltehb/-l-ctra-danish-electra-small-cased | 57.87 ± 3.19 |
| NbAiLab/nb-bert-base | 30.24 ± 1.21 |
| Randomly initialised RoBERTa model | 19.79 ± 2.00 |
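
The evaluation script itself is not part of the card; the snippet below is only a sketch of how such an interval could be computed, assuming a t-based confidence interval over the per-run scores. The `runs` values are made-up placeholders, not the actual results.

```python
import numpy as np
from scipy import stats

def mean_ci(scores, confidence=0.95):
    """Mean and half-width of a t-based confidence interval for a small sample."""
    scores = np.asarray(scores, dtype=float)
    n = len(scores)
    mean = scores.mean()
    sem = scores.std(ddof=1) / np.sqrt(n)                  # standard error of the mean
    t_crit = stats.t.ppf((1 + confidence) / 2, df=n - 1)   # two-sided critical value
    return mean, t_crit * sem

runs = [88.3, 87.6, 88.4, 87.8, 88.0]  # placeholder scores from 5 hypothetical runs
mean, half_width = mean_ci(runs)
print(f"{mean:.2f} ± {half_width:.2f}")
```

With only 5 runs, the t critical value (≈ 2.78 at 4 degrees of freedom) is noticeably larger than the normal-approximation 1.96, so the choice of interval affects the reported widths.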

### Sentiment analysis in Danish
Here are the results on the test set, where each model was trained 5 times and "±" denotes a 95% confidence interval of the mean score (a minimal fine-tuning starting point is sketched after the table):

| Model | Score (95% CI) |
|---|---|
| Maltehb/danish-bert-botxo | 65.19 ± 0.53 |
| NbAiLab/nb-bert-base | 63.80 ± 0.77 |
| xlm-roberta-base | 63.55 ± 1.59 |
| flax-community/nordic-roberta-wiki | 56.46 ± 1.77 |
| flax-community/roberta-base-danish | 54.73 ± 8.96 |
| flax-community/roberta-base-scandinavian | 44.28 ± 9.21 |
| Maltehb/-l-ctra-danish-electra-small-cased | 47.78 ± 12.65 |
| Randomly initialised RoBERTa model | 36.96 ± 1.02 |
| Maltehb/roberta-base-scandinavian | 33.65 ± 8.32 |
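
The card does not say which Danish sentiment dataset or training setup was used, so the following is only a hypothetical starting point for that kind of fine-tuning; data loading and the training loop are omitted, and the label count is an assumption:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "flax-community/nordic-roberta-wiki"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# Attach a fresh sequence-classification head on top of the pretrained encoder.
# num_labels=2 assumes binary sentiment; adjust it to the actual label set.
# If the repository only hosts Flax weights, add from_flax=True.
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)
```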

## Model series
This model is part of a series of models trained on TPUs with Flax/JAX during the Hugging Face Flax/JAX challenge.
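
Because the series was trained with Flax/JAX, the checkpoints should also load through the Flax model classes in `transformers`. A rough sketch, assuming Flax weights are hosted in the repository:

```python
import jax.numpy as jnp
from transformers import AutoTokenizer, FlaxAutoModelForMaskedLM

checkpoint = "flax-community/nordic-roberta-wiki"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = FlaxAutoModelForMaskedLM.from_pretrained(checkpoint)

inputs = tokenizer("Oslo er hovedstaden i <mask>.", return_tensors="np")
logits = model(**inputs).logits  # shape: (batch, sequence_length, vocab_size)

# Decode the highest-scoring token at the masked position.
mask_index = int((inputs["input_ids"][0] == tokenizer.mask_token_id).argmax())
predicted_id = int(jnp.argmax(logits[0, mask_index]))
print(tokenizer.decode([predicted_id]))
```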

## Gpt models

### Swedish Gpt
https://huggingface.co/birgermoell/swedish-gpt/

### Swedish gpt wiki
https://huggingface.co/flax-community/swe-gpt-wiki

### Nordic gpt wiki
https://huggingface.co/flax-community/nordic-gpt-wiki

### Dansk gpt wiki
https://huggingface.co/flax-community/dansk-gpt-wiki

### Norsk gpt wiki
https://huggingface.co/flax-community/norsk-gpt-wiki

## Roberta models

### Nordic Roberta Wiki
https://huggingface.co/flax-community/nordic-roberta-wiki

### Swe Roberta Wiki Oscar
https://huggingface.co/flax-community/swe-roberta-wiki-oscar

### Roberta Swedish Scandi
https://huggingface.co/birgermoell/roberta-swedish-scandi

### Roberta Swedish
https://huggingface.co/birgermoell/roberta-swedish

## Swedish T5 model
https://huggingface.co/birgermoell/t5-base-swedish