artyomboyko committed
Commit: c26be60
Parent(s): 2942510
Update README.md
Specified the exact library of the AdamW used in the training process.
README.md CHANGED
@@ -58,7 +58,7 @@ The following hyperparameters were used during training:
 - train_batch_size: 16
 - eval_batch_size: 8
 - seed: 42
-- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+- optimizer: Pytorch Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 250
 - training_steps: 50000
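For context, a minimal sketch of how the hyperparameters listed in this hunk could be wired up, assuming the optimizer referred to is `torch.optim.Adam` and the linear warmup schedule comes from `transformers.get_linear_schedule_with_warmup`; the placeholder model and the default learning rate are illustrative only and are not specified by the commit.

```python
import torch
from transformers import get_linear_schedule_with_warmup

# Placeholder model; the actual checkpoint/model is not part of this commit.
model = torch.nn.Linear(10, 10)

# optimizer: Pytorch Adam with betas=(0.9,0.999) and epsilon=1e-08
optimizer = torch.optim.Adam(model.parameters(), betas=(0.9, 0.999), eps=1e-08)

# lr_scheduler_type: linear, lr_scheduler_warmup_steps: 250, training_steps: 50000
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=250, num_training_steps=50000
)
```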