GJ7583 committed
Commit 070d59d
1 Parent(s): c4dbd2b

modify readme

Files changed (1):
  1. README.md +0 -18
README.md CHANGED
@@ -33,26 +33,8 @@ More information needed
 
 ### Training hyperparameters
 
-The following hyperparameters were used during training:
-- learning_rate: 2e-05
-- train_batch_size: 8
-- eval_batch_size: 8
-- seed: 42
-- distributed_type: multi-GPU
-- num_devices: 8
-- gradient_accumulation_steps: 8
-- total_train_batch_size: 512
-- total_eval_batch_size: 64
-- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
-- lr_scheduler_type: cosine
-- lr_scheduler_warmup_steps: 10
-- num_epochs: 1
-- mixed_precision_training: Native AMP
-
 ### Training results
 
-
-
 ### Framework versions
 
 - PEFT 0.9.1.dev0
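The derived totals in the removed list follow from the per-device settings. A minimal sketch of that arithmetic (the `hparams` dict and helper names are illustrative, not from the repository; values are taken from the diff):

```python
# Hyperparameter values as listed in the removed README section.
hparams = {
    "learning_rate": 2e-05,
    "train_batch_size": 8,            # per device
    "eval_batch_size": 8,             # per device
    "seed": 42,
    "num_devices": 8,
    "gradient_accumulation_steps": 8,
    "lr_scheduler_type": "cosine",
    "lr_scheduler_warmup_steps": 10,
    "num_epochs": 1,
}

def total_train_batch_size(h):
    # Effective training batch = per-device batch x devices x accumulation steps.
    return h["train_batch_size"] * h["num_devices"] * h["gradient_accumulation_steps"]

def total_eval_batch_size(h):
    # Evaluation does not accumulate gradients, so only devices multiply in.
    return h["eval_batch_size"] * h["num_devices"]

print(total_train_batch_size(hparams))  # 512, matching total_train_batch_size
print(total_eval_batch_size(hparams))   # 64, matching total_eval_batch_size
```

This reproduces the `total_train_batch_size: 512` and `total_eval_batch_size: 64` entries that appear in the removed list.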
 