541Jack committed on
Commit
723737c
1 Parent(s): dc9765a

Training in progress, step 1000, checkpoint

checkpoint-1000/README.md ADDED
@@ -0,0 +1,202 @@
+ ---
+ base_model: google/gemma-2b
+ library_name: peft
+ ---
+
+ # Model Card for Model ID
+
+ <!-- Provide a quick summary of what the model is/does. -->
+
+
+
+ ## Model Details
+
+ ### Model Description
+
+ <!-- Provide a longer summary of what this model is. -->
+
+
+
+ - **Developed by:** [More Information Needed]
+ - **Funded by [optional]:** [More Information Needed]
+ - **Shared by [optional]:** [More Information Needed]
+ - **Model type:** [More Information Needed]
+ - **Language(s) (NLP):** [More Information Needed]
+ - **License:** [More Information Needed]
+ - **Finetuned from model [optional]:** [More Information Needed]
+
+ ### Model Sources [optional]
+
+ <!-- Provide the basic links for the model. -->
+
+ - **Repository:** [More Information Needed]
+ - **Paper [optional]:** [More Information Needed]
+ - **Demo [optional]:** [More Information Needed]
+
+ ## Uses
+
+ <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
+
+ ### Direct Use
+
+ <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
+
+ [More Information Needed]
+
+ ### Downstream Use [optional]
+
+ <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
+
+ [More Information Needed]
+
+ ### Out-of-Scope Use
+
+ <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
+
+ [More Information Needed]
+
+ ## Bias, Risks, and Limitations
+
+ <!-- This section is meant to convey both technical and sociotechnical limitations. -->
+
+ [More Information Needed]
+
+ ### Recommendations
+
+ <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
+
+ Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
+
+ ## How to Get Started with the Model
+
+ Use the code below to get started with the model.
+
+ [More Information Needed]
+
+ ## Training Details
+
+ ### Training Data
+
+ <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
+
+ [More Information Needed]
+
+ ### Training Procedure
+
+ <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
+
+ #### Preprocessing [optional]
+
+ [More Information Needed]
+
+
+ #### Training Hyperparameters
+
+ - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
+
+ #### Speeds, Sizes, Times [optional]
+
+ <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
+
+ [More Information Needed]
+
+ ## Evaluation
+
+ <!-- This section describes the evaluation protocols and provides the results. -->
+
+ ### Testing Data, Factors & Metrics
+
+ #### Testing Data
+
+ <!-- This should link to a Dataset Card if possible. -->
+
+ [More Information Needed]
+
+ #### Factors
+
+ <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
+
+ [More Information Needed]
+
+ #### Metrics
+
+ <!-- These are the evaluation metrics being used, ideally with a description of why. -->
+
+ [More Information Needed]
+
+ ### Results
+
+ [More Information Needed]
+
+ #### Summary
+
+
+
+ ## Model Examination [optional]
+
+ <!-- Relevant interpretability work for the model goes here -->
+
+ [More Information Needed]
+
+ ## Environmental Impact
+
+ <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
+
+ Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
+
+ - **Hardware Type:** [More Information Needed]
+ - **Hours used:** [More Information Needed]
+ - **Cloud Provider:** [More Information Needed]
+ - **Compute Region:** [More Information Needed]
+ - **Carbon Emitted:** [More Information Needed]
+
+ ## Technical Specifications [optional]
+
+ ### Model Architecture and Objective
+
+ [More Information Needed]
+
+ ### Compute Infrastructure
+
+ [More Information Needed]
+
+ #### Hardware
+
+ [More Information Needed]
+
+ #### Software
+
+ [More Information Needed]
+
+ ## Citation [optional]
+
+ <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
+
+ **BibTeX:**
+
+ [More Information Needed]
+
+ **APA:**
+
+ [More Information Needed]
+
+ ## Glossary [optional]
+
+ <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
+
+ [More Information Needed]
+
+ ## More Information [optional]
+
+ [More Information Needed]
+
+ ## Model Card Authors [optional]
+
+ [More Information Needed]
+
+ ## Model Card Contact
+
+ [More Information Needed]
+ ### Framework versions
+
+ - PEFT 0.11.1
checkpoint-1000/adapter_config.json ADDED
@@ -0,0 +1,34 @@
+ {
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": "google/gemma-2b",
+ "bias": "none",
+ "fan_in_fan_out": false,
+ "inference_mode": true,
+ "init_lora_weights": true,
+ "layer_replication": null,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 16,
+ "lora_dropout": 0.0,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": null,
+ "peft_type": "LORA",
+ "r": 8,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "gate_proj",
+ "k_proj",
+ "v_proj",
+ "o_proj",
+ "q_proj",
+ "up_proj",
+ "down_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_dora": false,
+ "use_rslora": false
+ }
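
The adapter_config.json above describes a rank-8 LoRA adapter (lora_alpha 16, no dropout, DoRA and rsLoRA disabled) applied to the attention and MLP projection layers of google/gemma-2b. A minimal sketch of loading this checkpoint for inference with PEFT; the paths and generation settings are illustrative, and access to the gated base model is assumed:

```python
# Minimal sketch: attach the LoRA adapter saved in checkpoint-1000 to google/gemma-2b.
# Assumes the checkpoint directory is local and the base model can be downloaded.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "google/gemma-2b", torch_dtype=torch.bfloat16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("checkpoint-1000")   # tokenizer files are saved with the adapter
model = PeftModel.from_pretrained(base, "checkpoint-1000")     # reads adapter_config.json + adapter_model.safetensors
model.eval()

# Prompt format follows the chat template stored in tokenizer_config.json below.
prompt = "Human: What does a rank-8 LoRA adapter change?\nAssistant: "
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```

For deployment the adapter can also be folded into the base weights with `model.merge_and_unload()`, at the cost of losing the small, swappable adapter file.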
checkpoint-1000/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0ec6c7753aeef9c1552f9f586fd1f48ffa3fa4a381b7e6bed45d98461a3835ba
+ size 39256456
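
adapter_model.safetensors, optimizer.pt, rng_state.pth, scheduler.pt, tokenizer.json and tokenizer.model are stored through Git LFS, so the diff only shows three-line pointer files: the LFS spec version, the SHA-256 of the real object, and its size in bytes (about 39 MB here for the LoRA weights). A small sketch, under the assumption that the binary has already been fetched (for example with `git lfs pull`), of checking a local file against its pointer:

```python
# Sketch: verify a downloaded LFS object against the oid/size recorded in its pointer file.
import hashlib
from pathlib import Path

def verify_lfs_object(path: str, expected_sha256: str, expected_size: int) -> bool:
    data = Path(path).read_bytes()
    return len(data) == expected_size and hashlib.sha256(data).hexdigest() == expected_sha256

# Values copied from the pointer above.
print(verify_lfs_object(
    "checkpoint-1000/adapter_model.safetensors",
    "0ec6c7753aeef9c1552f9f586fd1f48ffa3fa4a381b7e6bed45d98461a3835ba",
    39256456,
))
```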
checkpoint-1000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6fc674e11b4306de4bc14cce899c175178cf37297ead1db74c276ef27d217bb8
+ size 78658234
checkpoint-1000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9196a1e708bf24d6abba41cce3f8558820acc3e50f9394c5955e29eb41ffea3d
+ size 14244
checkpoint-1000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0d40dae03d4ee39f9e99ae0bc5927eaaa78c3e5793ca1f75a57c68a6500de5f3
+ size 1064
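
optimizer.pt, scheduler.pt and rng_state.pth carry the optimizer, learning-rate-scheduler and RNG state needed to continue the run from step 1000 instead of restarting it; trainer_state.json (further down) records the step counter and log history. A hedged sketch of inspecting that resume state locally; the scheduler keys depend on the scheduler class, and resuming itself only requires pointing the original training script at this directory:

```python
# Sketch: inspect the resume state stored in checkpoint-1000.
import json
import torch

state = json.load(open("checkpoint-1000/trainer_state.json"))
print(state["global_step"], state["epoch"])            # 1000 and roughly 0.155 epochs

sched = torch.load("checkpoint-1000/scheduler.pt", map_location="cpu")
print(sched.get("last_epoch"), sched.get("_last_lr"))  # keys vary by scheduler class

# With the original Trainer (or TRL trainer) rebuilt with the same arguments,
# resuming is a single call:
#   trainer.train(resume_from_checkpoint="checkpoint-1000")
```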
checkpoint-1000/special_tokens_map.json ADDED
@@ -0,0 +1,34 @@
+ {
+ "additional_special_tokens": [
+ "<start_of_turn>",
+ "<end_of_turn>"
+ ],
+ "bos_token": {
+ "content": "<bos>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<eos>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
checkpoint-1000/tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7da53ca29fb16f6b2489482fc0bc6a394162cdab14d12764a1755ebc583fea79
+ size 17518525
checkpoint-1000/tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:61a7b147390c64585d6c3543dd6fc636906c9af3865a5548f27f31aee1d4c8e2
+ size 4241003
checkpoint-1000/tokenizer_config.json ADDED
@@ -0,0 +1,1759 @@
1
+ {
2
+ "add_bos_token": true,
3
+ "add_eos_token": false,
4
+ "added_tokens_decoder": {
5
+ "0": {
6
+ "content": "<pad>",
7
+ "lstrip": false,
8
+ "normalized": false,
9
+ "rstrip": false,
10
+ "single_word": false,
11
+ "special": true
12
+ },
13
+ "1": {
14
+ "content": "<eos>",
15
+ "lstrip": false,
16
+ "normalized": false,
17
+ "rstrip": false,
18
+ "single_word": false,
19
+ "special": true
20
+ },
21
+ "2": {
22
+ "content": "<bos>",
23
+ "lstrip": false,
24
+ "normalized": false,
25
+ "rstrip": false,
26
+ "single_word": false,
27
+ "special": true
28
+ },
29
+ "3": {
30
+ "content": "<unk>",
31
+ "lstrip": false,
32
+ "normalized": false,
33
+ "rstrip": false,
34
+ "single_word": false,
35
+ "special": true
36
+ },
37
+ "4": {
38
+ "content": "<mask>",
39
+ "lstrip": false,
40
+ "normalized": false,
41
+ "rstrip": false,
42
+ "single_word": false,
43
+ "special": false
44
+ },
45
+ "5": {
46
+ "content": "<2mass>",
47
+ "lstrip": false,
48
+ "normalized": false,
49
+ "rstrip": false,
50
+ "single_word": false,
51
+ "special": false
52
+ },
53
+ "6": {
54
+ "content": "[@BOS@]",
55
+ "lstrip": false,
56
+ "normalized": false,
57
+ "rstrip": false,
58
+ "single_word": false,
59
+ "special": false
60
+ },
61
+ "7": {
62
+ "content": "<unused0>",
63
+ "lstrip": false,
64
+ "normalized": false,
65
+ "rstrip": false,
66
+ "single_word": false,
67
+ "special": false
68
+ },
69
+ "8": {
70
+ "content": "<unused1>",
71
+ "lstrip": false,
72
+ "normalized": false,
73
+ "rstrip": false,
74
+ "single_word": false,
75
+ "special": false
76
+ },
77
+ "9": {
78
+ "content": "<unused2>",
79
+ "lstrip": false,
80
+ "normalized": false,
81
+ "rstrip": false,
82
+ "single_word": false,
83
+ "special": false
84
+ },
85
+ "10": {
86
+ "content": "<unused3>",
87
+ "lstrip": false,
88
+ "normalized": false,
89
+ "rstrip": false,
90
+ "single_word": false,
91
+ "special": false
92
+ },
93
+ "11": {
94
+ "content": "<unused4>",
95
+ "lstrip": false,
96
+ "normalized": false,
97
+ "rstrip": false,
98
+ "single_word": false,
99
+ "special": false
100
+ },
101
+ "12": {
102
+ "content": "<unused5>",
103
+ "lstrip": false,
104
+ "normalized": false,
105
+ "rstrip": false,
106
+ "single_word": false,
107
+ "special": false
108
+ },
109
+ "13": {
110
+ "content": "<unused6>",
111
+ "lstrip": false,
112
+ "normalized": false,
113
+ "rstrip": false,
114
+ "single_word": false,
115
+ "special": false
116
+ },
117
+ "14": {
118
+ "content": "<unused7>",
119
+ "lstrip": false,
120
+ "normalized": false,
121
+ "rstrip": false,
122
+ "single_word": false,
123
+ "special": false
124
+ },
125
+ "15": {
126
+ "content": "<unused8>",
127
+ "lstrip": false,
128
+ "normalized": false,
129
+ "rstrip": false,
130
+ "single_word": false,
131
+ "special": false
132
+ },
133
+ "16": {
134
+ "content": "<unused9>",
135
+ "lstrip": false,
136
+ "normalized": false,
137
+ "rstrip": false,
138
+ "single_word": false,
139
+ "special": false
140
+ },
141
+ "17": {
142
+ "content": "<unused10>",
143
+ "lstrip": false,
144
+ "normalized": false,
145
+ "rstrip": false,
146
+ "single_word": false,
147
+ "special": false
148
+ },
149
+ "18": {
150
+ "content": "<unused11>",
151
+ "lstrip": false,
152
+ "normalized": false,
153
+ "rstrip": false,
154
+ "single_word": false,
155
+ "special": false
156
+ },
157
+ "19": {
158
+ "content": "<unused12>",
159
+ "lstrip": false,
160
+ "normalized": false,
161
+ "rstrip": false,
162
+ "single_word": false,
163
+ "special": false
164
+ },
165
+ "20": {
166
+ "content": "<unused13>",
167
+ "lstrip": false,
168
+ "normalized": false,
169
+ "rstrip": false,
170
+ "single_word": false,
171
+ "special": false
172
+ },
173
+ "21": {
174
+ "content": "<unused14>",
175
+ "lstrip": false,
176
+ "normalized": false,
177
+ "rstrip": false,
178
+ "single_word": false,
179
+ "special": false
180
+ },
181
+ "22": {
182
+ "content": "<unused15>",
183
+ "lstrip": false,
184
+ "normalized": false,
185
+ "rstrip": false,
186
+ "single_word": false,
187
+ "special": false
188
+ },
189
+ "23": {
190
+ "content": "<unused16>",
191
+ "lstrip": false,
192
+ "normalized": false,
193
+ "rstrip": false,
194
+ "single_word": false,
195
+ "special": false
196
+ },
197
+ "24": {
198
+ "content": "<unused17>",
199
+ "lstrip": false,
200
+ "normalized": false,
201
+ "rstrip": false,
202
+ "single_word": false,
203
+ "special": false
204
+ },
205
+ "25": {
206
+ "content": "<unused18>",
207
+ "lstrip": false,
208
+ "normalized": false,
209
+ "rstrip": false,
210
+ "single_word": false,
211
+ "special": false
212
+ },
213
+ "26": {
214
+ "content": "<unused19>",
215
+ "lstrip": false,
216
+ "normalized": false,
217
+ "rstrip": false,
218
+ "single_word": false,
219
+ "special": false
220
+ },
221
+ "27": {
222
+ "content": "<unused20>",
223
+ "lstrip": false,
224
+ "normalized": false,
225
+ "rstrip": false,
226
+ "single_word": false,
227
+ "special": false
228
+ },
229
+ "28": {
230
+ "content": "<unused21>",
231
+ "lstrip": false,
232
+ "normalized": false,
233
+ "rstrip": false,
234
+ "single_word": false,
235
+ "special": false
236
+ },
237
+ "29": {
238
+ "content": "<unused22>",
239
+ "lstrip": false,
240
+ "normalized": false,
241
+ "rstrip": false,
242
+ "single_word": false,
243
+ "special": false
244
+ },
245
+ "30": {
246
+ "content": "<unused23>",
247
+ "lstrip": false,
248
+ "normalized": false,
249
+ "rstrip": false,
250
+ "single_word": false,
251
+ "special": false
252
+ },
253
+ "31": {
254
+ "content": "<unused24>",
255
+ "lstrip": false,
256
+ "normalized": false,
257
+ "rstrip": false,
258
+ "single_word": false,
259
+ "special": false
260
+ },
261
+ "32": {
262
+ "content": "<unused25>",
263
+ "lstrip": false,
264
+ "normalized": false,
265
+ "rstrip": false,
266
+ "single_word": false,
267
+ "special": false
268
+ },
269
+ "33": {
270
+ "content": "<unused26>",
271
+ "lstrip": false,
272
+ "normalized": false,
273
+ "rstrip": false,
274
+ "single_word": false,
275
+ "special": false
276
+ },
277
+ "34": {
278
+ "content": "<unused27>",
279
+ "lstrip": false,
280
+ "normalized": false,
281
+ "rstrip": false,
282
+ "single_word": false,
283
+ "special": false
284
+ },
285
+ "35": {
286
+ "content": "<unused28>",
287
+ "lstrip": false,
288
+ "normalized": false,
289
+ "rstrip": false,
290
+ "single_word": false,
291
+ "special": false
292
+ },
293
+ "36": {
294
+ "content": "<unused29>",
295
+ "lstrip": false,
296
+ "normalized": false,
297
+ "rstrip": false,
298
+ "single_word": false,
299
+ "special": false
300
+ },
301
+ "37": {
302
+ "content": "<unused30>",
303
+ "lstrip": false,
304
+ "normalized": false,
305
+ "rstrip": false,
306
+ "single_word": false,
307
+ "special": false
308
+ },
309
+ "38": {
310
+ "content": "<unused31>",
311
+ "lstrip": false,
312
+ "normalized": false,
313
+ "rstrip": false,
314
+ "single_word": false,
315
+ "special": false
316
+ },
317
+ "39": {
318
+ "content": "<unused32>",
319
+ "lstrip": false,
320
+ "normalized": false,
321
+ "rstrip": false,
322
+ "single_word": false,
323
+ "special": false
324
+ },
325
+ "40": {
326
+ "content": "<unused33>",
327
+ "lstrip": false,
328
+ "normalized": false,
329
+ "rstrip": false,
330
+ "single_word": false,
331
+ "special": false
332
+ },
333
+ "41": {
334
+ "content": "<unused34>",
335
+ "lstrip": false,
336
+ "normalized": false,
337
+ "rstrip": false,
338
+ "single_word": false,
339
+ "special": false
340
+ },
341
+ "42": {
342
+ "content": "<unused35>",
343
+ "lstrip": false,
344
+ "normalized": false,
345
+ "rstrip": false,
346
+ "single_word": false,
347
+ "special": false
348
+ },
349
+ "43": {
350
+ "content": "<unused36>",
351
+ "lstrip": false,
352
+ "normalized": false,
353
+ "rstrip": false,
354
+ "single_word": false,
355
+ "special": false
356
+ },
357
+ "44": {
358
+ "content": "<unused37>",
359
+ "lstrip": false,
360
+ "normalized": false,
361
+ "rstrip": false,
362
+ "single_word": false,
363
+ "special": false
364
+ },
365
+ "45": {
366
+ "content": "<unused38>",
367
+ "lstrip": false,
368
+ "normalized": false,
369
+ "rstrip": false,
370
+ "single_word": false,
371
+ "special": false
372
+ },
373
+ "46": {
374
+ "content": "<unused39>",
375
+ "lstrip": false,
376
+ "normalized": false,
377
+ "rstrip": false,
378
+ "single_word": false,
379
+ "special": false
380
+ },
381
+ "47": {
382
+ "content": "<unused40>",
383
+ "lstrip": false,
384
+ "normalized": false,
385
+ "rstrip": false,
386
+ "single_word": false,
387
+ "special": false
388
+ },
389
+ "48": {
390
+ "content": "<unused41>",
391
+ "lstrip": false,
392
+ "normalized": false,
393
+ "rstrip": false,
394
+ "single_word": false,
395
+ "special": false
396
+ },
397
+ "49": {
398
+ "content": "<unused42>",
399
+ "lstrip": false,
400
+ "normalized": false,
401
+ "rstrip": false,
402
+ "single_word": false,
403
+ "special": false
404
+ },
405
+ "50": {
406
+ "content": "<unused43>",
407
+ "lstrip": false,
408
+ "normalized": false,
409
+ "rstrip": false,
410
+ "single_word": false,
411
+ "special": false
412
+ },
413
+ "51": {
414
+ "content": "<unused44>",
415
+ "lstrip": false,
416
+ "normalized": false,
417
+ "rstrip": false,
418
+ "single_word": false,
419
+ "special": false
420
+ },
421
+ "52": {
422
+ "content": "<unused45>",
423
+ "lstrip": false,
424
+ "normalized": false,
425
+ "rstrip": false,
426
+ "single_word": false,
427
+ "special": false
428
+ },
429
+ "53": {
430
+ "content": "<unused46>",
431
+ "lstrip": false,
432
+ "normalized": false,
433
+ "rstrip": false,
434
+ "single_word": false,
435
+ "special": false
436
+ },
437
+ "54": {
438
+ "content": "<unused47>",
439
+ "lstrip": false,
440
+ "normalized": false,
441
+ "rstrip": false,
442
+ "single_word": false,
443
+ "special": false
444
+ },
445
+ "55": {
446
+ "content": "<unused48>",
447
+ "lstrip": false,
448
+ "normalized": false,
449
+ "rstrip": false,
450
+ "single_word": false,
451
+ "special": false
452
+ },
453
+ "56": {
454
+ "content": "<unused49>",
455
+ "lstrip": false,
456
+ "normalized": false,
457
+ "rstrip": false,
458
+ "single_word": false,
459
+ "special": false
460
+ },
461
+ "57": {
462
+ "content": "<unused50>",
463
+ "lstrip": false,
464
+ "normalized": false,
465
+ "rstrip": false,
466
+ "single_word": false,
467
+ "special": false
468
+ },
469
+ "58": {
470
+ "content": "<unused51>",
471
+ "lstrip": false,
472
+ "normalized": false,
473
+ "rstrip": false,
474
+ "single_word": false,
475
+ "special": false
476
+ },
477
+ "59": {
478
+ "content": "<unused52>",
479
+ "lstrip": false,
480
+ "normalized": false,
481
+ "rstrip": false,
482
+ "single_word": false,
483
+ "special": false
484
+ },
485
+ "60": {
486
+ "content": "<unused53>",
487
+ "lstrip": false,
488
+ "normalized": false,
489
+ "rstrip": false,
490
+ "single_word": false,
491
+ "special": false
492
+ },
493
+ "61": {
494
+ "content": "<unused54>",
495
+ "lstrip": false,
496
+ "normalized": false,
497
+ "rstrip": false,
498
+ "single_word": false,
499
+ "special": false
500
+ },
501
+ "62": {
502
+ "content": "<unused55>",
503
+ "lstrip": false,
504
+ "normalized": false,
505
+ "rstrip": false,
506
+ "single_word": false,
507
+ "special": false
508
+ },
509
+ "63": {
510
+ "content": "<unused56>",
511
+ "lstrip": false,
512
+ "normalized": false,
513
+ "rstrip": false,
514
+ "single_word": false,
515
+ "special": false
516
+ },
517
+ "64": {
518
+ "content": "<unused57>",
519
+ "lstrip": false,
520
+ "normalized": false,
521
+ "rstrip": false,
522
+ "single_word": false,
523
+ "special": false
524
+ },
525
+ "65": {
526
+ "content": "<unused58>",
527
+ "lstrip": false,
528
+ "normalized": false,
529
+ "rstrip": false,
530
+ "single_word": false,
531
+ "special": false
532
+ },
533
+ "66": {
534
+ "content": "<unused59>",
535
+ "lstrip": false,
536
+ "normalized": false,
537
+ "rstrip": false,
538
+ "single_word": false,
539
+ "special": false
540
+ },
541
+ "67": {
542
+ "content": "<unused60>",
543
+ "lstrip": false,
544
+ "normalized": false,
545
+ "rstrip": false,
546
+ "single_word": false,
547
+ "special": false
548
+ },
549
+ "68": {
550
+ "content": "<unused61>",
551
+ "lstrip": false,
552
+ "normalized": false,
553
+ "rstrip": false,
554
+ "single_word": false,
555
+ "special": false
556
+ },
557
+ "69": {
558
+ "content": "<unused62>",
559
+ "lstrip": false,
560
+ "normalized": false,
561
+ "rstrip": false,
562
+ "single_word": false,
563
+ "special": false
564
+ },
565
+ "70": {
566
+ "content": "<unused63>",
567
+ "lstrip": false,
568
+ "normalized": false,
569
+ "rstrip": false,
570
+ "single_word": false,
571
+ "special": false
572
+ },
573
+ "71": {
574
+ "content": "<unused64>",
575
+ "lstrip": false,
576
+ "normalized": false,
577
+ "rstrip": false,
578
+ "single_word": false,
579
+ "special": false
580
+ },
581
+ "72": {
582
+ "content": "<unused65>",
583
+ "lstrip": false,
584
+ "normalized": false,
585
+ "rstrip": false,
586
+ "single_word": false,
587
+ "special": false
588
+ },
589
+ "73": {
590
+ "content": "<unused66>",
591
+ "lstrip": false,
592
+ "normalized": false,
593
+ "rstrip": false,
594
+ "single_word": false,
595
+ "special": false
596
+ },
597
+ "74": {
598
+ "content": "<unused67>",
599
+ "lstrip": false,
600
+ "normalized": false,
601
+ "rstrip": false,
602
+ "single_word": false,
603
+ "special": false
604
+ },
605
+ "75": {
606
+ "content": "<unused68>",
607
+ "lstrip": false,
608
+ "normalized": false,
609
+ "rstrip": false,
610
+ "single_word": false,
611
+ "special": false
612
+ },
613
+ "76": {
614
+ "content": "<unused69>",
615
+ "lstrip": false,
616
+ "normalized": false,
617
+ "rstrip": false,
618
+ "single_word": false,
619
+ "special": false
620
+ },
621
+ "77": {
622
+ "content": "<unused70>",
623
+ "lstrip": false,
624
+ "normalized": false,
625
+ "rstrip": false,
626
+ "single_word": false,
627
+ "special": false
628
+ },
629
+ "78": {
630
+ "content": "<unused71>",
631
+ "lstrip": false,
632
+ "normalized": false,
633
+ "rstrip": false,
634
+ "single_word": false,
635
+ "special": false
636
+ },
637
+ "79": {
638
+ "content": "<unused72>",
639
+ "lstrip": false,
640
+ "normalized": false,
641
+ "rstrip": false,
642
+ "single_word": false,
643
+ "special": false
644
+ },
645
+ "80": {
646
+ "content": "<unused73>",
647
+ "lstrip": false,
648
+ "normalized": false,
649
+ "rstrip": false,
650
+ "single_word": false,
651
+ "special": false
652
+ },
653
+ "81": {
654
+ "content": "<unused74>",
655
+ "lstrip": false,
656
+ "normalized": false,
657
+ "rstrip": false,
658
+ "single_word": false,
659
+ "special": false
660
+ },
661
+ "82": {
662
+ "content": "<unused75>",
663
+ "lstrip": false,
664
+ "normalized": false,
665
+ "rstrip": false,
666
+ "single_word": false,
667
+ "special": false
668
+ },
669
+ "83": {
670
+ "content": "<unused76>",
671
+ "lstrip": false,
672
+ "normalized": false,
673
+ "rstrip": false,
674
+ "single_word": false,
675
+ "special": false
676
+ },
677
+ "84": {
678
+ "content": "<unused77>",
679
+ "lstrip": false,
680
+ "normalized": false,
681
+ "rstrip": false,
682
+ "single_word": false,
683
+ "special": false
684
+ },
685
+ "85": {
686
+ "content": "<unused78>",
687
+ "lstrip": false,
688
+ "normalized": false,
689
+ "rstrip": false,
690
+ "single_word": false,
691
+ "special": false
692
+ },
693
+ "86": {
694
+ "content": "<unused79>",
695
+ "lstrip": false,
696
+ "normalized": false,
697
+ "rstrip": false,
698
+ "single_word": false,
699
+ "special": false
700
+ },
701
+ "87": {
702
+ "content": "<unused80>",
703
+ "lstrip": false,
704
+ "normalized": false,
705
+ "rstrip": false,
706
+ "single_word": false,
707
+ "special": false
708
+ },
709
+ "88": {
710
+ "content": "<unused81>",
711
+ "lstrip": false,
712
+ "normalized": false,
713
+ "rstrip": false,
714
+ "single_word": false,
715
+ "special": false
716
+ },
717
+ "89": {
718
+ "content": "<unused82>",
719
+ "lstrip": false,
720
+ "normalized": false,
721
+ "rstrip": false,
722
+ "single_word": false,
723
+ "special": false
724
+ },
725
+ "90": {
726
+ "content": "<unused83>",
727
+ "lstrip": false,
728
+ "normalized": false,
729
+ "rstrip": false,
730
+ "single_word": false,
731
+ "special": false
732
+ },
733
+ "91": {
734
+ "content": "<unused84>",
735
+ "lstrip": false,
736
+ "normalized": false,
737
+ "rstrip": false,
738
+ "single_word": false,
739
+ "special": false
740
+ },
741
+ "92": {
742
+ "content": "<unused85>",
743
+ "lstrip": false,
744
+ "normalized": false,
745
+ "rstrip": false,
746
+ "single_word": false,
747
+ "special": false
748
+ },
749
+ "93": {
750
+ "content": "<unused86>",
751
+ "lstrip": false,
752
+ "normalized": false,
753
+ "rstrip": false,
754
+ "single_word": false,
755
+ "special": false
756
+ },
757
+ "94": {
758
+ "content": "<unused87>",
759
+ "lstrip": false,
760
+ "normalized": false,
761
+ "rstrip": false,
762
+ "single_word": false,
763
+ "special": false
764
+ },
765
+ "95": {
766
+ "content": "<unused88>",
767
+ "lstrip": false,
768
+ "normalized": false,
769
+ "rstrip": false,
770
+ "single_word": false,
771
+ "special": false
772
+ },
773
+ "96": {
774
+ "content": "<unused89>",
775
+ "lstrip": false,
776
+ "normalized": false,
777
+ "rstrip": false,
778
+ "single_word": false,
779
+ "special": false
780
+ },
781
+ "97": {
782
+ "content": "<unused90>",
783
+ "lstrip": false,
784
+ "normalized": false,
785
+ "rstrip": false,
786
+ "single_word": false,
787
+ "special": false
788
+ },
789
+ "98": {
790
+ "content": "<unused91>",
791
+ "lstrip": false,
792
+ "normalized": false,
793
+ "rstrip": false,
794
+ "single_word": false,
795
+ "special": false
796
+ },
797
+ "99": {
798
+ "content": "<unused92>",
799
+ "lstrip": false,
800
+ "normalized": false,
801
+ "rstrip": false,
802
+ "single_word": false,
803
+ "special": false
804
+ },
805
+ "100": {
806
+ "content": "<unused93>",
807
+ "lstrip": false,
808
+ "normalized": false,
809
+ "rstrip": false,
810
+ "single_word": false,
811
+ "special": false
812
+ },
813
+ "101": {
814
+ "content": "<unused94>",
815
+ "lstrip": false,
816
+ "normalized": false,
817
+ "rstrip": false,
818
+ "single_word": false,
819
+ "special": false
820
+ },
821
+ "102": {
822
+ "content": "<unused95>",
823
+ "lstrip": false,
824
+ "normalized": false,
825
+ "rstrip": false,
826
+ "single_word": false,
827
+ "special": false
828
+ },
829
+ "103": {
830
+ "content": "<unused96>",
831
+ "lstrip": false,
832
+ "normalized": false,
833
+ "rstrip": false,
834
+ "single_word": false,
835
+ "special": false
836
+ },
837
+ "104": {
838
+ "content": "<unused97>",
839
+ "lstrip": false,
840
+ "normalized": false,
841
+ "rstrip": false,
842
+ "single_word": false,
843
+ "special": false
844
+ },
845
+ "105": {
846
+ "content": "<unused98>",
847
+ "lstrip": false,
848
+ "normalized": false,
849
+ "rstrip": false,
850
+ "single_word": false,
851
+ "special": false
852
+ },
853
+ "106": {
854
+ "content": "<start_of_turn>",
855
+ "lstrip": false,
856
+ "normalized": false,
857
+ "rstrip": false,
858
+ "single_word": false,
859
+ "special": true
860
+ },
861
+ "107": {
862
+ "content": "<end_of_turn>",
863
+ "lstrip": false,
864
+ "normalized": false,
865
+ "rstrip": false,
866
+ "single_word": false,
867
+ "special": true
868
+ },
869
+ "108": {
870
+ "content": "\n",
871
+ "lstrip": false,
872
+ "normalized": false,
873
+ "rstrip": false,
874
+ "single_word": false,
875
+ "special": false
876
+ },
877
+ "109": {
878
+ "content": "\n\n",
879
+ "lstrip": false,
880
+ "normalized": false,
881
+ "rstrip": false,
882
+ "single_word": false,
883
+ "special": false
884
+ },
885
+ "110": {
886
+ "content": "\n\n\n",
887
+ "lstrip": false,
888
+ "normalized": false,
889
+ "rstrip": false,
890
+ "single_word": false,
891
+ "special": false
892
+ },
893
+ "111": {
894
+ "content": "\n\n\n\n",
895
+ "lstrip": false,
896
+ "normalized": false,
897
+ "rstrip": false,
898
+ "single_word": false,
899
+ "special": false
900
+ },
901
+ "112": {
902
+ "content": "\n\n\n\n\n",
903
+ "lstrip": false,
904
+ "normalized": false,
905
+ "rstrip": false,
906
+ "single_word": false,
907
+ "special": false
908
+ },
909
+ "113": {
910
+ "content": "\n\n\n\n\n\n",
911
+ "lstrip": false,
912
+ "normalized": false,
913
+ "rstrip": false,
914
+ "single_word": false,
915
+ "special": false
916
+ },
917
+ "114": {
918
+ "content": "\n\n\n\n\n\n\n",
919
+ "lstrip": false,
920
+ "normalized": false,
921
+ "rstrip": false,
922
+ "single_word": false,
923
+ "special": false
924
+ },
925
+ "115": {
926
+ "content": "\n\n\n\n\n\n\n\n",
927
+ "lstrip": false,
928
+ "normalized": false,
929
+ "rstrip": false,
930
+ "single_word": false,
931
+ "special": false
932
+ },
933
+ "116": {
934
+ "content": "\n\n\n\n\n\n\n\n\n",
935
+ "lstrip": false,
936
+ "normalized": false,
937
+ "rstrip": false,
938
+ "single_word": false,
939
+ "special": false
940
+ },
941
+ "117": {
942
+ "content": "\n\n\n\n\n\n\n\n\n\n",
943
+ "lstrip": false,
944
+ "normalized": false,
945
+ "rstrip": false,
946
+ "single_word": false,
947
+ "special": false
948
+ },
949
+ "118": {
950
+ "content": "\n\n\n\n\n\n\n\n\n\n\n",
951
+ "lstrip": false,
952
+ "normalized": false,
953
+ "rstrip": false,
954
+ "single_word": false,
955
+ "special": false
956
+ },
957
+ "119": {
958
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n",
959
+ "lstrip": false,
960
+ "normalized": false,
961
+ "rstrip": false,
962
+ "single_word": false,
963
+ "special": false
964
+ },
965
+ "120": {
966
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n",
967
+ "lstrip": false,
968
+ "normalized": false,
969
+ "rstrip": false,
970
+ "single_word": false,
971
+ "special": false
972
+ },
973
+ "121": {
974
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
975
+ "lstrip": false,
976
+ "normalized": false,
977
+ "rstrip": false,
978
+ "single_word": false,
979
+ "special": false
980
+ },
981
+ "122": {
982
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
983
+ "lstrip": false,
984
+ "normalized": false,
985
+ "rstrip": false,
986
+ "single_word": false,
987
+ "special": false
988
+ },
989
+ "123": {
990
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
991
+ "lstrip": false,
992
+ "normalized": false,
993
+ "rstrip": false,
994
+ "single_word": false,
995
+ "special": false
996
+ },
997
+ "124": {
998
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
999
+ "lstrip": false,
1000
+ "normalized": false,
1001
+ "rstrip": false,
1002
+ "single_word": false,
1003
+ "special": false
1004
+ },
1005
+ "125": {
1006
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1007
+ "lstrip": false,
1008
+ "normalized": false,
1009
+ "rstrip": false,
1010
+ "single_word": false,
1011
+ "special": false
1012
+ },
1013
+ "126": {
1014
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1015
+ "lstrip": false,
1016
+ "normalized": false,
1017
+ "rstrip": false,
1018
+ "single_word": false,
1019
+ "special": false
1020
+ },
1021
+ "127": {
1022
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1023
+ "lstrip": false,
1024
+ "normalized": false,
1025
+ "rstrip": false,
1026
+ "single_word": false,
1027
+ "special": false
1028
+ },
1029
+ "128": {
1030
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1031
+ "lstrip": false,
1032
+ "normalized": false,
1033
+ "rstrip": false,
1034
+ "single_word": false,
1035
+ "special": false
1036
+ },
1037
+ "129": {
1038
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1039
+ "lstrip": false,
1040
+ "normalized": false,
1041
+ "rstrip": false,
1042
+ "single_word": false,
1043
+ "special": false
1044
+ },
1045
+ "130": {
1046
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1047
+ "lstrip": false,
1048
+ "normalized": false,
1049
+ "rstrip": false,
1050
+ "single_word": false,
1051
+ "special": false
1052
+ },
1053
+ "131": {
1054
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1055
+ "lstrip": false,
1056
+ "normalized": false,
1057
+ "rstrip": false,
1058
+ "single_word": false,
1059
+ "special": false
1060
+ },
1061
+ "132": {
1062
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1063
+ "lstrip": false,
1064
+ "normalized": false,
1065
+ "rstrip": false,
1066
+ "single_word": false,
1067
+ "special": false
1068
+ },
1069
+ "133": {
1070
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1071
+ "lstrip": false,
1072
+ "normalized": false,
1073
+ "rstrip": false,
1074
+ "single_word": false,
1075
+ "special": false
1076
+ },
1077
+ "134": {
1078
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1079
+ "lstrip": false,
1080
+ "normalized": false,
1081
+ "rstrip": false,
1082
+ "single_word": false,
1083
+ "special": false
1084
+ },
1085
+ "135": {
1086
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1087
+ "lstrip": false,
1088
+ "normalized": false,
1089
+ "rstrip": false,
1090
+ "single_word": false,
1091
+ "special": false
1092
+ },
1093
+ "136": {
1094
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1095
+ "lstrip": false,
1096
+ "normalized": false,
1097
+ "rstrip": false,
1098
+ "single_word": false,
1099
+ "special": false
1100
+ },
1101
+ "137": {
1102
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1103
+ "lstrip": false,
1104
+ "normalized": false,
1105
+ "rstrip": false,
1106
+ "single_word": false,
1107
+ "special": false
1108
+ },
1109
+ "138": {
1110
+ "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n",
1111
+ "lstrip": false,
1112
+ "normalized": false,
1113
+ "rstrip": false,
1114
+ "single_word": false,
1115
+ "special": false
1116
+ },
1117
+ "139": {
1118
+ "content": "▁▁",
1119
+ "lstrip": false,
1120
+ "normalized": false,
1121
+ "rstrip": false,
1122
+ "single_word": false,
1123
+ "special": false
1124
+ },
1125
+ "140": {
1126
+ "content": "▁▁▁",
1127
+ "lstrip": false,
1128
+ "normalized": false,
1129
+ "rstrip": false,
1130
+ "single_word": false,
1131
+ "special": false
1132
+ },
1133
+ "141": {
1134
+ "content": "▁▁▁▁",
1135
+ "lstrip": false,
1136
+ "normalized": false,
1137
+ "rstrip": false,
1138
+ "single_word": false,
1139
+ "special": false
1140
+ },
1141
+ "142": {
1142
+ "content": "▁▁▁▁▁",
1143
+ "lstrip": false,
1144
+ "normalized": false,
1145
+ "rstrip": false,
1146
+ "single_word": false,
1147
+ "special": false
1148
+ },
1149
+ "143": {
1150
+ "content": "▁▁▁▁▁▁",
1151
+ "lstrip": false,
1152
+ "normalized": false,
1153
+ "rstrip": false,
1154
+ "single_word": false,
1155
+ "special": false
1156
+ },
1157
+ "144": {
1158
+ "content": "▁▁▁▁▁▁▁",
1159
+ "lstrip": false,
1160
+ "normalized": false,
1161
+ "rstrip": false,
1162
+ "single_word": false,
1163
+ "special": false
1164
+ },
1165
+ "145": {
1166
+ "content": "▁▁▁▁▁▁▁▁",
1167
+ "lstrip": false,
1168
+ "normalized": false,
1169
+ "rstrip": false,
1170
+ "single_word": false,
1171
+ "special": false
1172
+ },
1173
+ "146": {
1174
+ "content": "▁▁▁▁▁▁▁▁▁",
1175
+ "lstrip": false,
1176
+ "normalized": false,
1177
+ "rstrip": false,
1178
+ "single_word": false,
1179
+ "special": false
1180
+ },
1181
+ "147": {
1182
+ "content": "▁▁▁▁▁▁▁▁▁▁",
1183
+ "lstrip": false,
1184
+ "normalized": false,
1185
+ "rstrip": false,
1186
+ "single_word": false,
1187
+ "special": false
1188
+ },
1189
+ "148": {
1190
+ "content": "▁▁▁▁▁▁▁▁▁▁▁",
1191
+ "lstrip": false,
1192
+ "normalized": false,
1193
+ "rstrip": false,
1194
+ "single_word": false,
1195
+ "special": false
1196
+ },
1197
+ "149": {
1198
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁",
1199
+ "lstrip": false,
1200
+ "normalized": false,
1201
+ "rstrip": false,
1202
+ "single_word": false,
1203
+ "special": false
1204
+ },
1205
+ "150": {
1206
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁",
1207
+ "lstrip": false,
1208
+ "normalized": false,
1209
+ "rstrip": false,
1210
+ "single_word": false,
1211
+ "special": false
1212
+ },
1213
+ "151": {
1214
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1215
+ "lstrip": false,
1216
+ "normalized": false,
1217
+ "rstrip": false,
1218
+ "single_word": false,
1219
+ "special": false
1220
+ },
1221
+ "152": {
1222
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1223
+ "lstrip": false,
1224
+ "normalized": false,
1225
+ "rstrip": false,
1226
+ "single_word": false,
1227
+ "special": false
1228
+ },
1229
+ "153": {
1230
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1231
+ "lstrip": false,
1232
+ "normalized": false,
1233
+ "rstrip": false,
1234
+ "single_word": false,
1235
+ "special": false
1236
+ },
1237
+ "154": {
1238
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1239
+ "lstrip": false,
1240
+ "normalized": false,
1241
+ "rstrip": false,
1242
+ "single_word": false,
1243
+ "special": false
1244
+ },
1245
+ "155": {
1246
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1247
+ "lstrip": false,
1248
+ "normalized": false,
1249
+ "rstrip": false,
1250
+ "single_word": false,
1251
+ "special": false
1252
+ },
1253
+ "156": {
1254
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1255
+ "lstrip": false,
1256
+ "normalized": false,
1257
+ "rstrip": false,
1258
+ "single_word": false,
1259
+ "special": false
1260
+ },
1261
+ "157": {
1262
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1263
+ "lstrip": false,
1264
+ "normalized": false,
1265
+ "rstrip": false,
1266
+ "single_word": false,
1267
+ "special": false
1268
+ },
1269
+ "158": {
1270
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1271
+ "lstrip": false,
1272
+ "normalized": false,
1273
+ "rstrip": false,
1274
+ "single_word": false,
1275
+ "special": false
1276
+ },
1277
+ "159": {
1278
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1279
+ "lstrip": false,
1280
+ "normalized": false,
1281
+ "rstrip": false,
1282
+ "single_word": false,
1283
+ "special": false
1284
+ },
1285
+ "160": {
1286
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1287
+ "lstrip": false,
1288
+ "normalized": false,
1289
+ "rstrip": false,
1290
+ "single_word": false,
1291
+ "special": false
1292
+ },
1293
+ "161": {
1294
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1295
+ "lstrip": false,
1296
+ "normalized": false,
1297
+ "rstrip": false,
1298
+ "single_word": false,
1299
+ "special": false
1300
+ },
1301
+ "162": {
1302
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1303
+ "lstrip": false,
1304
+ "normalized": false,
1305
+ "rstrip": false,
1306
+ "single_word": false,
1307
+ "special": false
1308
+ },
1309
+ "163": {
1310
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1311
+ "lstrip": false,
1312
+ "normalized": false,
1313
+ "rstrip": false,
1314
+ "single_word": false,
1315
+ "special": false
1316
+ },
1317
+ "164": {
1318
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1319
+ "lstrip": false,
1320
+ "normalized": false,
1321
+ "rstrip": false,
1322
+ "single_word": false,
1323
+ "special": false
1324
+ },
1325
+ "165": {
1326
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1327
+ "lstrip": false,
1328
+ "normalized": false,
1329
+ "rstrip": false,
1330
+ "single_word": false,
1331
+ "special": false
1332
+ },
1333
+ "166": {
1334
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1335
+ "lstrip": false,
1336
+ "normalized": false,
1337
+ "rstrip": false,
1338
+ "single_word": false,
1339
+ "special": false
1340
+ },
1341
+ "167": {
1342
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1343
+ "lstrip": false,
1344
+ "normalized": false,
1345
+ "rstrip": false,
1346
+ "single_word": false,
1347
+ "special": false
1348
+ },
1349
+ "168": {
1350
+ "content": "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁",
1351
+ "lstrip": false,
1352
+ "normalized": false,
1353
+ "rstrip": false,
1354
+ "single_word": false,
1355
+ "special": false
1356
+ },
1357
+ "169": {
1358
+ "content": "<table>",
1359
+ "lstrip": false,
1360
+ "normalized": false,
1361
+ "rstrip": false,
1362
+ "single_word": false,
1363
+ "special": false
1364
+ },
1365
+ "170": {
1366
+ "content": "<caption>",
1367
+ "lstrip": false,
1368
+ "normalized": false,
1369
+ "rstrip": false,
1370
+ "single_word": false,
1371
+ "special": false
1372
+ },
1373
+ "171": {
1374
+ "content": "<thead>",
1375
+ "lstrip": false,
1376
+ "normalized": false,
1377
+ "rstrip": false,
1378
+ "single_word": false,
1379
+ "special": false
1380
+ },
1381
+ "172": {
1382
+ "content": "<tbody>",
1383
+ "lstrip": false,
1384
+ "normalized": false,
1385
+ "rstrip": false,
1386
+ "single_word": false,
1387
+ "special": false
1388
+ },
1389
+ "173": {
1390
+ "content": "<tfoot>",
1391
+ "lstrip": false,
1392
+ "normalized": false,
1393
+ "rstrip": false,
1394
+ "single_word": false,
1395
+ "special": false
1396
+ },
1397
+ "174": {
1398
+ "content": "<tr>",
1399
+ "lstrip": false,
1400
+ "normalized": false,
1401
+ "rstrip": false,
1402
+ "single_word": false,
1403
+ "special": false
1404
+ },
1405
+ "175": {
1406
+ "content": "<th>",
1407
+ "lstrip": false,
1408
+ "normalized": false,
1409
+ "rstrip": false,
1410
+ "single_word": false,
1411
+ "special": false
1412
+ },
1413
+ "176": {
1414
+ "content": "<td>",
1415
+ "lstrip": false,
1416
+ "normalized": false,
1417
+ "rstrip": false,
1418
+ "single_word": false,
1419
+ "special": false
1420
+ },
1421
+ "177": {
1422
+ "content": "</table>",
1423
+ "lstrip": false,
1424
+ "normalized": false,
1425
+ "rstrip": false,
1426
+ "single_word": false,
1427
+ "special": false
1428
+ },
1429
+ "178": {
1430
+ "content": "</caption>",
1431
+ "lstrip": false,
1432
+ "normalized": false,
1433
+ "rstrip": false,
1434
+ "single_word": false,
1435
+ "special": false
1436
+ },
1437
+ "179": {
1438
+ "content": "</thead>",
1439
+ "lstrip": false,
1440
+ "normalized": false,
1441
+ "rstrip": false,
1442
+ "single_word": false,
1443
+ "special": false
1444
+ },
1445
+ "180": {
1446
+ "content": "</tbody>",
1447
+ "lstrip": false,
1448
+ "normalized": false,
1449
+ "rstrip": false,
1450
+ "single_word": false,
1451
+ "special": false
1452
+ },
1453
+ "181": {
1454
+ "content": "</tfoot>",
1455
+ "lstrip": false,
1456
+ "normalized": false,
1457
+ "rstrip": false,
1458
+ "single_word": false,
1459
+ "special": false
1460
+ },
1461
+ "182": {
1462
+ "content": "</tr>",
1463
+ "lstrip": false,
1464
+ "normalized": false,
1465
+ "rstrip": false,
1466
+ "single_word": false,
1467
+ "special": false
1468
+ },
1469
+ "183": {
1470
+ "content": "</th>",
1471
+ "lstrip": false,
1472
+ "normalized": false,
1473
+ "rstrip": false,
1474
+ "single_word": false,
1475
+ "special": false
1476
+ },
1477
+ "184": {
1478
+ "content": "</td>",
1479
+ "lstrip": false,
1480
+ "normalized": false,
1481
+ "rstrip": false,
1482
+ "single_word": false,
1483
+ "special": false
1484
+ },
1485
+ "185": {
1486
+ "content": "<h1>",
1487
+ "lstrip": false,
1488
+ "normalized": false,
1489
+ "rstrip": false,
1490
+ "single_word": false,
1491
+ "special": false
1492
+ },
1493
+ "186": {
1494
+ "content": "<h2>",
1495
+ "lstrip": false,
1496
+ "normalized": false,
1497
+ "rstrip": false,
1498
+ "single_word": false,
1499
+ "special": false
1500
+ },
1501
+ "187": {
1502
+ "content": "<h3>",
1503
+ "lstrip": false,
1504
+ "normalized": false,
1505
+ "rstrip": false,
1506
+ "single_word": false,
1507
+ "special": false
1508
+ },
1509
+ "188": {
1510
+ "content": "<h4>",
1511
+ "lstrip": false,
1512
+ "normalized": false,
1513
+ "rstrip": false,
1514
+ "single_word": false,
1515
+ "special": false
1516
+ },
1517
+ "189": {
1518
+ "content": "<h5>",
1519
+ "lstrip": false,
1520
+ "normalized": false,
1521
+ "rstrip": false,
1522
+ "single_word": false,
1523
+ "special": false
1524
+ },
1525
+ "190": {
1526
+ "content": "<h6>",
1527
+ "lstrip": false,
1528
+ "normalized": false,
1529
+ "rstrip": false,
1530
+ "single_word": false,
1531
+ "special": false
1532
+ },
1533
+ "191": {
1534
+ "content": "<blockquote>",
1535
+ "lstrip": false,
1536
+ "normalized": false,
1537
+ "rstrip": false,
1538
+ "single_word": false,
1539
+ "special": false
1540
+ },
1541
+ "192": {
1542
+ "content": "</h1>",
1543
+ "lstrip": false,
1544
+ "normalized": false,
1545
+ "rstrip": false,
1546
+ "single_word": false,
1547
+ "special": false
1548
+ },
1549
+ "193": {
1550
+ "content": "</h2>",
1551
+ "lstrip": false,
1552
+ "normalized": false,
1553
+ "rstrip": false,
1554
+ "single_word": false,
1555
+ "special": false
1556
+ },
1557
+ "194": {
1558
+ "content": "</h3>",
1559
+ "lstrip": false,
1560
+ "normalized": false,
1561
+ "rstrip": false,
1562
+ "single_word": false,
1563
+ "special": false
1564
+ },
1565
+ "195": {
1566
+ "content": "</h4>",
1567
+ "lstrip": false,
1568
+ "normalized": false,
1569
+ "rstrip": false,
1570
+ "single_word": false,
1571
+ "special": false
1572
+ },
1573
+ "196": {
1574
+ "content": "</h5>",
1575
+ "lstrip": false,
1576
+ "normalized": false,
1577
+ "rstrip": false,
1578
+ "single_word": false,
1579
+ "special": false
1580
+ },
1581
+ "197": {
1582
+ "content": "</h6>",
1583
+ "lstrip": false,
1584
+ "normalized": false,
1585
+ "rstrip": false,
1586
+ "single_word": false,
1587
+ "special": false
1588
+ },
1589
+ "198": {
1590
+ "content": "</blockquote>",
1591
+ "lstrip": false,
1592
+ "normalized": false,
1593
+ "rstrip": false,
1594
+ "single_word": false,
1595
+ "special": false
1596
+ },
1597
+ "199": {
1598
+ "content": "<strong>",
1599
+ "lstrip": false,
1600
+ "normalized": false,
1601
+ "rstrip": false,
1602
+ "single_word": false,
1603
+ "special": false
1604
+ },
1605
+ "200": {
1606
+ "content": "<em>",
1607
+ "lstrip": false,
1608
+ "normalized": false,
1609
+ "rstrip": false,
1610
+ "single_word": false,
1611
+ "special": false
1612
+ },
1613
+ "201": {
1614
+ "content": "<b>",
1615
+ "lstrip": false,
1616
+ "normalized": false,
1617
+ "rstrip": false,
1618
+ "single_word": false,
1619
+ "special": false
1620
+ },
1621
+ "202": {
1622
+ "content": "<i>",
1623
+ "lstrip": false,
1624
+ "normalized": false,
1625
+ "rstrip": false,
1626
+ "single_word": false,
1627
+ "special": false
1628
+ },
1629
+ "203": {
1630
+ "content": "<u>",
1631
+ "lstrip": false,
1632
+ "normalized": false,
1633
+ "rstrip": false,
1634
+ "single_word": false,
1635
+ "special": false
1636
+ },
1637
+ "204": {
1638
+ "content": "<s>",
1639
+ "lstrip": false,
1640
+ "normalized": false,
1641
+ "rstrip": false,
1642
+ "single_word": false,
1643
+ "special": false
1644
+ },
1645
+ "205": {
1646
+ "content": "<sub>",
1647
+ "lstrip": false,
1648
+ "normalized": false,
1649
+ "rstrip": false,
1650
+ "single_word": false,
1651
+ "special": false
1652
+ },
1653
+ "206": {
1654
+ "content": "<sup>",
1655
+ "lstrip": false,
1656
+ "normalized": false,
1657
+ "rstrip": false,
1658
+ "single_word": false,
1659
+ "special": false
1660
+ },
1661
+ "207": {
1662
+ "content": "<code>",
1663
+ "lstrip": false,
1664
+ "normalized": false,
1665
+ "rstrip": false,
1666
+ "single_word": false,
1667
+ "special": false
1668
+ },
1669
+ "208": {
1670
+ "content": "</strong>",
1671
+ "lstrip": false,
1672
+ "normalized": false,
1673
+ "rstrip": false,
1674
+ "single_word": false,
1675
+ "special": false
1676
+ },
1677
+ "209": {
1678
+ "content": "</em>",
1679
+ "lstrip": false,
1680
+ "normalized": false,
1681
+ "rstrip": false,
1682
+ "single_word": false,
1683
+ "special": false
1684
+ },
1685
+ "210": {
1686
+ "content": "</b>",
1687
+ "lstrip": false,
1688
+ "normalized": false,
1689
+ "rstrip": false,
1690
+ "single_word": false,
1691
+ "special": false
1692
+ },
1693
+ "211": {
1694
+ "content": "</i>",
1695
+ "lstrip": false,
1696
+ "normalized": false,
1697
+ "rstrip": false,
1698
+ "single_word": false,
1699
+ "special": false
1700
+ },
1701
+ "212": {
1702
+ "content": "</u>",
1703
+ "lstrip": false,
1704
+ "normalized": false,
1705
+ "rstrip": false,
1706
+ "single_word": false,
1707
+ "special": false
1708
+ },
1709
+ "213": {
1710
+ "content": "</s>",
1711
+ "lstrip": false,
1712
+ "normalized": false,
1713
+ "rstrip": false,
1714
+ "single_word": false,
1715
+ "special": false
1716
+ },
1717
+ "214": {
1718
+ "content": "</sub>",
1719
+ "lstrip": false,
1720
+ "normalized": false,
1721
+ "rstrip": false,
1722
+ "single_word": false,
1723
+ "special": false
1724
+ },
1725
+ "215": {
1726
+ "content": "</sup>",
1727
+ "lstrip": false,
1728
+ "normalized": false,
1729
+ "rstrip": false,
1730
+ "single_word": false,
1731
+ "special": false
1732
+ },
1733
+ "216": {
1734
+ "content": "</code>",
1735
+ "lstrip": false,
1736
+ "normalized": false,
1737
+ "rstrip": false,
1738
+ "single_word": false,
1739
+ "special": false
1740
+ }
1741
+ },
1742
+ "additional_special_tokens": [
1743
+ "<start_of_turn>",
1744
+ "<end_of_turn>"
1745
+ ],
1746
+ "bos_token": "<bos>",
1747
+ "chat_template": "{% if messages[0]['role'] == 'system' %}{% set system_message = messages[0]['content'] %}{% endif %}{% if system_message is defined %}{{ system_message + '\n' }}{% endif %}{% for message in messages %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{{ 'Human: ' + content + '\nAssistant: ' }}{% elif message['role'] == 'assistant' %}{{ content + '<eos>' + '\n' }}{% endif %}{% endfor %}",
1748
+ "clean_up_tokenization_spaces": false,
1749
+ "eos_token": "<eos>",
1750
+ "model_max_length": 1000000000000000019884624838656,
1751
+ "pad_token": "<pad>",
1752
+ "padding_side": "right",
1753
+ "sp_model_kwargs": {},
1754
+ "spaces_between_special_tokens": false,
1755
+ "split_special_tokens": false,
1756
+ "tokenizer_class": "GemmaTokenizer",
1757
+ "unk_token": "<unk>",
1758
+ "use_default_system_prompt": false
1759
+ }
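
The chat_template added in tokenizer_config.json renders conversations in a plain "Human: ... / Assistant: ..." format: an optional leading system message, `Human: ` before each user turn followed by `\nAssistant: `, and `<eos>` appended after each assistant reply. A short sketch of applying it, assuming the checkpoint directory is available locally:

```python
# Sketch: render a conversation with the chat template stored in this checkpoint.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("checkpoint-1000")

messages = [
    {"role": "system", "content": "You are a concise assistant."},
    {"role": "user", "content": "Summarize LoRA in one sentence."},
]
text = tokenizer.apply_chat_template(messages, tokenize=False)
print(text)
# Expected output per the template:
# You are a concise assistant.
# Human: Summarize LoRA in one sentence.
# Assistant:
```

Note that the template itself ends each user turn with `Assistant: `, so no separate generation-prompt flag is needed before calling `generate`.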
checkpoint-1000/trainer_state.json ADDED
@@ -0,0 +1,1565 @@
1
+ {
2
+ "best_metric": null,
3
+ "best_model_checkpoint": null,
4
+ "epoch": 0.15488867376573087,
5
+ "eval_steps": 500,
6
+ "global_step": 1000,
7
+ "is_hyper_param_search": false,
8
+ "is_local_process_zero": true,
9
+ "is_world_process_zero": true,
10
+ "log_history": [
11
+ {
12
+ "epoch": 0.001548886737657309,
13
+ "grad_norm": 0.3146345317363739,
14
+ "learning_rate": 1e-08,
15
+ "logits/chosen": -18.652690887451172,
16
+ "logits/rejected": -18.62572479248047,
17
+ "logps/chosen": -2.296290636062622,
18
+ "logps/rejected": -2.2631335258483887,
19
+ "loss": 1.0033,
20
+ "rewards/accuracies": 0.47187501192092896,
21
+ "rewards/chosen": -2.296290636062622,
22
+ "rewards/margins": -0.03315729647874832,
23
+ "rewards/rejected": -2.2631335258483887,
24
+ "step": 10
25
+ },
26
+ {
27
+ "epoch": 0.003097773475314618,
28
+ "grad_norm": 1.226308822631836,
29
+ "learning_rate": 2e-08,
30
+ "logits/chosen": -18.68000030517578,
31
+ "logits/rejected": -18.688858032226562,
32
+ "logps/chosen": -2.357570171356201,
33
+ "logps/rejected": -2.3560101985931396,
34
+ "loss": 0.9862,
35
+ "rewards/accuracies": 0.46562498807907104,
36
+ "rewards/chosen": -2.357570171356201,
37
+ "rewards/margins": -0.001559938071295619,
38
+ "rewards/rejected": -2.3560101985931396,
39
+ "step": 20
40
+ },
41
+ {
42
+ "epoch": 0.004646660212971926,
43
+ "grad_norm": 0.37734341621398926,
44
+ "learning_rate": 3e-08,
45
+ "logits/chosen": -19.277925491333008,
46
+ "logits/rejected": -19.275354385375977,
47
+ "logps/chosen": -2.291140556335449,
48
+ "logps/rejected": -2.275416135787964,
49
+ "loss": 0.9911,
50
+ "rewards/accuracies": 0.43437498807907104,
51
+ "rewards/chosen": -2.291140556335449,
52
+ "rewards/margins": -0.015724267810583115,
53
+ "rewards/rejected": -2.275416135787964,
54
+ "step": 30
55
+ },
56
+ {
57
+ "epoch": 0.006195546950629236,
58
+ "grad_norm": 0.7164713144302368,
59
+ "learning_rate": 4e-08,
60
+ "logits/chosen": -18.86441421508789,
61
+ "logits/rejected": -18.846973419189453,
62
+ "logps/chosen": -2.2503223419189453,
63
+ "logps/rejected": -2.242508888244629,
64
+ "loss": 0.9863,
65
+ "rewards/accuracies": 0.484375,
66
+ "rewards/chosen": -2.2503223419189453,
67
+ "rewards/margins": -0.007813749834895134,
68
+ "rewards/rejected": -2.242508888244629,
69
+ "step": 40
70
+ },
71
+ {
72
+ "epoch": 0.007744433688286544,
73
+ "grad_norm": 0.35502341389656067,
74
+ "learning_rate": 5e-08,
75
+ "logits/chosen": -18.388294219970703,
76
+ "logits/rejected": -18.373645782470703,
77
+ "logps/chosen": -2.357111692428589,
78
+ "logps/rejected": -2.3261992931365967,
79
+ "loss": 1.0016,
80
+ "rewards/accuracies": 0.503125011920929,
81
+ "rewards/chosen": -2.357111692428589,
82
+ "rewards/margins": -0.030912142246961594,
83
+ "rewards/rejected": -2.3261992931365967,
84
+ "step": 50
85
+ },
86
+ {
87
+ "epoch": 0.009293320425943852,
88
+ "grad_norm": 3.113129138946533,
89
+ "learning_rate": 6e-08,
90
+ "logits/chosen": -18.67694664001465,
91
+ "logits/rejected": -18.65276336669922,
92
+ "logps/chosen": -2.307683229446411,
93
+ "logps/rejected": -2.280695915222168,
94
+ "loss": 0.9993,
95
+ "rewards/accuracies": 0.46875,
96
+ "rewards/chosen": -2.307683229446411,
97
+ "rewards/margins": -0.026987096294760704,
98
+ "rewards/rejected": -2.280695915222168,
99
+ "step": 60
100
+ },
101
+ {
102
+ "epoch": 0.010842207163601162,
103
+ "grad_norm": 0.3706030547618866,
104
+ "learning_rate": 7e-08,
105
+ "logits/chosen": -18.240503311157227,
106
+ "logits/rejected": -18.231136322021484,
107
+ "logps/chosen": -2.3145792484283447,
108
+ "logps/rejected": -2.266110897064209,
109
+ "loss": 1.0125,
110
+ "rewards/accuracies": 0.453125,
111
+ "rewards/chosen": -2.3145792484283447,
112
+ "rewards/margins": -0.04846823215484619,
113
+ "rewards/rejected": -2.266110897064209,
114
+ "step": 70
115
+ },
116
+ {
117
+ "epoch": 0.012391093901258471,
118
+ "grad_norm": 0.8700842261314392,
119
+ "learning_rate": 8e-08,
120
+ "logits/chosen": -18.302425384521484,
121
+ "logits/rejected": -18.337112426757812,
122
+ "logps/chosen": -2.2703537940979004,
123
+ "logps/rejected": -2.247792959213257,
124
+ "loss": 0.9962,
125
+ "rewards/accuracies": 0.4625000059604645,
126
+ "rewards/chosen": -2.2703537940979004,
127
+ "rewards/margins": -0.022560495883226395,
128
+ "rewards/rejected": -2.247792959213257,
129
+ "step": 80
130
+ },
131
+ {
132
+ "epoch": 0.013939980638915779,
133
+ "grad_norm": 1.540815830230713,
134
+ "learning_rate": 9e-08,
135
+ "logits/chosen": -18.31792640686035,
136
+ "logits/rejected": -18.348609924316406,
137
+ "logps/chosen": -2.385634660720825,
138
+ "logps/rejected": -2.357053279876709,
139
+ "loss": 1.0,
140
+ "rewards/accuracies": 0.4437499940395355,
141
+ "rewards/chosen": -2.385634660720825,
142
+ "rewards/margins": -0.02858111262321472,
143
+ "rewards/rejected": -2.357053279876709,
144
+ "step": 90
145
+ },
146
+ {
147
+ "epoch": 0.015488867376573089,
148
+ "grad_norm": 0.36140134930610657,
149
+ "learning_rate": 1e-07,
150
+ "logits/chosen": -18.837753295898438,
151
+ "logits/rejected": -18.833690643310547,
152
+ "logps/chosen": -2.2959933280944824,
153
+ "logps/rejected": -2.287949323654175,
154
+ "loss": 0.9906,
155
+ "rewards/accuracies": 0.484375,
156
+ "rewards/chosen": -2.2959933280944824,
157
+ "rewards/margins": -0.008043941110372543,
158
+ "rewards/rejected": -2.287949323654175,
159
+ "step": 100
160
+ },
161
+ {
162
+ "epoch": 0.017037754114230398,
163
+ "grad_norm": 0.4019833505153656,
164
+ "learning_rate": 1.0999999999999999e-07,
165
+ "logits/chosen": -19.038759231567383,
166
+ "logits/rejected": -19.039684295654297,
167
+ "logps/chosen": -2.3060922622680664,
168
+ "logps/rejected": -2.290321111679077,
169
+ "loss": 0.9918,
170
+ "rewards/accuracies": 0.49687498807907104,
171
+ "rewards/chosen": -2.3060922622680664,
172
+ "rewards/margins": -0.01577121391892433,
173
+ "rewards/rejected": -2.290321111679077,
174
+ "step": 110
175
+ },
176
+ {
177
+ "epoch": 0.018586640851887704,
178
+ "grad_norm": 0.26196885108947754,
179
+ "learning_rate": 1.2e-07,
180
+ "logits/chosen": -18.49613380432129,
181
+ "logits/rejected": -18.51828384399414,
182
+ "logps/chosen": -2.2891242504119873,
183
+ "logps/rejected": -2.256464719772339,
184
+ "loss": 1.0011,
185
+ "rewards/accuracies": 0.484375,
186
+ "rewards/chosen": -2.2891242504119873,
187
+ "rewards/margins": -0.032659418880939484,
188
+ "rewards/rejected": -2.256464719772339,
189
+ "step": 120
190
+ },
191
+ {
192
+ "epoch": 0.020135527589545014,
193
+ "grad_norm": 0.4448695778846741,
194
+ "learning_rate": 1.3e-07,
195
+ "logits/chosen": -18.244447708129883,
196
+ "logits/rejected": -18.261150360107422,
197
+ "logps/chosen": -2.321626901626587,
198
+ "logps/rejected": -2.27813720703125,
199
+ "loss": 1.0088,
200
+ "rewards/accuracies": 0.44999998807907104,
201
+ "rewards/chosen": -2.321626901626587,
202
+ "rewards/margins": -0.04348985478281975,
203
+ "rewards/rejected": -2.27813720703125,
204
+ "step": 130
205
+ },
206
+ {
207
+ "epoch": 0.021684414327202323,
208
+ "grad_norm": 0.4014554023742676,
209
+ "learning_rate": 1.4e-07,
210
+ "logits/chosen": -18.389209747314453,
211
+ "logits/rejected": -18.408641815185547,
212
+ "logps/chosen": -2.30610990524292,
213
+ "logps/rejected": -2.285956859588623,
214
+ "loss": 0.9946,
215
+ "rewards/accuracies": 0.4437499940395355,
216
+ "rewards/chosen": -2.30610990524292,
217
+ "rewards/margins": -0.020152900367975235,
218
+ "rewards/rejected": -2.285956859588623,
219
+ "step": 140
220
+ },
221
+ {
222
+ "epoch": 0.023233301064859633,
223
+ "grad_norm": 1.7483704090118408,
224
+ "learning_rate": 1.5e-07,
225
+ "logits/chosen": -19.08399200439453,
226
+ "logits/rejected": -19.081613540649414,
227
+ "logps/chosen": -2.2456228733062744,
228
+ "logps/rejected": -2.2097201347351074,
229
+ "loss": 1.0024,
230
+ "rewards/accuracies": 0.43437498807907104,
231
+ "rewards/chosen": -2.2456228733062744,
232
+ "rewards/margins": -0.03590259328484535,
233
+ "rewards/rejected": -2.2097201347351074,
234
+ "step": 150
235
+ },
236
+ {
237
+ "epoch": 0.024782187802516942,
238
+ "grad_norm": 0.5051783919334412,
239
+ "learning_rate": 1.6e-07,
240
+ "logits/chosen": -18.577585220336914,
241
+ "logits/rejected": -18.57271385192871,
242
+ "logps/chosen": -2.315284013748169,
243
+ "logps/rejected": -2.290706157684326,
244
+ "loss": 0.9978,
245
+ "rewards/accuracies": 0.4625000059604645,
246
+ "rewards/chosen": -2.315284013748169,
247
+ "rewards/margins": -0.024577850475907326,
248
+ "rewards/rejected": -2.290706157684326,
249
+ "step": 160
250
+ },
251
+ {
252
+ "epoch": 0.02633107454017425,
253
+ "grad_norm": 0.34279578924179077,
254
+ "learning_rate": 1.7000000000000001e-07,
255
+ "logits/chosen": -18.236188888549805,
256
+ "logits/rejected": -18.241741180419922,
257
+ "logps/chosen": -2.2755513191223145,
258
+ "logps/rejected": -2.246842384338379,
259
+ "loss": 1.0017,
260
+ "rewards/accuracies": 0.4593749940395355,
261
+ "rewards/chosen": -2.2755513191223145,
262
+ "rewards/margins": -0.028709029778838158,
263
+ "rewards/rejected": -2.246842384338379,
264
+ "step": 170
265
+ },
266
+ {
267
+ "epoch": 0.027879961277831558,
268
+ "grad_norm": 0.4080793559551239,
269
+ "learning_rate": 1.8e-07,
270
+ "logits/chosen": -18.710674285888672,
271
+ "logits/rejected": -18.72400665283203,
272
+ "logps/chosen": -2.305172920227051,
273
+ "logps/rejected": -2.292833089828491,
274
+ "loss": 0.9899,
275
+ "rewards/accuracies": 0.5218750238418579,
276
+ "rewards/chosen": -2.305172920227051,
277
+ "rewards/margins": -0.012339852750301361,
278
+ "rewards/rejected": -2.292833089828491,
279
+ "step": 180
280
+ },
281
+ {
282
+ "epoch": 0.029428848015488868,
283
+ "grad_norm": 0.405762642621994,
284
+ "learning_rate": 1.8999999999999998e-07,
285
+ "logits/chosen": -18.450515747070312,
286
+ "logits/rejected": -18.441322326660156,
287
+ "logps/chosen": -2.3555941581726074,
288
+ "logps/rejected": -2.3243470191955566,
289
+ "loss": 1.0001,
290
+ "rewards/accuracies": 0.45625001192092896,
291
+ "rewards/chosen": -2.3555941581726074,
292
+ "rewards/margins": -0.031247194856405258,
293
+ "rewards/rejected": -2.3243470191955566,
294
+ "step": 190
295
+ },
296
+ {
297
+ "epoch": 0.030977734753146177,
298
+ "grad_norm": 0.6220257878303528,
299
+ "learning_rate": 2e-07,
300
+ "logits/chosen": -18.348241806030273,
301
+ "logits/rejected": -18.33614158630371,
302
+ "logps/chosen": -2.34555983543396,
303
+ "logps/rejected": -2.3049354553222656,
304
+ "loss": 1.006,
305
+ "rewards/accuracies": 0.42500001192092896,
306
+ "rewards/chosen": -2.34555983543396,
307
+ "rewards/margins": -0.04062436893582344,
308
+ "rewards/rejected": -2.3049354553222656,
309
+ "step": 200
310
+ },
311
+ {
312
+ "epoch": 0.03252662149080349,
313
+ "grad_norm": 3.5855581760406494,
314
+ "learning_rate": 2.0999999999999997e-07,
315
+ "logits/chosen": -18.243648529052734,
316
+ "logits/rejected": -18.23639678955078,
317
+ "logps/chosen": -2.3692617416381836,
318
+ "logps/rejected": -2.3047068119049072,
319
+ "loss": 1.0245,
320
+ "rewards/accuracies": 0.42500001192092896,
321
+ "rewards/chosen": -2.3692617416381836,
322
+ "rewards/margins": -0.06455513834953308,
323
+ "rewards/rejected": -2.3047068119049072,
324
+ "step": 210
325
+ },
326
+ {
327
+ "epoch": 0.034075508228460796,
328
+ "grad_norm": 0.6635384559631348,
329
+ "learning_rate": 2.1999999999999998e-07,
330
+ "logits/chosen": -18.2818603515625,
331
+ "logits/rejected": -18.27474594116211,
332
+ "logps/chosen": -2.364041566848755,
333
+ "logps/rejected": -2.36596417427063,
334
+ "loss": 0.9793,
335
+ "rewards/accuracies": 0.512499988079071,
336
+ "rewards/chosen": -2.364041566848755,
337
+ "rewards/margins": 0.001922703580930829,
338
+ "rewards/rejected": -2.36596417427063,
339
+ "step": 220
340
+ },
341
+ {
342
+ "epoch": 0.035624394966118106,
343
+ "grad_norm": 0.3510674834251404,
344
+ "learning_rate": 2.3e-07,
345
+ "logits/chosen": -18.33125877380371,
346
+ "logits/rejected": -18.33574104309082,
347
+ "logps/chosen": -2.287745475769043,
348
+ "logps/rejected": -2.279340982437134,
349
+ "loss": 0.9888,
350
+ "rewards/accuracies": 0.46875,
351
+ "rewards/chosen": -2.287745475769043,
352
+ "rewards/margins": -0.008404545485973358,
353
+ "rewards/rejected": -2.279340982437134,
354
+ "step": 230
355
+ },
356
+ {
357
+ "epoch": 0.03717328170377541,
358
+ "grad_norm": 0.37024492025375366,
359
+ "learning_rate": 2.4e-07,
360
+ "logits/chosen": -18.695646286010742,
361
+ "logits/rejected": -18.679492950439453,
362
+ "logps/chosen": -2.264132022857666,
363
+ "logps/rejected": -2.238377809524536,
364
+ "loss": 0.9979,
365
+ "rewards/accuracies": 0.4749999940395355,
366
+ "rewards/chosen": -2.264132022857666,
367
+ "rewards/margins": -0.02575414814054966,
368
+ "rewards/rejected": -2.238377809524536,
369
+ "step": 240
370
+ },
371
+ {
372
+ "epoch": 0.03872216844143272,
373
+ "grad_norm": 0.3906606435775757,
374
+ "learning_rate": 2.5e-07,
375
+ "logits/chosen": -18.439117431640625,
376
+ "logits/rejected": -18.4523983001709,
377
+ "logps/chosen": -2.366199493408203,
378
+ "logps/rejected": -2.340691089630127,
379
+ "loss": 1.0002,
380
+ "rewards/accuracies": 0.4781250059604645,
381
+ "rewards/chosen": -2.366199493408203,
382
+ "rewards/margins": -0.025508452206850052,
383
+ "rewards/rejected": -2.340691089630127,
384
+ "step": 250
385
+ },
386
+ {
387
+ "epoch": 0.04027105517909003,
388
+ "grad_norm": 0.34524816274642944,
389
+ "learning_rate": 2.6e-07,
390
+ "logits/chosen": -18.767566680908203,
391
+ "logits/rejected": -18.776933670043945,
392
+ "logps/chosen": -2.356963872909546,
393
+ "logps/rejected": -2.3292131423950195,
394
+ "loss": 1.0001,
395
+ "rewards/accuracies": 0.5093749761581421,
396
+ "rewards/chosen": -2.356963872909546,
397
+ "rewards/margins": -0.027750641107559204,
398
+ "rewards/rejected": -2.3292131423950195,
399
+ "step": 260
400
+ },
401
+ {
402
+ "epoch": 0.04181994191674734,
403
+ "grad_norm": 0.42237332463264465,
404
+ "learning_rate": 2.7e-07,
405
+ "logits/chosen": -18.576757431030273,
406
+ "logits/rejected": -18.59623146057129,
407
+ "logps/chosen": -2.3094825744628906,
408
+ "logps/rejected": -2.2776551246643066,
409
+ "loss": 1.0025,
410
+ "rewards/accuracies": 0.4749999940395355,
411
+ "rewards/chosen": -2.3094825744628906,
412
+ "rewards/margins": -0.03182785585522652,
413
+ "rewards/rejected": -2.2776551246643066,
414
+ "step": 270
415
+ },
416
+ {
417
+ "epoch": 0.04336882865440465,
418
+ "grad_norm": 1.5920014381408691,
419
+ "learning_rate": 2.8e-07,
420
+ "logits/chosen": -18.295106887817383,
421
+ "logits/rejected": -18.32059669494629,
422
+ "logps/chosen": -2.4310879707336426,
423
+ "logps/rejected": -2.3889243602752686,
424
+ "loss": 1.0103,
425
+ "rewards/accuracies": 0.4437499940395355,
426
+ "rewards/chosen": -2.4310879707336426,
427
+ "rewards/margins": -0.0421634204685688,
428
+ "rewards/rejected": -2.3889243602752686,
429
+ "step": 280
430
+ },
431
+ {
432
+ "epoch": 0.044917715392061956,
433
+ "grad_norm": 0.3145431876182556,
434
+ "learning_rate": 2.9e-07,
435
+ "logits/chosen": -18.42379379272461,
436
+ "logits/rejected": -18.42411231994629,
437
+ "logps/chosen": -2.281630039215088,
438
+ "logps/rejected": -2.2772393226623535,
439
+ "loss": 0.9853,
440
+ "rewards/accuracies": 0.518750011920929,
441
+ "rewards/chosen": -2.281630039215088,
442
+ "rewards/margins": -0.004390752874314785,
443
+ "rewards/rejected": -2.2772393226623535,
444
+ "step": 290
445
+ },
446
+ {
447
+ "epoch": 0.046466602129719266,
448
+ "grad_norm": 0.4161577522754669,
449
+ "learning_rate": 3e-07,
450
+ "logits/chosen": -18.23064613342285,
451
+ "logits/rejected": -18.205217361450195,
452
+ "logps/chosen": -2.368056535720825,
453
+ "logps/rejected": -2.334916114807129,
454
+ "loss": 1.0055,
455
+ "rewards/accuracies": 0.4781250059604645,
456
+ "rewards/chosen": -2.368056535720825,
457
+ "rewards/margins": -0.0331403985619545,
458
+ "rewards/rejected": -2.334916114807129,
459
+ "step": 300
460
+ },
461
+ {
462
+ "epoch": 0.048015488867376575,
463
+ "grad_norm": 0.5838750600814819,
464
+ "learning_rate": 3.1e-07,
465
+ "logits/chosen": -18.233129501342773,
466
+ "logits/rejected": -18.22580337524414,
467
+ "logps/chosen": -2.4445388317108154,
468
+ "logps/rejected": -2.4140288829803467,
469
+ "loss": 1.0023,
470
+ "rewards/accuracies": 0.48124998807907104,
471
+ "rewards/chosen": -2.4445388317108154,
472
+ "rewards/margins": -0.03050977550446987,
473
+ "rewards/rejected": -2.4140288829803467,
474
+ "step": 310
475
+ },
476
+ {
477
+ "epoch": 0.049564375605033885,
478
+ "grad_norm": 1.2753629684448242,
479
+ "learning_rate": 3.2e-07,
480
+ "logits/chosen": -18.576799392700195,
481
+ "logits/rejected": -18.594409942626953,
482
+ "logps/chosen": -2.307619094848633,
483
+ "logps/rejected": -2.2950479984283447,
484
+ "loss": 0.9892,
485
+ "rewards/accuracies": 0.46562498807907104,
486
+ "rewards/chosen": -2.307619094848633,
487
+ "rewards/margins": -0.012571193277835846,
488
+ "rewards/rejected": -2.2950479984283447,
489
+ "step": 320
490
+ },
491
+ {
492
+ "epoch": 0.05111326234269119,
493
+ "grad_norm": 0.45568475127220154,
494
+ "learning_rate": 3.3e-07,
495
+ "logits/chosen": -18.394086837768555,
496
+ "logits/rejected": -18.384502410888672,
497
+ "logps/chosen": -2.3357903957366943,
498
+ "logps/rejected": -2.3171801567077637,
499
+ "loss": 0.9936,
500
+ "rewards/accuracies": 0.4749999940395355,
501
+ "rewards/chosen": -2.3357903957366943,
502
+ "rewards/margins": -0.018610065802931786,
503
+ "rewards/rejected": -2.3171801567077637,
504
+ "step": 330
505
+ },
506
+ {
507
+ "epoch": 0.0526621490803485,
508
+ "grad_norm": 0.33304306864738464,
509
+ "learning_rate": 3.4000000000000003e-07,
510
+ "logits/chosen": -18.469026565551758,
511
+ "logits/rejected": -18.481182098388672,
512
+ "logps/chosen": -2.3236474990844727,
513
+ "logps/rejected": -2.2752368450164795,
514
+ "loss": 1.0125,
515
+ "rewards/accuracies": 0.45625001192092896,
516
+ "rewards/chosen": -2.3236474990844727,
517
+ "rewards/margins": -0.04841040447354317,
518
+ "rewards/rejected": -2.2752368450164795,
519
+ "step": 340
520
+ },
521
+ {
522
+ "epoch": 0.05421103581800581,
523
+ "grad_norm": 0.36008933186531067,
524
+ "learning_rate": 3.5e-07,
525
+ "logits/chosen": -18.52065658569336,
526
+ "logits/rejected": -18.487810134887695,
527
+ "logps/chosen": -2.344296455383301,
528
+ "logps/rejected": -2.322968006134033,
529
+ "loss": 0.9961,
530
+ "rewards/accuracies": 0.47187501192092896,
531
+ "rewards/chosen": -2.344296455383301,
532
+ "rewards/margins": -0.021328354254364967,
533
+ "rewards/rejected": -2.322968006134033,
534
+ "step": 350
535
+ },
536
+ {
537
+ "epoch": 0.055759922555663116,
538
+ "grad_norm": 0.29652485251426697,
539
+ "learning_rate": 3.6e-07,
540
+ "logits/chosen": -18.81740951538086,
541
+ "logits/rejected": -18.82996940612793,
542
+ "logps/chosen": -2.355247974395752,
543
+ "logps/rejected": -2.290473461151123,
544
+ "loss": 1.0234,
545
+ "rewards/accuracies": 0.3843750059604645,
546
+ "rewards/chosen": -2.355247974395752,
547
+ "rewards/margins": -0.06477431207895279,
548
+ "rewards/rejected": -2.290473461151123,
549
+ "step": 360
550
+ },
551
+ {
552
+ "epoch": 0.057308809293320426,
553
+ "grad_norm": 0.30988508462905884,
554
+ "learning_rate": 3.7e-07,
555
+ "logits/chosen": -18.528207778930664,
556
+ "logits/rejected": -18.5264835357666,
557
+ "logps/chosen": -2.332744836807251,
558
+ "logps/rejected": -2.300027847290039,
559
+ "loss": 1.0028,
560
+ "rewards/accuracies": 0.4625000059604645,
561
+ "rewards/chosen": -2.332744836807251,
562
+ "rewards/margins": -0.0327170193195343,
563
+ "rewards/rejected": -2.300027847290039,
564
+ "step": 370
565
+ },
566
+ {
567
+ "epoch": 0.058857696030977735,
568
+ "grad_norm": 0.4165166914463043,
569
+ "learning_rate": 3.7999999999999996e-07,
570
+ "logits/chosen": -18.891124725341797,
571
+ "logits/rejected": -18.909482955932617,
572
+ "logps/chosen": -2.3080811500549316,
573
+ "logps/rejected": -2.259889841079712,
574
+ "loss": 1.0112,
575
+ "rewards/accuracies": 0.4593749940395355,
576
+ "rewards/chosen": -2.3080811500549316,
577
+ "rewards/margins": -0.04819165915250778,
578
+ "rewards/rejected": -2.259889841079712,
579
+ "step": 380
580
+ },
581
+ {
582
+ "epoch": 0.060406582768635045,
583
+ "grad_norm": 1.0477248430252075,
584
+ "learning_rate": 3.8999999999999997e-07,
585
+ "logits/chosen": -18.784969329833984,
586
+ "logits/rejected": -18.758953094482422,
587
+ "logps/chosen": -2.2978055477142334,
588
+ "logps/rejected": -2.2670950889587402,
589
+ "loss": 1.0005,
590
+ "rewards/accuracies": 0.4625000059604645,
591
+ "rewards/chosen": -2.2978055477142334,
592
+ "rewards/margins": -0.030710700899362564,
593
+ "rewards/rejected": -2.2670950889587402,
594
+ "step": 390
595
+ },
596
+ {
597
+ "epoch": 0.061955469506292354,
598
+ "grad_norm": 0.4697420001029968,
599
+ "learning_rate": 4e-07,
600
+ "logits/chosen": -18.358470916748047,
601
+ "logits/rejected": -18.33217430114746,
602
+ "logps/chosen": -2.4158871173858643,
603
+ "logps/rejected": -2.3744359016418457,
604
+ "loss": 1.0079,
605
+ "rewards/accuracies": 0.46562498807907104,
606
+ "rewards/chosen": -2.4158871173858643,
607
+ "rewards/margins": -0.041451383382081985,
608
+ "rewards/rejected": -2.3744359016418457,
609
+ "step": 400
610
+ },
611
+ {
612
+ "epoch": 0.06350435624394966,
613
+ "grad_norm": 0.3874570429325104,
614
+ "learning_rate": 4.0999999999999994e-07,
615
+ "logits/chosen": -18.423263549804688,
616
+ "logits/rejected": -18.414335250854492,
617
+ "logps/chosen": -2.274900436401367,
618
+ "logps/rejected": -2.2501485347747803,
619
+ "loss": 0.9968,
620
+ "rewards/accuracies": 0.503125011920929,
621
+ "rewards/chosen": -2.274900436401367,
622
+ "rewards/margins": -0.024752041324973106,
623
+ "rewards/rejected": -2.2501485347747803,
624
+ "step": 410
625
+ },
626
+ {
627
+ "epoch": 0.06505324298160697,
628
+ "grad_norm": 0.2982409596443176,
629
+ "learning_rate": 4.1999999999999995e-07,
630
+ "logits/chosen": -18.789409637451172,
631
+ "logits/rejected": -18.804325103759766,
632
+ "logps/chosen": -2.2870326042175293,
633
+ "logps/rejected": -2.2827885150909424,
634
+ "loss": 0.9837,
635
+ "rewards/accuracies": 0.5062500238418579,
636
+ "rewards/chosen": -2.2870326042175293,
637
+ "rewards/margins": -0.0042442576959729195,
638
+ "rewards/rejected": -2.2827885150909424,
639
+ "step": 420
640
+ },
641
+ {
642
+ "epoch": 0.06660212971926428,
643
+ "grad_norm": 0.3118441700935364,
644
+ "learning_rate": 4.2999999999999996e-07,
645
+ "logits/chosen": -18.53753662109375,
646
+ "logits/rejected": -18.54221534729004,
647
+ "logps/chosen": -2.3336539268493652,
648
+ "logps/rejected": -2.318925142288208,
649
+ "loss": 0.9933,
650
+ "rewards/accuracies": 0.5,
651
+ "rewards/chosen": -2.3336539268493652,
652
+ "rewards/margins": -0.014728927984833717,
653
+ "rewards/rejected": -2.318925142288208,
654
+ "step": 430
655
+ },
656
+ {
657
+ "epoch": 0.06815101645692159,
658
+ "grad_norm": 0.5136622786521912,
659
+ "learning_rate": 4.3999999999999997e-07,
660
+ "logits/chosen": -18.413028717041016,
661
+ "logits/rejected": -18.402856826782227,
662
+ "logps/chosen": -2.275359630584717,
663
+ "logps/rejected": -2.2723371982574463,
664
+ "loss": 0.982,
665
+ "rewards/accuracies": 0.5218750238418579,
666
+ "rewards/chosen": -2.275359630584717,
667
+ "rewards/margins": -0.003022759687155485,
668
+ "rewards/rejected": -2.2723371982574463,
669
+ "step": 440
670
+ },
671
+ {
672
+ "epoch": 0.0696999031945789,
673
+ "grad_norm": 0.31097057461738586,
674
+ "learning_rate": 4.5e-07,
675
+ "logits/chosen": -18.888200759887695,
676
+ "logits/rejected": -18.88901138305664,
677
+ "logps/chosen": -2.2446212768554688,
678
+ "logps/rejected": -2.2096800804138184,
679
+ "loss": 1.0031,
680
+ "rewards/accuracies": 0.453125,
681
+ "rewards/chosen": -2.2446212768554688,
682
+ "rewards/margins": -0.03494144231081009,
683
+ "rewards/rejected": -2.2096800804138184,
684
+ "step": 450
685
+ },
686
+ {
687
+ "epoch": 0.07124878993223621,
688
+ "grad_norm": 0.27818989753723145,
689
+ "learning_rate": 4.6e-07,
690
+ "logits/chosen": -18.227222442626953,
691
+ "logits/rejected": -18.229137420654297,
692
+ "logps/chosen": -2.3725333213806152,
693
+ "logps/rejected": -2.352210521697998,
694
+ "loss": 0.9959,
695
+ "rewards/accuracies": 0.4937500059604645,
696
+ "rewards/chosen": -2.3725333213806152,
697
+ "rewards/margins": -0.020322900265455246,
698
+ "rewards/rejected": -2.352210521697998,
699
+ "step": 460
700
+ },
701
+ {
702
+ "epoch": 0.07279767666989351,
703
+ "grad_norm": 0.5283552408218384,
704
+ "learning_rate": 4.6999999999999995e-07,
705
+ "logits/chosen": -18.395034790039062,
706
+ "logits/rejected": -18.404457092285156,
707
+ "logps/chosen": -2.355647325515747,
708
+ "logps/rejected": -2.332040309906006,
709
+ "loss": 0.9973,
710
+ "rewards/accuracies": 0.4749999940395355,
711
+ "rewards/chosen": -2.355647325515747,
712
+ "rewards/margins": -0.023607196286320686,
713
+ "rewards/rejected": -2.332040309906006,
714
+ "step": 470
715
+ },
716
+ {
717
+ "epoch": 0.07434656340755082,
718
+ "grad_norm": 0.29566776752471924,
719
+ "learning_rate": 4.8e-07,
720
+ "logits/chosen": -18.02447509765625,
721
+ "logits/rejected": -18.034442901611328,
722
+ "logps/chosen": -2.3172950744628906,
723
+ "logps/rejected": -2.287550449371338,
724
+ "loss": 1.0014,
725
+ "rewards/accuracies": 0.4749999940395355,
726
+ "rewards/chosen": -2.3172950744628906,
727
+ "rewards/margins": -0.029744494706392288,
728
+ "rewards/rejected": -2.287550449371338,
729
+ "step": 480
730
+ },
731
+ {
732
+ "epoch": 0.07589545014520813,
733
+ "grad_norm": 0.30026039481163025,
734
+ "learning_rate": 4.9e-07,
735
+ "logits/chosen": -18.442893981933594,
736
+ "logits/rejected": -18.461532592773438,
737
+ "logps/chosen": -2.3118157386779785,
738
+ "logps/rejected": -2.2851450443267822,
739
+ "loss": 0.9989,
740
+ "rewards/accuracies": 0.4593749940395355,
741
+ "rewards/chosen": -2.3118157386779785,
742
+ "rewards/margins": -0.026670396327972412,
743
+ "rewards/rejected": -2.2851450443267822,
744
+ "step": 490
745
+ },
746
+ {
747
+ "epoch": 0.07744433688286544,
748
+ "grad_norm": 0.2541051506996155,
749
+ "learning_rate": 5e-07,
750
+ "logits/chosen": -18.744117736816406,
751
+ "logits/rejected": -18.741052627563477,
752
+ "logps/chosen": -2.282050371170044,
753
+ "logps/rejected": -2.259800910949707,
754
+ "loss": 0.9956,
755
+ "rewards/accuracies": 0.44999998807907104,
756
+ "rewards/chosen": -2.282050371170044,
757
+ "rewards/margins": -0.022249501198530197,
758
+ "rewards/rejected": -2.259800910949707,
759
+ "step": 500
760
+ },
761
+ {
762
+ "epoch": 0.07744433688286544,
763
+ "eval_logits/chosen": -18.397302627563477,
764
+ "eval_logits/rejected": -18.406831741333008,
765
+ "eval_logps/chosen": -2.333314895629883,
766
+ "eval_logps/rejected": -2.30248761177063,
767
+ "eval_loss": 1.0014196634292603,
768
+ "eval_rewards/accuracies": 0.4621647596359253,
769
+ "eval_rewards/chosen": -2.333314895629883,
770
+ "eval_rewards/margins": -0.030827123671770096,
771
+ "eval_rewards/rejected": -2.30248761177063,
772
+ "eval_runtime": 201.9167,
773
+ "eval_samples_per_second": 10.336,
774
+ "eval_steps_per_second": 5.17,
775
+ "step": 500
776
+ },
777
+ {
778
+ "epoch": 0.07899322362052275,
779
+ "grad_norm": 0.2181754857301712,
780
+ "learning_rate": 4.999965222418799e-07,
781
+ "logits/chosen": -18.898908615112305,
782
+ "logits/rejected": -18.870685577392578,
783
+ "logps/chosen": -2.2342917919158936,
784
+ "logps/rejected": -2.2303569316864014,
785
+ "loss": 0.9848,
786
+ "rewards/accuracies": 0.4906249940395355,
787
+ "rewards/chosen": -2.2342917919158936,
788
+ "rewards/margins": -0.003935129847377539,
789
+ "rewards/rejected": -2.2303569316864014,
790
+ "step": 510
791
+ },
792
+ {
793
+ "epoch": 0.08054211035818006,
794
+ "grad_norm": 0.5725811123847961,
795
+ "learning_rate": 4.999860890642776e-07,
796
+ "logits/chosen": -18.56229019165039,
797
+ "logits/rejected": -18.55545425415039,
798
+ "logps/chosen": -2.346125841140747,
799
+ "logps/rejected": -2.311354160308838,
800
+ "loss": 1.0045,
801
+ "rewards/accuracies": 0.4781250059604645,
802
+ "rewards/chosen": -2.346125841140747,
803
+ "rewards/margins": -0.034771550446748734,
804
+ "rewards/rejected": -2.311354160308838,
805
+ "step": 520
806
+ },
807
+ {
808
+ "epoch": 0.08209099709583736,
809
+ "grad_norm": 0.30994686484336853,
810
+ "learning_rate": 4.99968700757466e-07,
811
+ "logits/chosen": -19.20330810546875,
812
+ "logits/rejected": -19.187274932861328,
813
+ "logps/chosen": -2.2387478351593018,
814
+ "logps/rejected": -2.2440736293792725,
815
+ "loss": 0.9784,
816
+ "rewards/accuracies": 0.518750011920929,
817
+ "rewards/chosen": -2.2387478351593018,
818
+ "rewards/margins": 0.0053258733823895454,
819
+ "rewards/rejected": -2.2440736293792725,
820
+ "step": 530
821
+ },
822
+ {
823
+ "epoch": 0.08363988383349467,
824
+ "grad_norm": 0.3730740249156952,
825
+ "learning_rate": 4.999443578052237e-07,
826
+ "logits/chosen": -18.384979248046875,
827
+ "logits/rejected": -18.40036964416504,
828
+ "logps/chosen": -2.3040053844451904,
829
+ "logps/rejected": -2.257456064224243,
830
+ "loss": 1.0101,
831
+ "rewards/accuracies": 0.48750001192092896,
832
+ "rewards/chosen": -2.3040053844451904,
833
+ "rewards/margins": -0.04654966667294502,
834
+ "rewards/rejected": -2.257456064224243,
835
+ "step": 540
836
+ },
837
+ {
838
+ "epoch": 0.08518877057115198,
839
+ "grad_norm": 0.46454644203186035,
840
+ "learning_rate": 4.999130608848216e-07,
841
+ "logits/chosen": -18.277536392211914,
842
+ "logits/rejected": -18.299518585205078,
843
+ "logps/chosen": -2.423797130584717,
844
+ "logps/rejected": -2.3845160007476807,
845
+ "loss": 1.0075,
846
+ "rewards/accuracies": 0.4468750059604645,
847
+ "rewards/chosen": -2.423797130584717,
848
+ "rewards/margins": -0.03928118944168091,
849
+ "rewards/rejected": -2.3845160007476807,
850
+ "step": 550
851
+ },
852
+ {
853
+ "epoch": 0.0867376573088093,
854
+ "grad_norm": 0.32287153601646423,
855
+ "learning_rate": 4.99874810867005e-07,
856
+ "logits/chosen": -18.73580551147461,
857
+ "logits/rejected": -18.721725463867188,
858
+ "logps/chosen": -2.314521074295044,
859
+ "logps/rejected": -2.272265911102295,
860
+ "loss": 1.009,
861
+ "rewards/accuracies": 0.47187501192092896,
862
+ "rewards/chosen": -2.314521074295044,
863
+ "rewards/margins": -0.04225506633520126,
864
+ "rewards/rejected": -2.272265911102295,
865
+ "step": 560
866
+ },
867
+ {
868
+ "epoch": 0.0882865440464666,
869
+ "grad_norm": 1.6651438474655151,
870
+ "learning_rate": 4.998296088159681e-07,
871
+ "logits/chosen": -17.93259620666504,
872
+ "logits/rejected": -17.944868087768555,
873
+ "logps/chosen": -2.3117213249206543,
874
+ "logps/rejected": -2.2710154056549072,
875
+ "loss": 1.0076,
876
+ "rewards/accuracies": 0.46562498807907104,
877
+ "rewards/chosen": -2.3117213249206543,
878
+ "rewards/margins": -0.04070620238780975,
879
+ "rewards/rejected": -2.2710154056549072,
880
+ "step": 570
881
+ },
882
+ {
883
+ "epoch": 0.08983543078412391,
884
+ "grad_norm": 0.5053322911262512,
885
+ "learning_rate": 4.997774559893254e-07,
886
+ "logits/chosen": -18.631776809692383,
887
+ "logits/rejected": -18.645475387573242,
888
+ "logps/chosen": -2.2970833778381348,
889
+ "logps/rejected": -2.272408962249756,
890
+ "loss": 0.9976,
891
+ "rewards/accuracies": 0.5,
892
+ "rewards/chosen": -2.2970833778381348,
893
+ "rewards/margins": -0.024674396961927414,
894
+ "rewards/rejected": -2.272408962249756,
895
+ "step": 580
896
+ },
897
+ {
898
+ "epoch": 0.09138431752178122,
899
+ "grad_norm": 0.43934622406959534,
900
+ "learning_rate": 4.997183538380762e-07,
901
+ "logits/chosen": -18.472524642944336,
902
+ "logits/rejected": -18.480520248413086,
903
+ "logps/chosen": -2.318777084350586,
904
+ "logps/rejected": -2.285399913787842,
905
+ "loss": 1.0041,
906
+ "rewards/accuracies": 0.49687498807907104,
907
+ "rewards/chosen": -2.318777084350586,
908
+ "rewards/margins": -0.03337707743048668,
909
+ "rewards/rejected": -2.285399913787842,
910
+ "step": 590
911
+ },
912
+ {
913
+ "epoch": 0.09293320425943853,
914
+ "grad_norm": 0.3377698063850403,
915
+ "learning_rate": 4.996523040065646e-07,
916
+ "logits/chosen": -19.70441436767578,
917
+ "logits/rejected": -19.69642448425293,
918
+ "logps/chosen": -2.306756019592285,
919
+ "logps/rejected": -2.268965482711792,
920
+ "loss": 1.003,
921
+ "rewards/accuracies": 0.4625000059604645,
922
+ "rewards/chosen": -2.306756019592285,
923
+ "rewards/margins": -0.03779022395610809,
924
+ "rewards/rejected": -2.268965482711792,
925
+ "step": 600
926
+ },
927
+ {
928
+ "epoch": 0.09448209099709584,
929
+ "grad_norm": 0.6424508094787598,
930
+ "learning_rate": 4.995793083324331e-07,
931
+ "logits/chosen": -18.741695404052734,
932
+ "logits/rejected": -18.73526382446289,
933
+ "logps/chosen": -2.355008363723755,
934
+ "logps/rejected": -2.332728385925293,
935
+ "loss": 0.998,
936
+ "rewards/accuracies": 0.49687498807907104,
937
+ "rewards/chosen": -2.355008363723755,
938
+ "rewards/margins": -0.022280065342783928,
939
+ "rewards/rejected": -2.332728385925293,
940
+ "step": 610
941
+ },
942
+ {
943
+ "epoch": 0.09603097773475315,
944
+ "grad_norm": 0.38619065284729004,
945
+ "learning_rate": 4.99499368846572e-07,
946
+ "logits/chosen": -18.917526245117188,
947
+ "logits/rejected": -18.91817855834961,
948
+ "logps/chosen": -2.2462356090545654,
949
+ "logps/rejected": -2.2053093910217285,
950
+ "loss": 1.0079,
951
+ "rewards/accuracies": 0.453125,
952
+ "rewards/chosen": -2.2462356090545654,
953
+ "rewards/margins": -0.040926456451416016,
954
+ "rewards/rejected": -2.2053093910217285,
955
+ "step": 620
956
+ },
957
+ {
958
+ "epoch": 0.09757986447241046,
959
+ "grad_norm": 0.3706226050853729,
960
+ "learning_rate": 4.994124877730631e-07,
961
+ "logits/chosen": -18.773418426513672,
962
+ "logits/rejected": -18.763322830200195,
963
+ "logps/chosen": -2.4011077880859375,
964
+ "logps/rejected": -2.359574794769287,
965
+ "loss": 1.0082,
966
+ "rewards/accuracies": 0.44062501192092896,
967
+ "rewards/chosen": -2.4011077880859375,
968
+ "rewards/margins": -0.04153291881084442,
969
+ "rewards/rejected": -2.359574794769287,
970
+ "step": 630
971
+ },
972
+ {
973
+ "epoch": 0.09912875121006777,
974
+ "grad_norm": 0.43836838006973267,
975
+ "learning_rate": 4.993186675291171e-07,
976
+ "logits/chosen": -18.509357452392578,
977
+ "logits/rejected": -18.474916458129883,
978
+ "logps/chosen": -2.3907310962677,
979
+ "logps/rejected": -2.367095947265625,
980
+ "loss": 0.9976,
981
+ "rewards/accuracies": 0.47187501192092896,
982
+ "rewards/chosen": -2.3907310962677,
983
+ "rewards/margins": -0.02363501489162445,
984
+ "rewards/rejected": -2.367095947265625,
985
+ "step": 640
986
+ },
987
+ {
988
+ "epoch": 0.10067763794772508,
989
+ "grad_norm": 0.3059249520301819,
990
+ "learning_rate": 4.99217910725007e-07,
991
+ "logits/chosen": -19.064586639404297,
992
+ "logits/rejected": -19.049022674560547,
993
+ "logps/chosen": -2.2733535766601562,
994
+ "logps/rejected": -2.2445480823516846,
995
+ "loss": 0.998,
996
+ "rewards/accuracies": 0.43437498807907104,
997
+ "rewards/chosen": -2.2733535766601562,
998
+ "rewards/margins": -0.02880530059337616,
999
+ "rewards/rejected": -2.2445480823516846,
1000
+ "step": 650
1001
+ },
1002
+ {
1003
+ "epoch": 0.10222652468538238,
1004
+ "grad_norm": 0.3304593861103058,
1005
+ "learning_rate": 4.991102201639952e-07,
1006
+ "logits/chosen": -18.48186683654785,
1007
+ "logits/rejected": -18.4903507232666,
1008
+ "logps/chosen": -2.3360755443573,
1009
+ "logps/rejected": -2.2711851596832275,
1010
+ "loss": 1.0228,
1011
+ "rewards/accuracies": 0.42500001192092896,
1012
+ "rewards/chosen": -2.3360755443573,
1013
+ "rewards/margins": -0.06489041447639465,
1014
+ "rewards/rejected": -2.2711851596832275,
1015
+ "step": 660
1016
+ },
1017
+ {
1018
+ "epoch": 0.10377541142303968,
1019
+ "grad_norm": 0.7939794063568115,
1020
+ "learning_rate": 4.989955988422554e-07,
1021
+ "logits/chosen": -18.29547691345215,
1022
+ "logits/rejected": -18.315994262695312,
1023
+ "logps/chosen": -2.3505773544311523,
1024
+ "logps/rejected": -2.299063205718994,
1025
+ "loss": 1.0143,
1026
+ "rewards/accuracies": 0.4000000059604645,
1027
+ "rewards/chosen": -2.3505773544311523,
1028
+ "rewards/margins": -0.051514364778995514,
1029
+ "rewards/rejected": -2.299063205718994,
1030
+ "step": 670
1031
+ },
1032
+ {
1033
+ "epoch": 0.105324298160697,
1034
+ "grad_norm": 0.49692028760910034,
1035
+ "learning_rate": 4.988740499487894e-07,
1036
+ "logits/chosen": -18.303329467773438,
1037
+ "logits/rejected": -18.300994873046875,
1038
+ "logps/chosen": -2.3320651054382324,
1039
+ "logps/rejected": -2.3294270038604736,
1040
+ "loss": 0.9833,
1041
+ "rewards/accuracies": 0.515625,
1042
+ "rewards/chosen": -2.3320651054382324,
1043
+ "rewards/margins": -0.0026380266062915325,
1044
+ "rewards/rejected": -2.3294270038604736,
1045
+ "step": 680
1046
+ },
1047
+ {
1048
+ "epoch": 0.1068731848983543,
1049
+ "grad_norm": 0.5572270154953003,
1050
+ "learning_rate": 4.987455768653385e-07,
1051
+ "logits/chosen": -18.451520919799805,
1052
+ "logits/rejected": -18.48006820678711,
1053
+ "logps/chosen": -2.2999916076660156,
1054
+ "logps/rejected": -2.266467332839966,
1055
+ "loss": 1.0027,
1056
+ "rewards/accuracies": 0.45625001192092896,
1057
+ "rewards/chosen": -2.2999916076660156,
1058
+ "rewards/margins": -0.033524490892887115,
1059
+ "rewards/rejected": -2.266467332839966,
1060
+ "step": 690
1061
+ },
1062
+ {
1063
+ "epoch": 0.10842207163601161,
1064
+ "grad_norm": 0.6092363595962524,
1065
+ "learning_rate": 4.986101831662894e-07,
1066
+ "logits/chosen": -18.52994728088379,
1067
+ "logits/rejected": -18.518688201904297,
1068
+ "logps/chosen": -2.393986940383911,
1069
+ "logps/rejected": -2.3591043949127197,
1070
+ "loss": 1.0029,
1071
+ "rewards/accuracies": 0.45625001192092896,
1072
+ "rewards/chosen": -2.393986940383911,
1073
+ "rewards/margins": -0.03488261625170708,
1074
+ "rewards/rejected": -2.3591043949127197,
1075
+ "step": 700
1076
+ },
1077
+ {
1078
+ "epoch": 0.10997095837366892,
1079
+ "grad_norm": 0.41200894117355347,
1080
+ "learning_rate": 4.984678726185739e-07,
1081
+ "logits/chosen": -18.782907485961914,
1082
+ "logits/rejected": -18.763673782348633,
1083
+ "logps/chosen": -2.326845169067383,
1084
+ "logps/rejected": -2.293470859527588,
1085
+ "loss": 1.003,
1086
+ "rewards/accuracies": 0.48124998807907104,
1087
+ "rewards/chosen": -2.326845169067383,
1088
+ "rewards/margins": -0.03337442874908447,
1089
+ "rewards/rejected": -2.293470859527588,
1090
+ "step": 710
1091
+ },
1092
+ {
1093
+ "epoch": 0.11151984511132623,
1094
+ "grad_norm": 0.34657081961631775,
1095
+ "learning_rate": 4.983186491815656e-07,
1096
+ "logits/chosen": -18.589115142822266,
1097
+ "logits/rejected": -18.618635177612305,
1098
+ "logps/chosen": -2.3275952339172363,
1099
+ "logps/rejected": -2.2913246154785156,
1100
+ "loss": 1.0062,
1101
+ "rewards/accuracies": 0.4906249940395355,
1102
+ "rewards/chosen": -2.3275952339172363,
1103
+ "rewards/margins": -0.03627028688788414,
1104
+ "rewards/rejected": -2.2913246154785156,
1105
+ "step": 720
1106
+ },
1107
+ {
1108
+ "epoch": 0.11306873184898354,
1109
+ "grad_norm": 0.3639511168003082,
1110
+ "learning_rate": 4.981625170069687e-07,
1111
+ "logits/chosen": -18.639911651611328,
1112
+ "logits/rejected": -18.627838134765625,
1113
+ "logps/chosen": -2.3085227012634277,
1114
+ "logps/rejected": -2.260363817214966,
1115
+ "loss": 1.0126,
1116
+ "rewards/accuracies": 0.4312500059604645,
1117
+ "rewards/chosen": -2.3085227012634277,
1118
+ "rewards/margins": -0.04815887287259102,
1119
+ "rewards/rejected": -2.260363817214966,
1120
+ "step": 730
1121
+ },
1122
+ {
1123
+ "epoch": 0.11461761858664085,
1124
+ "grad_norm": 0.34293103218078613,
1125
+ "learning_rate": 4.979994804387025e-07,
1126
+ "logits/chosen": -18.371139526367188,
1127
+ "logits/rejected": -18.37088966369629,
1128
+ "logps/chosen": -2.3088409900665283,
1129
+ "logps/rejected": -2.2924559116363525,
1130
+ "loss": 0.9919,
1131
+ "rewards/accuracies": 0.44999998807907104,
1132
+ "rewards/chosen": -2.3088409900665283,
1133
+ "rewards/margins": -0.016384964808821678,
1134
+ "rewards/rejected": -2.2924559116363525,
1135
+ "step": 740
1136
+ },
1137
+ {
1138
+ "epoch": 0.11616650532429816,
1139
+ "grad_norm": 0.6355793476104736,
1140
+ "learning_rate": 4.978295440127811e-07,
1141
+ "logits/chosen": -17.817703247070312,
1142
+ "logits/rejected": -17.832592010498047,
1143
+ "logps/chosen": -2.421024799346924,
1144
+ "logps/rejected": -2.3687844276428223,
1145
+ "loss": 1.0155,
1146
+ "rewards/accuracies": 0.40625,
1147
+ "rewards/chosen": -2.421024799346924,
1148
+ "rewards/margins": -0.052240293473005295,
1149
+ "rewards/rejected": -2.3687844276428223,
1150
+ "step": 750
1151
+ },
1152
+ {
1153
+ "epoch": 0.11771539206195547,
1154
+ "grad_norm": 0.5107437968254089,
1155
+ "learning_rate": 4.976527124571869e-07,
1156
+ "logits/chosen": -18.38277816772461,
1157
+ "logits/rejected": -18.363195419311523,
1158
+ "logps/chosen": -2.305868148803711,
1159
+ "logps/rejected": -2.2901623249053955,
1160
+ "loss": 0.9922,
1161
+ "rewards/accuracies": 0.46875,
1162
+ "rewards/chosen": -2.305868148803711,
1163
+ "rewards/margins": -0.01570596918463707,
1164
+ "rewards/rejected": -2.2901623249053955,
1165
+ "step": 760
1166
+ },
1167
+ {
1168
+ "epoch": 0.11926427879961278,
1169
+ "grad_norm": 0.40593504905700684,
1170
+ "learning_rate": 4.974689906917388e-07,
1171
+ "logits/chosen": -18.590545654296875,
1172
+ "logits/rejected": -18.588382720947266,
1173
+ "logps/chosen": -2.3169288635253906,
1174
+ "logps/rejected": -2.298309087753296,
1175
+ "loss": 0.9949,
1176
+ "rewards/accuracies": 0.484375,
1177
+ "rewards/chosen": -2.3169288635253906,
1178
+ "rewards/margins": -0.01861979439854622,
1179
+ "rewards/rejected": -2.298309087753296,
1180
+ "step": 770
1181
+ },
1182
+ {
1183
+ "epoch": 0.12081316553727009,
1184
+ "grad_norm": 0.5684529542922974,
1185
+ "learning_rate": 4.972783838279557e-07,
1186
+ "logits/chosen": -18.637237548828125,
1187
+ "logits/rejected": -18.637813568115234,
1188
+ "logps/chosen": -2.3667240142822266,
1189
+ "logps/rejected": -2.322007417678833,
1190
+ "loss": 1.0095,
1191
+ "rewards/accuracies": 0.43437498807907104,
1192
+ "rewards/chosen": -2.3667240142822266,
1193
+ "rewards/margins": -0.04471655189990997,
1194
+ "rewards/rejected": -2.322007417678833,
1195
+ "step": 780
1196
+ },
1197
+ {
1198
+ "epoch": 0.1223620522749274,
1199
+ "grad_norm": 0.3178389370441437,
1200
+ "learning_rate": 4.970808971689142e-07,
1201
+ "logits/chosen": -18.884435653686523,
1202
+ "logits/rejected": -18.88039779663086,
1203
+ "logps/chosen": -2.2492599487304688,
1204
+ "logps/rejected": -2.247825860977173,
1205
+ "loss": 0.9814,
1206
+ "rewards/accuracies": 0.503125011920929,
1207
+ "rewards/chosen": -2.2492599487304688,
1208
+ "rewards/margins": -0.0014338928740471601,
1209
+ "rewards/rejected": -2.247825860977173,
1210
+ "step": 790
1211
+ },
1212
+ {
1213
+ "epoch": 0.12391093901258471,
1214
+ "grad_norm": 0.40471580624580383,
1215
+ "learning_rate": 4.968765362091009e-07,
1216
+ "logits/chosen": -18.403453826904297,
1217
+ "logits/rejected": -18.45069694519043,
1218
+ "logps/chosen": -2.3634753227233887,
1219
+ "logps/rejected": -2.3154542446136475,
1220
+ "loss": 1.012,
1221
+ "rewards/accuracies": 0.40625,
1222
+ "rewards/chosen": -2.3634753227233887,
1223
+ "rewards/margins": -0.048020828515291214,
1224
+ "rewards/rejected": -2.3154542446136475,
1225
+ "step": 800
1226
+ },
1227
+ {
1228
+ "epoch": 0.125459825750242,
1229
+ "grad_norm": 0.41477352380752563,
1230
+ "learning_rate": 4.966653066342596e-07,
1231
+ "logits/chosen": -18.60321617126465,
1232
+ "logits/rejected": -18.62074851989746,
1233
+ "logps/chosen": -2.3101038932800293,
1234
+ "logps/rejected": -2.307743549346924,
1235
+ "loss": 0.9842,
1236
+ "rewards/accuracies": 0.503125011920929,
1237
+ "rewards/chosen": -2.3101038932800293,
1238
+ "rewards/margins": -0.0023605884052813053,
1239
+ "rewards/rejected": -2.307743549346924,
1240
+ "step": 810
1241
+ },
1242
+ {
1243
+ "epoch": 0.12700871248789933,
1244
+ "grad_norm": 0.6773151159286499,
1245
+ "learning_rate": 4.964472143212335e-07,
1246
+ "logits/chosen": -18.46171760559082,
1247
+ "logits/rejected": -18.4774227142334,
1248
+ "logps/chosen": -2.383174180984497,
1249
+ "logps/rejected": -2.3354432582855225,
1250
+ "loss": 1.0118,
1251
+ "rewards/accuracies": 0.4468750059604645,
1252
+ "rewards/chosen": -2.383174180984497,
1253
+ "rewards/margins": -0.04773087054491043,
1254
+ "rewards/rejected": -2.3354432582855225,
1255
+ "step": 820
1256
+ },
1257
+ {
1258
+ "epoch": 0.12855759922555662,
1259
+ "grad_norm": 0.660863995552063,
1260
+ "learning_rate": 4.962222653378009e-07,
1261
+ "logits/chosen": -18.25492286682129,
1262
+ "logits/rejected": -18.2567081451416,
1263
+ "logps/chosen": -2.344148635864258,
1264
+ "logps/rejected": -2.2953619956970215,
1265
+ "loss": 1.0101,
1266
+ "rewards/accuracies": 0.421875,
1267
+ "rewards/chosen": -2.344148635864258,
1268
+ "rewards/margins": -0.04878663271665573,
1269
+ "rewards/rejected": -2.2953619956970215,
1270
+ "step": 830
1271
+ },
1272
+ {
1273
+ "epoch": 0.13010648596321395,
1274
+ "grad_norm": 1.5914808511734009,
1275
+ "learning_rate": 4.959904659425071e-07,
1276
+ "logits/chosen": -18.26708984375,
1277
+ "logits/rejected": -18.278553009033203,
1278
+ "logps/chosen": -2.3088066577911377,
1279
+ "logps/rejected": -2.321422576904297,
1280
+ "loss": 0.9732,
1281
+ "rewards/accuracies": 0.5249999761581421,
1282
+ "rewards/chosen": -2.3088066577911377,
1283
+ "rewards/margins": 0.012615623883903027,
1284
+ "rewards/rejected": -2.321422576904297,
1285
+ "step": 840
1286
+ },
1287
+ {
1288
+ "epoch": 0.13165537270087124,
1289
+ "grad_norm": 0.41132065653800964,
1290
+ "learning_rate": 4.9575182258449e-07,
1291
+ "logits/chosen": -18.42027473449707,
1292
+ "logits/rejected": -18.392074584960938,
1293
+ "logps/chosen": -2.3201727867126465,
1294
+ "logps/rejected": -2.3016159534454346,
1295
+ "loss": 0.9955,
1296
+ "rewards/accuracies": 0.4375,
1297
+ "rewards/chosen": -2.3201727867126465,
1298
+ "rewards/margins": -0.018556838855147362,
1299
+ "rewards/rejected": -2.3016159534454346,
1300
+ "step": 850
1301
+ },
1302
+ {
1303
+ "epoch": 0.13320425943852857,
1304
+ "grad_norm": 0.4330069124698639,
1305
+ "learning_rate": 4.955063419033005e-07,
1306
+ "logits/chosen": -18.98206329345703,
1307
+ "logits/rejected": -18.960163116455078,
1308
+ "logps/chosen": -2.1484034061431885,
1309
+ "logps/rejected": -2.1282362937927246,
1310
+ "loss": 0.995,
1311
+ "rewards/accuracies": 0.4749999940395355,
1312
+ "rewards/chosen": -2.1484034061431885,
1313
+ "rewards/margins": -0.020167285576462746,
1314
+ "rewards/rejected": -2.1282362937927246,
1315
+ "step": 860
1316
+ },
1317
+ {
1318
+ "epoch": 0.13475314617618586,
1319
+ "grad_norm": 0.47932806611061096,
1320
+ "learning_rate": 4.952540307287181e-07,
1321
+ "logits/chosen": -18.520000457763672,
1322
+ "logits/rejected": -18.525272369384766,
1323
+ "logps/chosen": -2.382744789123535,
1324
+ "logps/rejected": -2.3476061820983887,
1325
+ "loss": 1.006,
1326
+ "rewards/accuracies": 0.4625000059604645,
1327
+ "rewards/chosen": -2.382744789123535,
1328
+ "rewards/margins": -0.03513883426785469,
1329
+ "rewards/rejected": -2.3476061820983887,
1330
+ "step": 870
1331
+ },
1332
+ {
1333
+ "epoch": 0.13630203291384319,
1334
+ "grad_norm": 0.3343891501426697,
1335
+ "learning_rate": 4.949948960805607e-07,
1336
+ "logits/chosen": -18.576160430908203,
1337
+ "logits/rejected": -18.56676483154297,
1338
+ "logps/chosen": -2.2828855514526367,
1339
+ "logps/rejected": -2.2378618717193604,
1340
+ "loss": 1.0109,
1341
+ "rewards/accuracies": 0.43437498807907104,
1342
+ "rewards/chosen": -2.2828855514526367,
1343
+ "rewards/margins": -0.045023877173662186,
1344
+ "rewards/rejected": -2.2378618717193604,
1345
+ "step": 880
1346
+ },
1347
+ {
1348
+ "epoch": 0.13785091965150048,
1349
+ "grad_norm": 0.4695475399494171,
1350
+ "learning_rate": 4.947289451684893e-07,
1351
+ "logits/chosen": -18.13470458984375,
1352
+ "logits/rejected": -18.110483169555664,
1353
+ "logps/chosen": -2.32598876953125,
1354
+ "logps/rejected": -2.2929863929748535,
1355
+ "loss": 1.0029,
1356
+ "rewards/accuracies": 0.453125,
1357
+ "rewards/chosen": -2.32598876953125,
1358
+ "rewards/margins": -0.033002231270074844,
1359
+ "rewards/rejected": -2.2929863929748535,
1360
+ "step": 890
1361
+ },
1362
+ {
1363
+ "epoch": 0.1393998063891578,
1364
+ "grad_norm": 0.6540862917900085,
1365
+ "learning_rate": 4.944561853918075e-07,
1366
+ "logits/chosen": -18.052448272705078,
1367
+ "logits/rejected": -18.07194709777832,
1368
+ "logps/chosen": -2.366678476333618,
1369
+ "logps/rejected": -2.3681142330169678,
1370
+ "loss": 0.9828,
1371
+ "rewards/accuracies": 0.48124998807907104,
1372
+ "rewards/chosen": -2.366678476333618,
1373
+ "rewards/margins": 0.0014358337502926588,
1374
+ "rewards/rejected": -2.3681142330169678,
1375
+ "step": 900
1376
+ },
1377
+ {
1378
+ "epoch": 0.1409486931268151,
1379
+ "grad_norm": 0.5994710326194763,
1380
+ "learning_rate": 4.941766243392554e-07,
1381
+ "logits/chosen": -18.121225357055664,
1382
+ "logits/rejected": -18.11789894104004,
1383
+ "logps/chosen": -2.383657932281494,
1384
+ "logps/rejected": -2.3472721576690674,
1385
+ "loss": 1.0044,
1386
+ "rewards/accuracies": 0.4749999940395355,
1387
+ "rewards/chosen": -2.383657932281494,
1388
+ "rewards/margins": -0.0363856740295887,
1389
+ "rewards/rejected": -2.3472721576690674,
1390
+ "step": 910
1391
+ },
1392
+ {
1393
+ "epoch": 0.14249757986447242,
1394
+ "grad_norm": 0.32324278354644775,
1395
+ "learning_rate": 4.938902697887989e-07,
1396
+ "logits/chosen": -18.380367279052734,
1397
+ "logits/rejected": -18.38044548034668,
1398
+ "logps/chosen": -2.3732495307922363,
1399
+ "logps/rejected": -2.344003438949585,
1400
+ "loss": 1.0015,
1401
+ "rewards/accuracies": 0.49687498807907104,
1402
+ "rewards/chosen": -2.3732495307922363,
1403
+ "rewards/margins": -0.029246056452393532,
1404
+ "rewards/rejected": -2.344003438949585,
1405
+ "step": 920
1406
+ },
1407
+ {
1408
+ "epoch": 0.14404646660212972,
1409
+ "grad_norm": 0.589393138885498,
1410
+ "learning_rate": 4.935971297074129e-07,
1411
+ "logits/chosen": -18.150585174560547,
1412
+ "logits/rejected": -18.17327308654785,
1413
+ "logps/chosen": -2.4686131477355957,
1414
+ "logps/rejected": -2.4409797191619873,
1415
+ "loss": 1.0025,
1416
+ "rewards/accuracies": 0.47187501192092896,
1417
+ "rewards/chosen": -2.4686131477355957,
1418
+ "rewards/margins": -0.027633434161543846,
1419
+ "rewards/rejected": -2.4409797191619873,
1420
+ "step": 930
1421
+ },
1422
+ {
1423
+ "epoch": 0.14559535333978701,
1424
+ "grad_norm": 0.2813025414943695,
1425
+ "learning_rate": 4.932972122508597e-07,
1426
+ "logits/chosen": -18.31452751159668,
1427
+ "logits/rejected": -18.32306480407715,
1428
+ "logps/chosen": -2.3727595806121826,
1429
+ "logps/rejected": -2.341411828994751,
1430
+ "loss": 1.0017,
1431
+ "rewards/accuracies": 0.45625001192092896,
1432
+ "rewards/chosen": -2.3727595806121826,
1433
+ "rewards/margins": -0.03134778141975403,
1434
+ "rewards/rejected": -2.341411828994751,
1435
+ "step": 940
1436
+ },
1437
+ {
1438
+ "epoch": 0.14714424007744434,
1439
+ "grad_norm": 0.3975474238395691,
1440
+ "learning_rate": 4.929905257634623e-07,
1441
+ "logits/chosen": -18.719608306884766,
1442
+ "logits/rejected": -18.705446243286133,
1443
+ "logps/chosen": -2.329033136367798,
1444
+ "logps/rejected": -2.3131375312805176,
1445
+ "loss": 0.9922,
1446
+ "rewards/accuracies": 0.4781250059604645,
1447
+ "rewards/chosen": -2.329033136367798,
1448
+ "rewards/margins": -0.015895655378699303,
1449
+ "rewards/rejected": -2.3131375312805176,
1450
+ "step": 950
1451
+ },
1452
+ {
1453
+ "epoch": 0.14869312681510163,
1454
+ "grad_norm": 0.4753539562225342,
1455
+ "learning_rate": 4.92677078777872e-07,
1456
+ "logits/chosen": -18.72245979309082,
1457
+ "logits/rejected": -18.711280822753906,
1458
+ "logps/chosen": -2.2939043045043945,
1459
+ "logps/rejected": -2.254490375518799,
1460
+ "loss": 1.0069,
1461
+ "rewards/accuracies": 0.4468750059604645,
1462
+ "rewards/chosen": -2.2939043045043945,
1463
+ "rewards/margins": -0.03941388428211212,
1464
+ "rewards/rejected": -2.254490375518799,
1465
+ "step": 960
1466
+ },
1467
+ {
1468
+ "epoch": 0.15024201355275896,
1469
+ "grad_norm": 0.6007782816886902,
1470
+ "learning_rate": 4.923568800148313e-07,
1471
+ "logits/chosen": -18.046024322509766,
1472
+ "logits/rejected": -18.04398536682129,
1473
+ "logps/chosen": -2.3614583015441895,
1474
+ "logps/rejected": -2.347895622253418,
1475
+ "loss": 0.9904,
1476
+ "rewards/accuracies": 0.4625000059604645,
1477
+ "rewards/chosen": -2.3614583015441895,
1478
+ "rewards/margins": -0.01356283575296402,
1479
+ "rewards/rejected": -2.347895622253418,
1480
+ "step": 970
1481
+ },
1482
+ {
1483
+ "epoch": 0.15179090029041625,
1484
+ "grad_norm": 0.49476128816604614,
1485
+ "learning_rate": 4.920299383829311e-07,
1486
+ "logits/chosen": -18.267019271850586,
1487
+ "logits/rejected": -18.278667449951172,
1488
+ "logps/chosen": -2.325904130935669,
1489
+ "logps/rejected": -2.292196750640869,
1490
+ "loss": 1.0039,
1491
+ "rewards/accuracies": 0.42500001192092896,
1492
+ "rewards/chosen": -2.325904130935669,
1493
+ "rewards/margins": -0.033707328140735626,
1494
+ "rewards/rejected": -2.292196750640869,
1495
+ "step": 980
1496
+ },
1497
+ {
1498
+ "epoch": 0.15333978702807358,
1499
+ "grad_norm": 0.32536447048187256,
1500
+ "learning_rate": 4.916962629783624e-07,
1501
+ "logits/chosen": -18.22359848022461,
1502
+ "logits/rejected": -18.20920181274414,
1503
+ "logps/chosen": -2.2321956157684326,
1504
+ "logps/rejected": -2.203826665878296,
1505
+ "loss": 0.9983,
1506
+ "rewards/accuracies": 0.44999998807907104,
1507
+ "rewards/chosen": -2.2321956157684326,
1508
+ "rewards/margins": -0.028368810191750526,
1509
+ "rewards/rejected": -2.203826665878296,
1510
+ "step": 990
1511
+ },
1512
+ {
1513
+ "epoch": 0.15488867376573087,
1514
+ "grad_norm": 0.544074535369873,
1515
+ "learning_rate": 4.913558630846644e-07,
1516
+ "logits/chosen": -18.59404754638672,
1517
+ "logits/rejected": -18.594675064086914,
1518
+ "logps/chosen": -2.4262187480926514,
1519
+ "logps/rejected": -2.3849034309387207,
1520
+ "loss": 1.0097,
1521
+ "rewards/accuracies": 0.44062501192092896,
1522
+ "rewards/chosen": -2.4262187480926514,
1523
+ "rewards/margins": -0.04131529480218887,
1524
+ "rewards/rejected": -2.3849034309387207,
1525
+ "step": 1000
1526
+ },
1527
+ {
1528
+ "epoch": 0.15488867376573087,
1529
+ "eval_logits/chosen": -18.314577102661133,
1530
+ "eval_logits/rejected": -18.323183059692383,
1531
+ "eval_logps/chosen": -2.329684019088745,
1532
+ "eval_logps/rejected": -2.299166679382324,
1533
+ "eval_loss": 1.001192331314087,
1534
+ "eval_rewards/accuracies": 0.4621647596359253,
1535
+ "eval_rewards/chosen": -2.329684019088745,
1536
+ "eval_rewards/margins": -0.030517518520355225,
1537
+ "eval_rewards/rejected": -2.299166679382324,
1538
+ "eval_runtime": 202.5434,
1539
+ "eval_samples_per_second": 10.304,
1540
+ "eval_steps_per_second": 5.154,
1541
+ "step": 1000
1542
+ }
1543
+ ],
1544
+ "logging_steps": 10,
1545
+ "max_steps": 6456,
1546
+ "num_input_tokens_seen": 0,
1547
+ "num_train_epochs": 1,
1548
+ "save_steps": 500,
1549
+ "stateful_callbacks": {
1550
+ "TrainerControl": {
1551
+ "args": {
1552
+ "should_epoch_stop": false,
1553
+ "should_evaluate": false,
1554
+ "should_log": false,
1555
+ "should_save": true,
1556
+ "should_training_stop": false
1557
+ },
1558
+ "attributes": {}
1559
+ }
1560
+ },
1561
+ "total_flos": 5.4976119008722944e+17,
1562
+ "train_batch_size": 2,
1563
+ "trial_name": null,
1564
+ "trial_params": null
1565
+ }
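The `log_history` in the trainer_state.json above records preference-optimization metrics (loss, reward margins, accuracies, chosen/rejected log-probabilities) every 10 steps, plus evaluation entries at steps 500 and 1000. As a minimal sketch (not part of the commit), the file can be inspected with the standard library; the local path is an assumption.

```python
# Sketch only: read the checkpoint's trainer state and print the training curve.
import json

with open("checkpoint-1000/trainer_state.json") as f:
    state = json.load(f)

# Training entries carry "loss"; the evaluation entries carry "eval_loss" instead.
train_logs = [e for e in state["log_history"] if "loss" in e]

for entry in train_logs:
    print(
        f"step {entry['step']:>4}  "
        f"loss {entry['loss']:.4f}  "
        f"reward margin {entry['rewards/margins']:+.4f}  "
        f"accuracy {entry['rewards/accuracies']:.3f}"
    )
```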
checkpoint-1000/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:0c00fd8e0c0bb419bf14e06c220df8569292c63b6102412e14828236e7cad753
3
+ size 5432
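training_args.bin is stored through Git LFS, so the diff above only shows the pointer (oid and a size of 5432 bytes), not the arguments themselves. As a minimal sketch (not part of the commit), once the real file has been fetched (e.g. with `git lfs pull` or `huggingface_hub`), it can be unpickled with `torch.load` to inspect the saved `TrainingArguments`; the path and the printed attributes are illustrative assumptions.

```python
# Sketch only: inspect the pickled TrainingArguments saved with this checkpoint.
import torch

args = torch.load(
    "checkpoint-1000/training_args.bin",
    weights_only=False,  # needed on recent torch versions to unpickle a full object
)
print(args.learning_rate, args.per_device_train_batch_size, args.num_train_epochs)
```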