---
library_name: transformers
tags:
- trl
- dpo
- alignment-handbook
- generated_from_trainer
model-index:
- name: OpenELM-1_1B-DPO-full-least-similar
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# OpenELM-1_1B-DPO-full-least-similar

This model appears to be a DPO fine-tune of an OpenELM-1.1B checkpoint (per its name and tags); the base checkpoint and training dataset are not recorded in this card.
It achieves the following results on the evaluation set (the reward columns are explained after the list):
- Loss: 1.0148
- Rewards/chosen: -3.4375
- Rewards/rejected: -3.625
- Rewards/accuracies: 0.4844
- Rewards/margins: 0.1973
- Logps/rejected: -652.0
- Logps/chosen: -660.0
- Logits/rejected: -11.75
- Logits/chosen: -12.1875
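
For context, these columns follow TRL's standard DPO logging conventions (an assumption; the card does not spell them out). For a prompt `x` with chosen completion `y_w` and rejected completion `y_l`, the implicit reward is the beta-scaled log-probability ratio between the policy and the frozen reference model:

```latex
% Implicit DPO reward; beta is the DPO temperature. Its value is not
% recorded in this card (TRL's default is 0.1).
r_\theta(x, y) = \beta \log \frac{\pi_\theta(y \mid x)}{\pi_{\mathrm{ref}}(y \mid x)}

% DPO objective over chosen (y_w) / rejected (y_l) pairs:
\mathcal{L}_{\mathrm{DPO}} = -\,\mathbb{E}\left[\log \sigma\left(r_\theta(x, y_w) - r_\theta(x, y_l)\right)\right]
```

Under these conventions, `Rewards/margins` is `Rewards/chosen` minus `Rewards/rejected`, and `Rewards/accuracies` is the fraction of evaluation pairs whose chosen completion receives the higher implicit reward.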

## Model description

More information needed

## Intended uses & limitations

More information needed
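
In the absence of guidance from the authors, the sketch below shows one plausible way to load the model for inference. The repo path is a placeholder, and `trust_remote_code=True` is assumed because OpenELM checkpoints ship custom modeling code.

```python
# Minimal inference sketch (assumptions flagged inline; not an official example).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "OpenELM-1_1B-DPO-full-least-similar"  # placeholder: substitute the real hub path

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,  # assumed: OpenELM ships custom modeling code
)

prompt = "Explain direct preference optimization in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```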

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training; a hedged TRL sketch reconstructing them appears after the list:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
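
The sketch below reconstructs this configuration with TRL's `DPOTrainer`. The base checkpoint, preference dataset, and DPO `beta` are not recorded in this card, so every value marked as a placeholder is an assumption, not the authors' choice.

```python
# Hedged reconstruction of the hyperparameters above with TRL's DPOTrainer
# (TRL ~0.11, matching the Transformers 4.44 era pinned below). Placeholders
# are assumptions; the card does not record the base model, dataset, or beta.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base = "apple/OpenELM-1_1B"  # placeholder: the actual base checkpoint is not recorded
model = AutoModelForCausalLM.from_pretrained(base, trust_remote_code=True)
# OpenELM ships no tokenizer of its own; it reuses the (gated) Llama-2 tokenizer.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

# Placeholder preference dataset; a real run needs prompt/chosen/rejected text columns.
dataset = load_dataset("HuggingFaceH4/ultrafeedback_binarized", split="train_prefs")

args = DPOConfig(
    output_dir="OpenELM-1_1B-DPO-full-least-similar",
    learning_rate=5e-5,
    per_device_train_batch_size=8,   # x 4 GPUs x 2 grad-accum steps = 64 effective
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,
    num_train_epochs=3,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=42,
    beta=0.1,                        # TRL default; the actual beta is not recorded
)

trainer = DPOTrainer(model, args=args, train_dataset=dataset, tokenizer=tokenizer)
trainer.train()
```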

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:------:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 0.1861        | 0.1047 | 100  | 0.6770          | -0.4492        | -0.5586          | 0.5410             | 0.1079          | -344.0         | -364.0       | -14.625         | -14.6875      |
| 0.1137        | 0.2094 | 200  | 0.7199          | -0.8047        | -0.8945          | 0.5156             | 0.0869          | -378.0         | -398.0       | -11.5625        | -11.75        |
| 0.1386        | 0.3141 | 300  | 0.7895          | -1.8984        | -1.9609          | 0.4707             | 0.0583          | -484.0         | -508.0       | -14.4375        | -14.5625      |
| 0.1441        | 0.4188 | 400  | 0.7394          | -1.3359        | -1.3672          | 0.5156             | 0.0354          | -426.0         | -452.0       | -13.4375        | -13.875       |
| 0.1448        | 0.5236 | 500  | 0.8145          | -1.3906        | -1.4375          | 0.4902             | 0.0432          | -432.0         | -458.0       | -17.0           | -17.125       |
| 0.1399        | 0.6283 | 600  | 0.8182          | -2.1719        | -2.1875          | 0.4863             | 0.0214          | -508.0         | -536.0       | -8.125          | -9.125        |
| 0.1254        | 0.7330 | 700  | 0.8278          | -1.7422        | -1.8281          | 0.5039             | 0.0825          | -472.0         | -492.0       | -13.5           | -13.875       |
| 0.1165        | 0.8377 | 800  | 0.8810          | -1.7266        | -1.6953          | 0.4727             | -0.0306         | -458.0         | -492.0       | -14.125         | -14.375       |
| 0.1534        | 0.9424 | 900  | 0.8332          | -2.125         | -2.2031          | 0.4863             | 0.0776          | -510.0         | -532.0       | -11.75          | -12.375       |
| 0.0209        | 1.0471 | 1000 | 0.8379          | -2.0469        | -2.1406          | 0.4785             | 0.1011          | -504.0         | -524.0       | -14.1875        | -14.5625      |
| 0.0342        | 1.1518 | 1100 | 0.8447          | -2.4219        | -2.5469          | 0.5059             | 0.1318          | -544.0         | -560.0       | -14.625         | -15.0625      |
| 0.0166        | 1.2565 | 1200 | 0.8359          | -2.6562        | -2.75            | 0.5020             | 0.0996          | -564.0         | -584.0       | -12.8125        | -13.1875      |
| 0.0195        | 1.3613 | 1300 | 0.8762          | -2.5312        | -2.625           | 0.5039             | 0.0854          | -552.0         | -572.0       | -14.4375        | -14.75        |
| 0.0187        | 1.4660 | 1400 | 0.8860          | -2.4531        | -2.5156          | 0.5039             | 0.0684          | -540.0         | -564.0       | -15.125         | -15.25        |
| 0.0346        | 1.5707 | 1500 | 0.8857          | -2.7031        | -2.8125          | 0.5000             | 0.1074          | -572.0         | -588.0       | -13.0625        | -13.4375      |
| 0.0160        | 1.6754 | 1600 | 0.9007          | -2.9531        | -3.0312          | 0.4941             | 0.0728          | -592.0         | -612.0       | -12.25          | -12.625       |
| 0.0277        | 1.7801 | 1700 | 0.9100          | -2.8281        | -2.8906          | 0.4980             | 0.0571          | -576.0         | -600.0       | -12.875         | -13.125       |
| 0.0183        | 1.8848 | 1800 | 0.8937          | -2.9219        | -3.0156          | 0.4922             | 0.0991          | -592.0         | -612.0       | -11.4375        | -11.75        |
| 0.0098        | 1.9895 | 1900 | 0.8843          | -2.8125        | -2.9375          | 0.4844             | 0.1318          | -584.0         | -600.0       | -11.5           | -11.9375      |
| 0.0016        | 2.0942 | 2000 | 0.9360          | -2.9688        | -3.125           | 0.4941             | 0.1543          | -600.0         | -616.0       | -11.75          | -12.125       |
| 0.0015        | 2.1990 | 2100 | 0.9743          | -3.1875        | -3.3594          | 0.4824             | 0.1758          | -624.0         | -636.0       | -11.875         | -12.3125      |
| 0.0006        | 2.3037 | 2200 | 0.9987          | -3.3906        | -3.5781          | 0.4980             | 0.1953          | -648.0         | -656.0       | -11.75          | -12.125       |
| 0.0014        | 2.4084 | 2300 | 1.0158          | -3.4688        | -3.6719          | 0.4902             | 0.2031          | -656.0         | -664.0       | -11.75          | -12.1875      |
| 0.0019        | 2.5131 | 2400 | 1.0199          | -3.4844        | -3.6875          | 0.4863             | 0.2051          | -656.0         | -668.0       | -11.75          | -12.1875      |
| 0.0010        | 2.6178 | 2500 | 1.0131          | -3.4531        | -3.6406          | 0.4902             | 0.1973          | -652.0         | -664.0       | -11.75          | -12.1875      |
| 0.0012        | 2.7225 | 2600 | 1.0130          | -3.4375        | -3.625           | 0.4941             | 0.1953          | -652.0         | -660.0       | -11.75          | -12.1875      |
| 0.0009        | 2.8272 | 2700 | 1.0145          | -3.4375        | -3.6406          | 0.4844             | 0.1973          | -652.0         | -660.0       | -11.8125        | -12.1875      |
| 0.0011        | 2.9319 | 2800 | 1.0148          | -3.4375        | -3.625           | 0.4844             | 0.1973          | -652.0         | -660.0       | -11.75          | -12.1875      |
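
Note that the logged values appear to be rounded to a coarse (bf16-style) grid, with each metric averaged before rounding, so `Rewards/margins` need not exactly equal the difference of the rounded `Rewards/chosen` and `Rewards/rejected` columns. A quick check on the final row:

```python
# Final eval row: the margin is averaged before rounding, so it differs
# slightly from the difference of the independently rounded columns.
chosen, rejected, margin = -3.4375, -3.625, 0.1973
print(chosen - rejected)  # 0.1875 -- close to, but not exactly, the logged 0.1973
```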


### Framework versions

- Transformers 4.44.2
- Pytorch 2.3.0
- Datasets 3.0.0
- Tokenizers 0.19.1