---
library_name: transformers
license: apache-2.0
base_model: tsavage68/IE_M2_1000steps_1e7rate_SFT
tags:
- trl
- dpo
- generated_from_trainer
model-index:
- name: IE_M2_1000steps_1e6rate_05beta_cSFTDPO
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# IE_M2_1000steps_1e6rate_05beta_cSFTDPO

This model is a fine-tuned version of [tsavage68/IE_M2_1000steps_1e7rate_SFT](https://huggingface.co/tsavage68/IE_M2_1000steps_1e7rate_SFT) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3743
- Rewards/chosen: -0.6213
- Rewards/rejected: -9.0827
- Rewards/accuracies: 0.4600
- Rewards/margins: 8.4614
- Logps/rejected: -59.1872
- Logps/chosen: -43.4481
- Logits/rejected: -2.8827
- Logits/chosen: -2.8204
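
For readers unfamiliar with these metrics: in DPO, `Rewards/chosen` and `Rewards/rejected` are the implicit rewards β·(log π<sub>θ</sub>(y|x) − log π<sub>ref</sub>(y|x)) of the chosen and rejected completions, and `Rewards/margins` is their difference. A minimal sketch of how these quantities relate, assuming β = 0.5 (read from "05beta" in the model name; not stated elsewhere in the card):

```python
import torch
import torch.nn.functional as F

# Hedged sketch of the DPO metrics reported above (not the training code).
beta = 0.5  # assumed from "05beta" in the model name

def dpo_metrics(policy_logps_chosen, policy_logps_rejected,
                ref_logps_chosen, ref_logps_rejected):
    """Compute DPO implicit rewards, margin, and loss from summed
    sequence log-probabilities under the policy and reference models."""
    rewards_chosen = beta * (policy_logps_chosen - ref_logps_chosen)
    rewards_rejected = beta * (policy_logps_rejected - ref_logps_rejected)
    margins = rewards_chosen - rewards_rejected
    loss = -F.logsigmoid(margins).mean()
    return rewards_chosen, rewards_rejected, margins, loss
```

On this reading, the final margin checks out against the card's numbers: Rewards/chosen − Rewards/rejected = −0.6213 − (−9.0827) = 8.4614.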

## Model description

More information needed

## Intended uses & limitations

More information needed
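
Pending details from the authors, a minimal inference sketch using the standard `transformers` text-generation API. The model id is taken from this card's header; the prompt is a placeholder, since the expected prompt format is not documented here:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tsavage68/IE_M2_1000steps_1e6rate_05beta_cSFTDPO"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "..."  # placeholder; the expected prompt format is not documented
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```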

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 2
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- training_steps: 1000
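
A hedged sketch of how these hyperparameters map onto TRL's `DPOTrainer`, for TRL versions contemporary with the Transformers 4.44 release (newer TRL renames the `tokenizer` argument to `processing_class`). Only the values listed above come from the card; the dataset name and β = 0.5 are assumptions:

```python
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

base = "tsavage68/IE_M2_1000steps_1e7rate_SFT"  # SFT base named in this card
model = AutoModelForCausalLM.from_pretrained(base)
tokenizer = AutoTokenizer.from_pretrained(base)

# Preference dataset with "prompt"/"chosen"/"rejected" columns -- hypothetical,
# since the card does not name the training data.
train_dataset = load_dataset("your/preference-dataset", split="train")

config = DPOConfig(
    output_dir="IE_M2_1000steps_1e6rate_05beta_cSFTDPO",
    beta=0.5,                       # assumed from "05beta" in the model name
    learning_rate=1e-6,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=2,  # total train batch size 4, as listed above
    lr_scheduler_type="cosine",
    warmup_steps=100,
    max_steps=1000,
    seed=42,
)

trainer = DPOTrainer(
    model=model,          # ref_model defaults to a frozen copy of the policy
    args=config,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
)
trainer.train()
```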

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen |
|:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:|
| 0.4505        | 0.4   | 50   | 0.3743          | -0.5495        | -8.1396          | 0.4600             | 7.5901          | -57.3010       | -43.3045     | -2.8855         | -2.8234       |
| 0.3812        | 0.8   | 100  | 0.3743          | -0.5954        | -8.5105          | 0.4600             | 7.9152          | -58.0429       | -43.3963     | -2.8832         | -2.8209       |
| 0.3119        | 1.2   | 150  | 0.3743          | -0.6132        | -8.8178          | 0.4600             | 8.2046          | -58.6574       | -43.4318     | -2.8827         | -2.8203       |
| 0.3639        | 1.6   | 200  | 0.3743          | -0.6125        | -8.8526          | 0.4600             | 8.2401          | -58.7270       | -43.4304     | -2.8828         | -2.8204       |
| 0.4332        | 2.0   | 250  | 0.3743          | -0.6117        | -8.9373          | 0.4600             | 8.3256          | -58.8965       | -43.4290     | -2.8831         | -2.8207       |
| 0.3986        | 2.4   | 300  | 0.3743          | -0.6002        | -8.9298          | 0.4600             | 8.3296          | -58.8814       | -43.4059     | -2.8831         | -2.8207       |
| 0.3986        | 2.8   | 350  | 0.3743          | -0.6140        | -8.9769          | 0.4600             | 8.3630          | -58.9757       | -43.4335     | -2.8828         | -2.8204       |
| 0.4505        | 3.2   | 400  | 0.3743          | -0.6256        | -8.9903          | 0.4600             | 8.3647          | -59.0024       | -43.4567     | -2.8829         | -2.8205       |
| 0.4505        | 3.6   | 450  | 0.3743          | -0.6066        | -8.9960          | 0.4600             | 8.3894          | -59.0138       | -43.4187     | -2.8831         | -2.8207       |
| 0.4332        | 4.0   | 500  | 0.3743          | -0.6183        | -9.0594          | 0.4600             | 8.4410          | -59.1406       | -43.4422     | -2.8830         | -2.8206       |
| 0.3292        | 4.4   | 550  | 0.3743          | -0.6163        | -9.0734          | 0.4600             | 8.4571          | -59.1686       | -43.4381     | -2.8830         | -2.8206       |
| 0.3639        | 4.8   | 600  | 0.3743          | -0.6171        | -9.0601          | 0.4600             | 8.4430          | -59.1421       | -43.4397     | -2.8828         | -2.8204       |
| 0.4505        | 5.2   | 650  | 0.3743          | -0.6207        | -9.0642          | 0.4600             | 8.4435          | -59.1503       | -43.4470     | -2.8830         | -2.8207       |
| 0.4505        | 5.6   | 700  | 0.3743          | -0.6061        | -9.0651          | 0.4600             | 8.4589          | -59.1519       | -43.4178     | -2.8831         | -2.8206       |
| 0.3639        | 6.0   | 750  | 0.3743          | -0.6217        | -9.0731          | 0.4600             | 8.4514          | -59.1681       | -43.4490     | -2.8829         | -2.8206       |
| 0.2426        | 6.4   | 800  | 0.3743          | -0.6241        | -9.0805          | 0.4600             | 8.4564          | -59.1827       | -43.4537     | -2.8829         | -2.8205       |
| 0.5025        | 6.8   | 850  | 0.3743          | -0.6248        | -9.0702          | 0.4600             | 8.4454          | -59.1623       | -43.4552     | -2.8827         | -2.8204       |
| 0.3119        | 7.2   | 900  | 0.3743          | -0.6258        | -9.0760          | 0.4600             | 8.4502          | -59.1739       | -43.4571     | -2.8826         | -2.8203       |
| 0.3466        | 7.6   | 950  | 0.3743          | -0.6208        | -9.0821          | 0.4600             | 8.4613          | -59.1860       | -43.4471     | -2.8827         | -2.8204       |
| 0.3812        | 8.0   | 1000 | 0.3743          | -0.6213        | -9.0827          | 0.4600             | 8.4614          | -59.1872       | -43.4481     | -2.8827         | -2.8204       |


### Framework versions

- Transformers 4.44.2
- Pytorch 2.0.0+cu117
- Datasets 3.0.0
- Tokenizers 0.19.1