# OpenELM-1_1B-KTO
This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:
- Kl: 0.0
- Logits/chosen: -1428926080.0
- Logits/rejected: -1154165888.0
- Logps/chosen: -548.5443
- Logps/rejected: -840.7200
- Loss: 0.4491
- Rewards/chosen: -2.1686
- Rewards/margins: 3.2859
- Rewards/rejected: -5.4545
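
For reference, Rewards/margins is the gap between the chosen and rejected rewards: -2.1686 - (-5.4545) = 3.2859.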
## Model description
More information needed
## Intended uses & limitations
More information needed
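
No usage guidance is provided in this card, so the snippet below is only a minimal inference sketch: the repository id is a placeholder, the precision and generation settings are assumptions, and `trust_remote_code=True` is needed because OpenELM checkpoints ship custom modeling code.

```python
# Minimal inference sketch (assumptions: repo id, dtype, generation settings).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "OpenELM-1_1B-KTO"  # placeholder; replace with the actual Hub path

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,   # assumed precision
    trust_remote_code=True,       # OpenELM uses custom modeling code
)

prompt = "Explain KTO fine-tuning in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```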
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
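
The total train batch size of 64 follows from 8 per device x 4 GPUs x 2 gradient-accumulation steps. The card does not state the training framework; as a hedged illustration only, the hyperparameters above map onto TRL's `KTOConfig`/`KTOTrainer` roughly as sketched below. The base checkpoint, tokenizer, dataset name, and precision are assumptions, not facts from this card.

```python
# Hedged sketch of a KTO training setup matching the hyperparameters above.
# Assumptions: TRL's KTOTrainer was used, the base model is apple/OpenELM-1_1B,
# and the Llama-2 tokenizer (OpenELM's documented default) was paired with it.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import KTOConfig, KTOTrainer

base_model = "apple/OpenELM-1_1B"            # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")  # assumed tokenizer
model = AutoModelForCausalLM.from_pretrained(base_model, trust_remote_code=True)

# Effective train batch size: 8 per device x 4 GPUs x 2 accumulation steps = 64.
args = KTOConfig(
    output_dir="OpenELM-1_1B-KTO",
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,
    learning_rate=5e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=3,
    seed=42,
    bf16=True,                               # assumption; precision is not stated in the card
)

# Placeholder dataset; KTO expects "prompt", "completion", and boolean "label" columns.
train_dataset = load_dataset("your/kto-preference-dataset", split="train")

trainer = KTOTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    tokenizer=tokenizer,   # older TRL versions; newer ones use processing_class=
)
trainer.train()
```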
### Training results
Training Loss | Epoch | Step | Kl | Logits/chosen | Logits/rejected | Logps/chosen | Logps/rejected | Validation Loss | Rewards/chosen | Rewards/margins | Rewards/rejected |
---|---|---|---|---|---|---|---|---|---|---|---|
0.4859 | 0.0523 | 100 | 0.0 | -3175687424.0 | -3128284928.0 | -393.0411 | -370.7767 | 0.4855 | -0.6136 | 0.1415 | -0.7551 |
0.4556 | 0.1047 | 200 | 0.0 | -3518155520.0 | -3502177280.0 | -452.2593 | -491.0910 | 0.4658 | -1.2023 | 0.7579 | -1.9602 |
0.4658 | 0.1570 | 300 | 0.0 | -3454242304.0 | -3428943360.0 | -416.2217 | -439.7414 | 0.4630 | -0.8430 | 0.6032 | -1.4462 |
0.4543 | 0.2094 | 400 | 0.0 | -3808428032.0 | -3728536576.0 | -562.5413 | -589.5574 | 0.4728 | -2.3096 | 0.6357 | -2.9453 |
0.4445 | 0.2617 | 500 | 0.0 | -3128418048.0 | -2977955840.0 | -534.3256 | -585.3565 | 0.4657 | -2.0271 | 0.8745 | -2.9015 |
0.4654 | 0.3141 | 600 | 0.0 | -3596449280.0 | -3444389120.0 | -529.1321 | -582.2695 | 0.4658 | -1.9723 | 0.8963 | -2.8686 |
0.4517 | 0.3664 | 700 | 0.0 | -3631068928.0 | -3583666688.0 | -452.7501 | -492.3958 | 0.4690 | -1.2099 | 0.7624 | -1.9723 |
0.4701 | 0.4187 | 800 | 0.0 | -3269426688.0 | -3207910144.0 | -550.3810 | -656.9627 | 0.4595 | -2.1877 | 1.4299 | -3.6176 |
0.4711 | 0.4711 | 900 | 0.0 | -3022695168.0 | -3008713984.0 | -592.4354 | -758.0388 | 0.4626 | -2.6066 | 2.0224 | -4.6290 |
0.4534 | 0.5234 | 1000 | 0.0 | -2621240320.0 | -2359728640.0 | -548.9560 | -656.3580 | 0.4594 | -2.1704 | 1.4414 | -3.6118 |
0.4428 | 0.5758 | 1100 | 0.0 | -2962243840.0 | -2838944512.0 | -587.9386 | -759.0254 | 0.4583 | -2.5622 | 2.0794 | -4.6416 |
0.4619 | 0.6281 | 1200 | 0.0 | -2887944704.0 | -2875961088.0 | -570.1098 | -742.3173 | 0.4549 | -2.3843 | 2.0891 | -4.4734 |
0.4627 | 0.6805 | 1300 | 0.0 | -2942404096.0 | -2700332800.0 | -705.3934 | -1037.3665 | 0.4565 | -3.7379 | 3.6828 | -7.4207 |
0.4622 | 0.7328 | 1400 | 0.0 | -3010711296.0 | -2812979968.0 | -783.2637 | -1089.1775 | 0.4620 | -4.5181 | 3.4185 | -7.9366 |
0.4571 | 0.7851 | 1500 | 0.0 | -2759985152.0 | -2638283776.0 | -567.8931 | -773.6330 | 0.4572 | -2.3622 | 2.4227 | -4.7849 |
0.4714 | 0.8375 | 1600 | 0.0 | -2749732352.0 | -2697137152.0 | -526.0604 | -662.9458 | 0.4618 | -1.9416 | 1.7369 | -3.6785 |
0.4266 | 0.8898 | 1700 | 0.0 | -1904080896.0 | -1675191680.0 | -554.2761 | -683.2819 | 0.4548 | -2.2239 | 1.6573 | -3.8812 |
0.449 | 0.9422 | 1800 | 0.0 | -2063198080.0 | -1781980032.0 | -630.7531 | -780.4754 | 0.4612 | -2.9899 | 1.8639 | -4.8538 |
0.4773 | 0.9945 | 1900 | 0.0 | -2405932544.0 | -2133635840.0 | -658.1771 | -792.9826 | 0.4612 | -3.2661 | 1.7145 | -4.9806 |
0.4654 | 1.0468 | 2000 | 0.0 | -2678495744.0 | -2597272832.0 | -528.9421 | -616.7041 | 0.4568 | -1.9690 | 1.2455 | -3.2144 |
0.4228 | 1.0992 | 2100 | 0.0 | -1897423232.0 | -1554289280.0 | -605.4824 | -827.4172 | 0.4540 | -2.7379 | 2.5883 | -5.3262 |
0.4094 | 1.1515 | 2200 | 0.0 | -1938966784.0 | -1658281344.0 | -495.0579 | -585.3565 | 0.4561 | -1.6318 | 1.2711 | -2.9029 |
0.3779 | 1.2039 | 2300 | 0.0 | -2399674624.0 | -2125380352.0 | -514.6284 | -627.5883 | 0.4564 | -1.8295 | 1.4960 | -3.3256 |
0.3731 | 1.2562 | 2400 | 0.0 | -1886105216.0 | -1661210752.0 | -496.7679 | -605.2153 | 0.4578 | -1.6493 | 1.4530 | -3.1023 |
0.3538 | 1.3086 | 2500 | 0.0 | -1523131520.0 | -1173539584.0 | -520.2969 | -641.4639 | 0.4578 | -1.8841 | 1.5789 | -3.4630 |
0.3699 | 1.3609 | 2600 | 0.0 | -2620041984.0 | -2228706560.0 | -464.2454 | -571.5127 | 0.4552 | -1.3250 | 1.4383 | -2.7633 |
0.3293 | 1.4132 | 2700 | 0.0 | -1547698176.0 | -1145178112.0 | -559.4696 | -805.1397 | 0.4508 | -2.2760 | 2.8229 | -5.0990 |
0.3376 | 1.4656 | 2800 | 0.0 | -1543770240.0 | -1201035648.0 | -495.7229 | -643.6280 | 0.4576 | -1.6388 | 1.8453 | -3.4841 |
0.3545 | 1.5179 | 2900 | 0.0 | -1516407296.0 | -1264482816.0 | -736.2058 | -1167.2123 | 0.4523 | -4.0433 | 4.6733 | -8.7166 |
0.3399 | 1.5703 | 3000 | 0.0 | -2308331776.0 | -2173847808.0 | -634.2682 | -983.0731 | 0.4497 | -3.0224 | 3.8548 | -6.8772 |
0.3429 | 1.6226 | 3100 | 0.0 | -2065728000.0 | -1775322368.0 | -641.7100 | -945.9015 | 0.4497 | -3.1008 | 3.4052 | -6.5060 |
0.3005 | 1.6750 | 3200 | 0.0 | -2172250112.0 | -2024051328.0 | -515.8318 | -719.3396 | 0.4492 | -1.8422 | 2.3998 | -4.2420 |
0.3468 | 1.7273 | 3300 | 0.0 | -2299277568.0 | -2052279552.0 | -650.1019 | -1102.9896 | 0.4503 | -3.1821 | 4.8952 | -8.0773 |
0.3361 | 1.7796 | 3400 | 0.0 | -1953080960.0 | -1726854912.0 | -586.8936 | -953.3486 | 0.4488 | -2.5504 | 4.0314 | -6.5818 |
0.3405 | 1.8320 | 3500 | 0.0 | -1600359936.0 | -1368940928.0 | -595.8555 | -917.2273 | 0.4473 | -2.6417 | 3.5806 | -6.2223 |
0.362 | 1.8843 | 3600 | 0.0 | -1622263552.0 | -1444837888.0 | -594.8738 | -933.2352 | 0.4472 | -2.6312 | 3.7478 | -6.3789 |
0.3153 | 1.9367 | 3700 | 0.0 | -1449431680.0 | -1236653952.0 | -548.1009 | -828.2765 | 0.4482 | -2.1611 | 3.1750 | -5.3362 |
0.332 | 1.9890 | 3800 | 0.0 | -1527192704.0 | -1300766848.0 | -517.2568 | -750.3690 | 0.4465 | -1.8534 | 2.6999 | -4.5532 |
0.3281 | 2.0414 | 3900 | 0.0 | -1679585792.0 | -1427328256.0 | -596.9639 | -954.6852 | 0.4463 | -2.6517 | 3.9427 | -6.5944 |
0.3181 | 2.0937 | 4000 | 0.0 | -1560081408.0 | -1263351040.0 | -525.1103 | -727.0094 | 0.4486 | -1.9321 | 2.3847 | -4.3169 |
0.2603 | 2.1460 | 4100 | 0.0 | -1459418112.0 | -1138054528.0 | -524.6037 | -721.5356 | 0.4512 | -1.9273 | 2.3376 | -4.2649 |
0.2388 | 2.1984 | 4200 | 0.0 | -1377462656.0 | -1113288064.0 | -484.2593 | -613.2352 | 0.4556 | -1.5242 | 1.6601 | -3.1844 |
0.224 | 2.2507 | 4300 | 0.0 | -1391976320.0 | -1133061248.0 | -479.9842 | -590.8304 | 0.4576 | -1.4813 | 1.4770 | -2.9584 |
0.26 | 2.3031 | 4400 | 0.0 | -1215682432.0 | -961427712.0 | -496.7046 | -633.5714 | 0.4562 | -1.6511 | 1.7328 | -3.3839 |
0.2234 | 2.3554 | 4500 | 0.0 | -1145577600.0 | -884432256.0 | -540.8808 | -735.6340 | 0.4557 | -2.0918 | 2.3138 | -4.4056 |
0.235 | 2.4077 | 4600 | 0.0 | -1404625792.0 | -1132328960.0 | -488.9302 | -651.3615 | 0.4559 | -1.5702 | 1.9911 | -3.5613 |
0.2246 | 2.4601 | 4700 | 0.0 | -1378527872.0 | -1094446976.0 | -506.0782 | -691.9065 | 0.4523 | -1.7423 | 2.2245 | -3.9668 |
0.24 | 2.5124 | 4800 | 0.0 | -1400231808.0 | -1111490560.0 | -509.5299 | -701.2312 | 0.4519 | -1.7776 | 2.2838 | -4.0614 |
0.2557 | 2.5648 | 4900 | 0.0 | -1397235840.0 | -1123008256.0 | -528.5304 | -749.2869 | 0.4514 | -1.9623 | 2.5779 | -4.5403 |
0.2413 | 2.6171 | 5000 | 0.0 | -1477659904.0 | -1194644352.0 | -527.0421 | -771.0870 | 0.4506 | -1.9521 | 2.8087 | -4.7608 |
0.2273 | 2.6695 | 5100 | 0.0 | -1420737280.0 | -1144246016.0 | -524.3503 | -753.7106 | 0.4495 | -1.9271 | 2.6609 | -4.5880 |
0.2645 | 2.7218 | 5200 | 0.0 | -1426662528.0 | -1147708032.0 | -529.0688 | -767.9682 | 0.4501 | -1.9730 | 2.7549 | -4.7279 |
0.2637 | 2.7741 | 5300 | 0.0 | -1414479104.0 | -1139319424.0 | -551.0460 | -835.7553 | 0.4497 | -2.1932 | 3.2146 | -5.4078 |
0.2683 | 2.8265 | 5400 | 0.0 | -1420271232.0 | -1144445824.0 | -552.9777 | -847.9125 | 0.4496 | -2.2117 | 3.3187 | -5.5303 |
0.2551 | 2.8788 | 5500 | 0.0 | -1425663872.0 | -1148307200.0 | -551.4577 | -846.3531 | 0.4494 | -2.1969 | 3.3124 | -5.5093 |
0.2695 | 2.9312 | 5600 | 0.0 | -1428859520.0 | -1154099328.0 | -548.8293 | -841.0065 | 0.4490 | -2.1721 | 3.2870 | -5.4591 |
0.2664 | 2.9835 | 5700 | 0.0 | -1428926080.0 | -1154165888.0 | -548.5443 | -840.7200 | 0.4491 | -2.1686 | 3.2859 | -5.4545 |
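
Over the three epochs, the validation loss improves from 0.4855 at step 100 to 0.4491 at the final evaluation, while the reward margin widens from 0.1415 to 3.2859.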
### Framework versions
- Transformers 4.44.2
- PyTorch 2.3.0
- Datasets 3.0.0
- Tokenizers 0.19.1