gemma2b-summarize-gemini1.5flash
This model is a fine-tuned version of google/gemma-2b on the llama-duo/synth_summarize_dataset_dedup dataset. It achieves a final validation loss of 2.5319 on the evaluation set (see the training results below).

Model description: More information needed

Intended uses & limitations: More information needed

Training and evaluation data: More information needed
Training results:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 2.0246 | 0.9811 | 26 | 2.6613 |
| 1.3202 | 2.0 | 53 | 2.5405 |
| 1.1694 | 2.9811 | 79 | 2.5125 |
| 1.1076 | 4.0 | 106 | 2.5138 |
| 1.0651 | 4.9811 | 132 | 2.5086 |
| 1.0394 | 6.0 | 159 | 2.5248 |
| 1.0232 | 6.9811 | 185 | 2.5264 |
| 1.0042 | 8.0 | 212 | 2.5296 |
| 1.0109 | 8.9811 | 238 | 2.5319 |
| 1.0064 | 9.8113 | 260 | 2.5319 |
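The epoch and step columns of the table above are internally consistent: the whole-epoch rows (2.0 → 53, 4.0 → 106, 6.0 → 159, 8.0 → 212) imply 26.5 optimizer steps per epoch, and every fractional epoch matches `step / 26.5`. A minimal sketch of that sanity check (the 26.5 steps-per-epoch figure is inferred from the table, not stated in the card):

```python
# Sanity-check the epoch/step bookkeeping in the training-results table.
# (epoch, step) pairs copied from the table rows above.
rows = [
    (0.9811, 26), (2.0, 53), (2.9811, 79), (4.0, 106), (4.9811, 132),
    (6.0, 159), (6.9811, 185), (8.0, 212), (8.9811, 238), (9.8113, 260),
]

STEPS_PER_EPOCH = 26.5  # inferred: 53 steps after exactly 2 epochs

# Every logged row should satisfy epoch ~= step / 26.5.
for epoch, step in rows:
    assert abs(epoch - step / STEPS_PER_EPOCH) < 1e-3, (epoch, step)
print("all rows consistent with 26.5 steps/epoch")
```

This also shows the logging cadence: evaluation runs once per epoch (every 26 or 27 optimizer steps). The per-device batch size and gradient accumulation that produce 26.5 steps per epoch are among the hyperparameters not listed in this copy of the card.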
Base model: google/gemma-2b