# 0.5-1.0-1.0_0.001_alllora
This model is a fine-tuned version of openai/clip-vit-base-patch32 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.0976
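Since this is a PEFT adapter rather than a full checkpoint, it is loaded on top of the base CLIP model. A minimal sketch (the repo id `coastalcph/0.5-1.0-1.0_0.001_alllora` is taken from this card's model tree; it assumes `transformers` and `peft` are installed):

```python
from transformers import CLIPModel, CLIPProcessor
from peft import PeftModel

# Load the base model named in this card, then attach the LoRA adapter.
base = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
model = PeftModel.from_pretrained(base, "coastalcph/0.5-1.0-1.0_0.001_alllora")

# The processor comes from the base checkpoint.
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")
```

`model.merge_and_unload()` can optionally be called afterwards to fold the adapter weights into the base model for inference.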
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 50
- num_epochs: 10.0
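The cosine scheduler with warmup listed above first ramps the learning rate linearly to 0.001 over 50 steps, then decays it along a half-cosine toward zero. A pure-Python sketch of that schedule (the total of roughly 1150 steps is inferred from the results table below, about 115 steps per epoch over 10 epochs, and is an assumption):

```python
import math

def lr_at(step, total_steps=1150, base_lr=1e-3, warmup=50):
    """Learning rate at a given step for linear warmup + cosine decay,
    mirroring transformers' get_cosine_schedule_with_warmup."""
    if step < warmup:
        # Linear warmup from 0 to base_lr over the first `warmup` steps.
        return base_lr * step / warmup
    # Cosine decay from base_lr down to 0 over the remaining steps.
    progress = (step - warmup) / max(1, total_steps - warmup)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

At step 50 the rate peaks at 0.001 and then decreases monotonically, reaching zero at the final step.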
### Training results
| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.2591        | 0.5043 | 58   | 1.8737          |
| 0.2307        | 1.0087 | 116  | 1.8561          |
| 0.168         | 1.5130 | 174  | 1.9298          |
| 0.1804        | 2.0174 | 232  | 1.8496          |
| 0.1204        | 2.5217 | 290  | 1.8696          |
| 0.0984        | 3.0261 | 348  | 1.9012          |
| 0.0967        | 3.5304 | 406  | 1.9590          |
| 0.0872        | 4.0348 | 464  | 2.0099          |
| 0.0642        | 4.5391 | 522  | 2.0354          |
| 0.063         | 5.0435 | 580  | 1.9165          |
| 0.0445        | 5.5478 | 638  | 2.0544          |
| 0.0528        | 6.0522 | 696  | 1.9761          |
| 0.0545        | 6.5565 | 754  | 2.0307          |
| 0.0332        | 7.0609 | 812  | 2.0031          |
| 0.0417        | 7.5652 | 870  | 2.0609          |
| 0.0398        | 8.0696 | 928  | 2.0922          |
| 0.0363        | 8.5739 | 986  | 2.0674          |
| 0.0393        | 9.0783 | 1044 | 2.0836          |
| 0.0342        | 9.5826 | 1102 | 2.0976          |
### Framework versions
- PEFT 0.10.0
- Transformers 4.40.2
- Pytorch 2.1.2+cu121
- Datasets 2.16.1
- Tokenizers 0.19.1