
# vit-base-patch16-224-finetuned-eurosat

This model is a fine-tuned version of google/vit-base-patch16-224 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.8960
  • Model Preparation Time: 0.0037

## Model description

More information needed

## Intended uses & limitations

More information needed
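
Since the card does not yet document usage, here is a minimal inference sketch. It assumes the checkpoint id from this repository (`viniFiedler/vit-base-patch16-224-finetuned-eurosat`) and a hypothetical local image file; both may need adjusting for your setup.

```python
from transformers import pipeline
from PIL import Image

# Assumed checkpoint id, taken from this repository's name.
model_id = "viniFiedler/vit-base-patch16-224-finetuned-eurosat"

# Build an image-classification pipeline around the fine-tuned ViT.
classifier = pipeline("image-classification", model=model_id)

# "satellite.jpg" is a hypothetical local image path; replace with your own.
predictions = classifier(Image.open("satellite.jpg"))
for p in predictions:
    print(p["label"], round(p["score"], 4))
```

Each prediction is a dict with `label` and `score` keys, sorted by descending score.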

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Model Preparation Time |
|:-------------:|:-------:|:----:|:---------------:|:----------------------:|
| 7.7174        | 0.9874  | 59   | 7.7848          | 0.0037                 |
| 7.6016        | 1.9916  | 119  | 7.7339          | 0.0037                 |
| 7.4761        | 2.9958  | 179  | 7.6441          | 0.0037                 |
| 7.2852        | 4.0     | 239  | 7.5057          | 0.0037                 |
| 7.083         | 4.9874  | 298  | 7.3286          | 0.0037                 |
| 6.8119        | 5.9916  | 358  | 7.1090          | 0.0037                 |
| 6.5497        | 6.9958  | 418  | 6.8711          | 0.0037                 |
| 6.1656        | 8.0     | 478  | 6.6169          | 0.0037                 |
| 5.8334        | 8.9874  | 537  | 6.3286          | 0.0037                 |
| 5.3878        | 9.9916  | 597  | 6.0292          | 0.0037                 |
| 5.0134        | 10.9958 | 657  | 5.7486          | 0.0037                 |
| 4.6087        | 12.0    | 717  | 5.4834          | 0.0037                 |
| 4.2544        | 12.9874 | 776  | 5.2186          | 0.0037                 |
| 3.8669        | 13.9916 | 836  | 4.9842          | 0.0037                 |
| 3.5993        | 14.9958 | 896  | 4.7566          | 0.0037                 |
| 3.2331        | 16.0    | 956  | 4.5623          | 0.0037                 |
| 2.9124        | 16.9874 | 1015 | 4.3663          | 0.0037                 |
| 2.6122        | 17.9916 | 1075 | 4.1944          | 0.0037                 |
| 2.466         | 18.9958 | 1135 | 4.0160          | 0.0037                 |
| 2.2074        | 20.0    | 1195 | 3.8582          | 0.0037                 |
| 2.0851        | 20.9874 | 1254 | 3.7160          | 0.0037                 |
| 1.8354        | 21.9916 | 1314 | 3.5740          | 0.0037                 |
| 1.7343        | 22.9958 | 1374 | 3.4548          | 0.0037                 |
| 1.5804        | 24.0    | 1434 | 3.3600          | 0.0037                 |
| 1.3193        | 24.9874 | 1493 | 3.2336          | 0.0037                 |
| 1.328         | 25.9916 | 1553 | 3.1294          | 0.0037                 |
| 1.163         | 26.9958 | 1613 | 3.0355          | 0.0037                 |
| 1.0761        | 28.0    | 1673 | 2.9737          | 0.0037                 |
| 0.9834        | 28.9874 | 1732 | 2.8952          | 0.0037                 |
| 0.9141        | 29.9916 | 1792 | 2.7900          | 0.0037                 |
| 0.8862        | 30.9958 | 1852 | 2.7381          | 0.0037                 |
| 0.7757        | 32.0    | 1912 | 2.6868          | 0.0037                 |
| 0.7475        | 32.9874 | 1971 | 2.6134          | 0.0037                 |
| 0.6518        | 33.9916 | 2031 | 2.5770          | 0.0037                 |
| 0.6766        | 34.9958 | 2091 | 2.5278          | 0.0037                 |
| 0.5741        | 36.0    | 2151 | 2.5009          | 0.0037                 |
| 0.5877        | 36.9874 | 2210 | 2.4436          | 0.0037                 |
| 0.4996        | 37.9916 | 2270 | 2.4148          | 0.0037                 |
| 0.5316        | 38.9958 | 2330 | 2.3809          | 0.0037                 |
| 0.4896        | 40.0    | 2390 | 2.3330          | 0.0037                 |
| 0.501         | 40.9874 | 2449 | 2.3055          | 0.0037                 |
| 0.4052        | 41.9916 | 2509 | 2.3000          | 0.0037                 |
| 0.398         | 42.9958 | 2569 | 2.2854          | 0.0037                 |
| 0.3702        | 44.0    | 2629 | 2.2536          | 0.0037                 |
| 0.3629        | 44.9874 | 2688 | 2.2342          | 0.0037                 |
| 0.3729        | 45.9916 | 2748 | 2.2190          | 0.0037                 |
| 0.3206        | 46.9958 | 2808 | 2.2078          | 0.0037                 |
| 0.38          | 48.0    | 2868 | 2.1726          | 0.0037                 |
| 0.3379        | 48.9874 | 2927 | 2.1600          | 0.0037                 |
| 0.3248        | 49.9916 | 2987 | 2.1453          | 0.0037                 |
| 0.3577        | 50.9958 | 3047 | 2.1153          | 0.0037                 |
| 0.2946        | 52.0    | 3107 | 2.1232          | 0.0037                 |
| 0.2938        | 52.9874 | 3166 | 2.1076          | 0.0037                 |
| 0.289         | 53.9916 | 3226 | 2.0892          | 0.0037                 |
| 0.3044        | 54.9958 | 3286 | 2.0692          | 0.0037                 |
| 0.277         | 56.0    | 3346 | 2.0667          | 0.0037                 |
| 0.2774        | 56.9874 | 3405 | 2.0554          | 0.0037                 |
| 0.2717        | 57.9916 | 3465 | 2.0369          | 0.0037                 |
| 0.2722        | 58.9958 | 3525 | 2.0261          | 0.0037                 |
| 0.2325        | 60.0    | 3585 | 2.0419          | 0.0037                 |
| 0.2387        | 60.9874 | 3644 | 2.0073          | 0.0037                 |
| 0.2343        | 61.9916 | 3704 | 2.0230          | 0.0037                 |
| 0.2281        | 62.9958 | 3764 | 2.0228          | 0.0037                 |
| 0.2597        | 64.0    | 3824 | 1.9956          | 0.0037                 |
| 0.223         | 64.9874 | 3883 | 1.9902          | 0.0037                 |
| 0.2213        | 65.9916 | 3943 | 1.9778          | 0.0037                 |
| 0.1835        | 66.9958 | 4003 | 1.9945          | 0.0037                 |
| 0.2247        | 68.0    | 4063 | 1.9703          | 0.0037                 |
| 0.1819        | 68.9874 | 4122 | 1.9623          | 0.0037                 |
| 0.2096        | 69.9916 | 4182 | 1.9686          | 0.0037                 |
| 0.186         | 70.9958 | 4242 | 1.9764          | 0.0037                 |
| 0.1956        | 72.0    | 4302 | 1.9606          | 0.0037                 |
| 0.197         | 72.9874 | 4361 | 1.9432          | 0.0037                 |
| 0.1867        | 73.9916 | 4421 | 1.9461          | 0.0037                 |
| 0.1994        | 74.9958 | 4481 | 1.9547          | 0.0037                 |
| 0.1631        | 76.0    | 4541 | 1.9373          | 0.0037                 |
| 0.184         | 76.9874 | 4600 | 1.9329          | 0.0037                 |
| 0.1518        | 77.9916 | 4660 | 1.9355          | 0.0037                 |
| 0.1774        | 78.9958 | 4720 | 1.9367          | 0.0037                 |
| 0.1558        | 80.0    | 4780 | 1.9211          | 0.0037                 |
| 0.1859        | 80.9874 | 4839 | 1.9256          | 0.0037                 |
| 0.1673        | 81.9916 | 4899 | 1.9271          | 0.0037                 |
| 0.1531        | 82.9958 | 4959 | 1.9332          | 0.0037                 |
| 0.1763        | 84.0    | 5019 | 1.9154          | 0.0037                 |
| 0.1594        | 84.9874 | 5078 | 1.9143          | 0.0037                 |
| 0.17          | 85.9916 | 5138 | 1.9098          | 0.0037                 |
| 0.1246        | 86.9958 | 5198 | 1.9123          | 0.0037                 |
| 0.1699        | 88.0    | 5258 | 1.9066          | 0.0037                 |
| 0.1627        | 88.9874 | 5317 | 1.9054          | 0.0037                 |
| 0.1663        | 89.9916 | 5377 | 1.9040          | 0.0037                 |
| 0.1349        | 90.9958 | 5437 | 1.9031          | 0.0037                 |
| 0.1578        | 92.0    | 5497 | 1.9065          | 0.0037                 |
| 0.1553        | 92.9874 | 5556 | 1.8997          | 0.0037                 |
| 0.1393        | 93.9916 | 5616 | 1.8972          | 0.0037                 |
| 0.1652        | 94.9958 | 5676 | 1.8960          | 0.0037                 |
| 0.1677        | 96.0    | 5736 | 1.9002          | 0.0037                 |
| 0.1544        | 96.9874 | 5795 | 1.8966          | 0.0037                 |
| 0.1359        | 97.9916 | 5855 | 1.8966          | 0.0037                 |
| 0.1495        | 98.7448 | 5900 | 1.8965          | 0.0037                 |

### Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.1+cu121
  • Datasets 3.0.0
  • Tokenizers 0.19.1
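
To reproduce the environment above, one could pin the listed versions (package names are the standard PyPI ones; the CUDA build tag of torch may differ on your platform):

```shell
pip install "transformers==4.44.2" "torch==2.4.1" "datasets==3.0.0" "tokenizers==0.19.1"
```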
## Model size

  • 87.7M parameters (Safetensors, F32)
