---
license: apache-2.0
base_model: facebook/deit-base-distilled-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: deit-base-distilled-patch16-224-hasta-85-fold5
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7272727272727273
---

# deit-base-distilled-patch16-224-hasta-85-fold5

This model is a fine-tuned version of [facebook/deit-base-distilled-patch16-224](https://huggingface.co/facebook/deit-base-distilled-patch16-224) on the imagefolder dataset. It achieves the following results on the evaluation set (the epoch-4 checkpoint, which has the lowest validation loss in the training results table below):

- Loss: 0.7292
- Accuracy: 0.7273
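
The checkpoint can be loaded with the standard `transformers` image-classification classes. The snippet below is a minimal inference sketch; the Hub repo id and the example image path are assumptions inferred from the model name above, not stated in this card.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed Hub repo id, inferred from the model name above.
model_id = "BilalMuftuoglu/deit-base-distilled-patch16-224-hasta-85-fold5"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```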

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
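
A minimal `TrainingArguments` sketch mirroring the values above; `output_dir` and the evaluation/save strategy, along with the rest of the `Trainer` wiring, are assumptions, since they are not recorded in this card.

```python
from transformers import TrainingArguments

# Sketch only: output_dir is an assumption; the listed hyperparameters
# are taken directly from the card.
training_args = TrainingArguments(
    output_dir="deit-base-distilled-patch16-224-hasta-85-fold5",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    # The listed Adam betas=(0.9, 0.999) and epsilon=1e-08 match the
    # TrainingArguments defaults, so no override is needed here.
)
```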

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 1 | 1.3530 | 0.2727 |
| No log | 2.0 | 2 | 1.1775 | 0.2727 |
| No log | 3.0 | 3 | 0.9128 | 0.5455 |
| No log | 4.0 | 4 | 0.7292 | 0.7273 |
| No log | 5.0 | 5 | 0.7753 | 0.7273 |
| No log | 6.0 | 6 | 0.9952 | 0.7273 |
| No log | 7.0 | 7 | 1.1799 | 0.7273 |
| No log | 8.0 | 8 | 1.2699 | 0.7273 |
| No log | 9.0 | 9 | 1.2729 | 0.7273 |
| 0.38 | 10.0 | 10 | 1.2451 | 0.7273 |
| 0.38 | 11.0 | 11 | 1.2341 | 0.7273 |
| 0.38 | 12.0 | 12 | 1.2153 | 0.7273 |
| 0.38 | 13.0 | 13 | 1.1847 | 0.7273 |
| 0.38 | 14.0 | 14 | 1.1903 | 0.7273 |
| 0.38 | 15.0 | 15 | 1.2128 | 0.7273 |
| 0.38 | 16.0 | 16 | 1.1724 | 0.7273 |
| 0.38 | 17.0 | 17 | 1.0705 | 0.7273 |
| 0.38 | 18.0 | 18 | 1.0106 | 0.7273 |
| 0.38 | 19.0 | 19 | 1.0238 | 0.7273 |
| 0.1729 | 20.0 | 20 | 1.0594 | 0.7273 |
| 0.1729 | 21.0 | 21 | 1.1695 | 0.7273 |
| 0.1729 | 22.0 | 22 | 1.2466 | 0.7273 |
| 0.1729 | 23.0 | 23 | 1.2861 | 0.7273 |
| 0.1729 | 24.0 | 24 | 1.2790 | 0.7273 |
| 0.1729 | 25.0 | 25 | 1.3177 | 0.7273 |
| 0.1729 | 26.0 | 26 | 1.4130 | 0.7273 |
| 0.1729 | 27.0 | 27 | 1.5155 | 0.7273 |
| 0.1729 | 28.0 | 28 | 1.6224 | 0.7273 |
| 0.1729 | 29.0 | 29 | 1.5918 | 0.7273 |
| 0.0904 | 30.0 | 30 | 1.4099 | 0.7273 |
| 0.0904 | 31.0 | 31 | 1.2681 | 0.7273 |
| 0.0904 | 32.0 | 32 | 1.1371 | 0.7273 |
| 0.0904 | 33.0 | 33 | 1.0387 | 0.7273 |
| 0.0904 | 34.0 | 34 | 1.0128 | 0.7273 |
| 0.0904 | 35.0 | 35 | 1.0423 | 0.7273 |
| 0.0904 | 36.0 | 36 | 1.1730 | 0.7273 |
| 0.0904 | 37.0 | 37 | 1.3305 | 0.7273 |
| 0.0904 | 38.0 | 38 | 1.3820 | 0.7273 |
| 0.0904 | 39.0 | 39 | 1.3682 | 0.7273 |
| 0.0591 | 40.0 | 40 | 1.2854 | 0.7273 |
| 0.0591 | 41.0 | 41 | 1.2000 | 0.7273 |
| 0.0591 | 42.0 | 42 | 1.2079 | 0.7273 |
| 0.0591 | 43.0 | 43 | 1.2967 | 0.7273 |
| 0.0591 | 44.0 | 44 | 1.3672 | 0.7273 |
| 0.0591 | 45.0 | 45 | 1.3709 | 0.7273 |
| 0.0591 | 46.0 | 46 | 1.4370 | 0.7273 |
| 0.0591 | 47.0 | 47 | 1.4863 | 0.7273 |
| 0.0591 | 48.0 | 48 | 1.5696 | 0.7273 |
| 0.0591 | 49.0 | 49 | 1.5678 | 0.7273 |
| 0.0289 | 50.0 | 50 | 1.5444 | 0.7273 |
| 0.0289 | 51.0 | 51 | 1.5086 | 0.7273 |
| 0.0289 | 52.0 | 52 | 1.4333 | 0.7273 |
| 0.0289 | 53.0 | 53 | 1.3644 | 0.7273 |
| 0.0289 | 54.0 | 54 | 1.3099 | 0.7273 |
| 0.0289 | 55.0 | 55 | 1.3355 | 0.7273 |
| 0.0289 | 56.0 | 56 | 1.4286 | 0.7273 |
| 0.0289 | 57.0 | 57 | 1.6121 | 0.7273 |
| 0.0289 | 58.0 | 58 | 1.8074 | 0.7273 |
| 0.0289 | 59.0 | 59 | 1.9461 | 0.7273 |
| 0.0342 | 60.0 | 60 | 2.0314 | 0.7273 |
| 0.0342 | 61.0 | 61 | 2.0408 | 0.7273 |
| 0.0342 | 62.0 | 62 | 2.0476 | 0.7273 |
| 0.0342 | 63.0 | 63 | 2.0517 | 0.7273 |
| 0.0342 | 64.0 | 64 | 2.0217 | 0.7273 |
| 0.0342 | 65.0 | 65 | 1.9582 | 0.7273 |
| 0.0342 | 66.0 | 66 | 1.8825 | 0.7273 |
| 0.0342 | 67.0 | 67 | 1.8085 | 0.7273 |
| 0.0342 | 68.0 | 68 | 1.7880 | 0.7273 |
| 0.0342 | 69.0 | 69 | 1.7796 | 0.7273 |
| 0.0203 | 70.0 | 70 | 1.7929 | 0.7273 |
| 0.0203 | 71.0 | 71 | 1.8213 | 0.7273 |
| 0.0203 | 72.0 | 72 | 1.8388 | 0.7273 |
| 0.0203 | 73.0 | 73 | 1.8488 | 0.7273 |
| 0.0203 | 74.0 | 74 | 1.8753 | 0.7273 |
| 0.0203 | 75.0 | 75 | 1.9079 | 0.7273 |
| 0.0203 | 76.0 | 76 | 1.9396 | 0.7273 |
| 0.0203 | 77.0 | 77 | 1.9556 | 0.7273 |
| 0.0203 | 78.0 | 78 | 1.9591 | 0.7273 |
| 0.0203 | 79.0 | 79 | 1.9668 | 0.7273 |
| 0.0324 | 80.0 | 80 | 1.9865 | 0.7273 |
| 0.0324 | 81.0 | 81 | 2.0098 | 0.7273 |
| 0.0324 | 82.0 | 82 | 2.0384 | 0.7273 |
| 0.0324 | 83.0 | 83 | 2.0659 | 0.7273 |
| 0.0324 | 84.0 | 84 | 2.0952 | 0.7273 |
| 0.0324 | 85.0 | 85 | 2.1069 | 0.7273 |
| 0.0324 | 86.0 | 86 | 2.1130 | 0.7273 |
| 0.0324 | 87.0 | 87 | 2.1181 | 0.7273 |
| 0.0324 | 88.0 | 88 | 2.1180 | 0.7273 |
| 0.0324 | 89.0 | 89 | 2.1173 | 0.7273 |
| 0.0244 | 90.0 | 90 | 2.1193 | 0.7273 |
| 0.0244 | 91.0 | 91 | 2.1226 | 0.7273 |
| 0.0244 | 92.0 | 92 | 2.1256 | 0.7273 |
| 0.0244 | 93.0 | 93 | 2.1312 | 0.7273 |
| 0.0244 | 94.0 | 94 | 2.1333 | 0.7273 |
| 0.0244 | 95.0 | 95 | 2.1371 | 0.7273 |
| 0.0244 | 96.0 | 96 | 2.1413 | 0.7273 |
| 0.0244 | 97.0 | 97 | 2.1438 | 0.7273 |
| 0.0244 | 98.0 | 98 | 2.1454 | 0.7273 |
| 0.0244 | 99.0 | 99 | 2.1451 | 0.7273 |
| 0.027 | 100.0 | 100 | 2.1448 | 0.7273 |

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1