habibi26 committed on
Commit 9e38882
1 Parent(s): 9240b00

Model save

Files changed (2):
  1. README.md +20 -20
  2. model.safetensors +1 -1
README.md CHANGED
@@ -21,7 +21,7 @@ model-index:
   metrics:
   - name: Accuracy
     type: accuracy
-    value: 0.9428571428571428
+    value: 0.9857142857142858
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -31,8 +31,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [openai/clip-vit-base-patch32](https://huggingface.co/openai/clip-vit-base-patch32) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.2587
-- Accuracy: 0.9429
+- Loss: 0.1558
+- Accuracy: 0.9857
 
 ## Model description
 
@@ -66,23 +66,23 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch   | Step | Validation Loss | Accuracy |
 |:-------------:|:-------:|:----:|:---------------:|:--------:|
-| No log        | 0.8421  | 4    | 0.2112          | 0.9571   |
-| No log        | 1.8947  | 9    | 0.1227          | 0.9857   |
-| 0.295         | 2.9474  | 14   | 0.1203          | 0.9571   |
-| 0.295         | 4.0     | 19   | 0.0635          | 0.9714   |
-| 0.0962        | 4.8421  | 23   | 0.2939          | 0.9429   |
-| 0.0962        | 5.8947  | 28   | 0.2483          | 0.9286   |
-| 0.163         | 6.9474  | 33   | 0.0712          | 0.9857   |
-| 0.163         | 8.0     | 38   | 0.0474          | 0.9714   |
-| 0.0646        | 8.8421  | 42   | 0.2012          | 0.9429   |
-| 0.0646        | 9.8947  | 47   | 0.3587          | 0.9      |
-| 0.1048        | 10.9474 | 52   | 0.0427          | 0.9857   |
-| 0.1048        | 12.0    | 57   | 0.0149          | 0.9857   |
-| 0.0519        | 12.8421 | 61   | 0.1616          | 0.9571   |
-| 0.0519        | 13.8947 | 66   | 0.2286          | 0.9571   |
-| 0.0151        | 14.9474 | 71   | 0.1369          | 0.9571   |
-| 0.0151        | 16.0    | 76   | 0.2154          | 0.9571   |
-| 0.0455        | 16.8421 | 80   | 0.2587          | 0.9429   |
+| No log        | 0.8421  | 4    | 0.5162          | 0.8286   |
+| No log        | 1.8947  | 9    | 0.5838          | 0.7714   |
+| 0.4911        | 2.9474  | 14   | 0.1588          | 0.9429   |
+| 0.4911        | 4.0     | 19   | 0.2043          | 0.9429   |
+| 0.0819        | 4.8421  | 23   | 0.1261          | 0.9714   |
+| 0.0819        | 5.8947  | 28   | 0.6375          | 0.9143   |
+| 0.1739        | 6.9474  | 33   | 1.1991          | 0.8429   |
+| 0.1739        | 8.0     | 38   | 0.8591          | 0.8571   |
+| 0.2177        | 8.8421  | 42   | 0.2559          | 0.9571   |
+| 0.2177        | 9.8947  | 47   | 0.1431          | 0.9286   |
+| 0.022         | 10.9474 | 52   | 0.1024          | 0.9857   |
+| 0.022         | 12.0    | 57   | 0.1136          | 0.9857   |
+| 0.0007        | 12.8421 | 61   | 0.1609          | 0.9714   |
+| 0.0007        | 13.8947 | 66   | 0.1507          | 0.9857   |
+| 0.0006        | 14.9474 | 71   | 0.2276          | 0.9714   |
+| 0.0006        | 16.0    | 76   | 0.1707          | 0.9714   |
+| 0.0002        | 16.8421 | 80   | 0.1558          | 0.9857   |
 
 
 ### Framework versions
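Once a commit like this is pushed, the updated checkpoint can typically be loaded straight from the Hub with the `transformers` image-classification pipeline. A minimal sketch, assuming a hypothetical repository id `habibi26/clip-vit-base-patch32-finetuned` (the actual repo name for this commit is not shown above; substitute it):

```python
from transformers import pipeline

# Hypothetical repo id -- replace with the actual Hub repository for this commit.
classifier = pipeline(
    "image-classification",
    model="habibi26/clip-vit-base-patch32-finetuned",
)

# Classify a local image; returns a list of {"label": ..., "score": ...} dicts,
# with labels taken from the imagefolder dataset's class names.
predictions = classifier("example.jpg")
print(predictions)
```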
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:8582f3cc7e6fa328985bb0064c6c3fb75ea0b7f50e6ea30d13c6536cfd107ade
+oid sha256:18c8155d03fc5528407ef2294a6aa58559af47aa42136038b881476e8d3d78ac
 size 349854120
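Only the pointer's `oid` changes here while `size` stays at 349854120 bytes, i.e. the commit swaps in new weights of identical size. Since a git-lfs `oid sha256:` is the SHA-256 digest of the actual file, a downloaded `model.safetensors` can be verified against it; a minimal sketch (assumes the file has been fetched to the working directory):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Stream the file through SHA-256 (the hash git-lfs records as 'oid')."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

# oid from the new LFS pointer above
expected = "18c8155d03fc5528407ef2294a6aa58559af47aa42136038b881476e8d3d78ac"
assert sha256_of("model.safetensors") == expected, "checksum mismatch"
```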