NemesisAlm committed
Commit 53f1d29 • 1 Parent(s): 3254238
Update README.md
README.md
CHANGED
@@ -10,6 +10,7 @@ datasets:
 - blanchon/UC_Merced
 metrics:
 - accuracy
+library_name: transformers
 ---
 # clip-fine-tuned-satellite
 
@@ -28,6 +29,9 @@ The model is a fine-tuned version of CLIP.\
 The model is to be used to classify satellite images.\
 It was trained on the UC_Merced dataset that comprises 21 classes: agricultural, airplane, baseballdiamond, beach, buildings, chaparral, denseresidential, forest, freeway, golfcourse, harbor, intersection, mediumresidential, mobilehomepark, overpass, parkinglot, river, runway, sparseresidential, storagetanks, tenniscourt
 
+To see how to use it, refer to the [CLIP documentation](https://huggingface.co/openai/clip-vit-base-patch32) or check the app using this model:\
+https://huggingface.co/spaces/NemesisAlm/clip-satellite-demo
+
 ## Training and evaluation data
 
 30% of the parameters trained.\
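For reference, a minimal usage sketch of the kind the linked CLIP documentation describes: scoring an image against text prompts for the 21 UC_Merced classes and picking the best match. The repo id `NemesisAlm/clip-fine-tuned-satellite`, the prompt template, and the image filename are assumptions for illustration, not confirmed by this commit.

```python
# Sketch: zero-shot-style classification with the fine-tuned CLIP checkpoint.
# MODEL_ID is an assumption; replace it with the actual model repo id.
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

MODEL_ID = "NemesisAlm/clip-fine-tuned-satellite"  # hypothetical repo id

# The 21 UC_Merced class names listed in the model card.
CLASSES = [
    "agricultural", "airplane", "baseballdiamond", "beach", "buildings",
    "chaparral", "denseresidential", "forest", "freeway", "golfcourse",
    "harbor", "intersection", "mediumresidential", "mobilehomepark",
    "overpass", "parkinglot", "river", "runway", "sparseresidential",
    "storagetanks", "tenniscourt",
]

model = CLIPModel.from_pretrained(MODEL_ID)
processor = CLIPProcessor.from_pretrained(MODEL_ID)

image = Image.open("satellite_tile.jpg")  # any RGB satellite image

# Encode the image once against one prompt per class.
inputs = processor(
    text=[f"a satellite photo of a {c}" for c in CLASSES],
    images=image,
    return_tensors="pt",
    padding=True,
)
outputs = model(**inputs)

# logits_per_image has shape (1, 21); softmax turns it into class probabilities.
probs = outputs.logits_per_image.softmax(dim=-1)
print(CLASSES[probs.argmax(-1).item()])
```

The prompt wording ("a satellite photo of a ...") is a common CLIP template choice; a checkpoint fine-tuned on bare class names may score differently, so it is worth matching whatever template was used for training.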