samitizerxu committed on
Commit d7d3d46
1 Parent(s): 4673567

End of training

Files changed (4):
  1. README.md +327 -198
  2. config.json +69 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md CHANGED
@@ -1,201 +1,330 @@
  ---
- library_name: transformers
- tags: []
  ---

- # Model Card for Model ID
-
- <!-- Provide a quick summary of what the model is/does. -->
-
-
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
- This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
-
- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources [optional]
-
- <!-- Provide the basic links for the model. -->
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
-
- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
- ## How to Get Started with the Model
-
- Use the code below to get started with the model.
-
- [More Information Needed]
-
- ## Training Details
-
- ### Training Data
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
-
-
  ---
+ tags:
+ - vision
+ - image-segmentation
+ - generated_from_trainer
+ model-index:
+ - name: segformer-b1-from-scratch-run1
+   results: []
  ---

+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # segformer-b1-from-scratch-run1
+
+ This model was trained from scratch on the samitizerxu/kelp_data_rgbaa_swin_nir dataset.
+ It achieves the following results on the evaluation set:
+ - Iou Kelp: 0.0067
+ - Loss: 0.9872
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.1
+ - train_batch_size: 22
+ - eval_batch_size: 22
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: cosine
+ - num_epochs: 40
+
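With `lr_scheduler_type: cosine`, the learning rate decays from its peak over the run. A minimal plain-Python sketch of a standard half-cosine decay to zero (an illustration only, ignoring warmup; not the exact Trainer schedule):

```python
import math

def cosine_lr(step: int, total_steps: int, peak_lr: float = 0.1) -> float:
    """Half-cosine decay from peak_lr at step 0 down to 0 at total_steps."""
    progress = step / total_steps
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 8190  # roughly the number of optimizer steps logged below
print(round(cosine_lr(0, total), 4))      # 0.1 (peak rate at the start)
print(round(cosine_lr(total, total), 4))  # 0.0 (fully decayed at the end)
```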
+ ### Training results
+
+ | Training Loss | Epoch | Step | Iou Kelp | Validation Loss |
+ |:-------------:|:-----:|:----:|:--------:|:---------------:|
+ | 0.9999 | 0.15 | 30 | 0.0067 | 0.9872 |
+ | 1.0 | 0.29 | 60 | 0.0067 | 0.9872 |
+ | 0.9933 | 0.44 | 90 | 0.0067 | 0.9872 |
+ | 0.998 | 0.59 | 120 | 0.0067 | 0.9872 |
+ | 1.0 | 0.73 | 150 | 0.0067 | 0.9872 |
+ | 0.9998 | 0.88 | 180 | 0.0067 | 0.9872 |
+ | 0.9998 | 1.02 | 210 | 0.0067 | 0.9872 |
+ | 1.0 | 1.17 | 240 | 0.0082 | 0.9861 |
+ | 0.9976 | 1.32 | 270 | 0.0069 | 0.9869 |
+ | 0.9995 | 1.46 | 300 | 0.0070 | 0.9868 |
+ | 0.9967 | 1.61 | 330 | 0.0067 | 0.9872 |
+ | 0.9945 | 1.76 | 360 | 0.0067 | 0.9872 |
+ | 1.0 | 1.9 | 390 | 0.0067 | 0.9872 |
+ | 0.9992 | 2.05 | 420 | 0.0067 | 0.9872 |
+ | 0.9991 | 2.2 | 450 | 0.0067 | 0.9872 |
+ | 0.997 | 2.34 | 480 | 0.0067 | 0.9872 |
+ | 0.999 | 2.49 | 510 | 0.0067 | 0.9872 |
+ | 0.9999 | 2.63 | 540 | 0.0067 | 0.9872 |
+ | 0.9991 | 2.78 | 570 | 0.0067 | 0.9872 |
+ | 0.9987 | 2.93 | 600 | 0.0067 | 0.9872 |
+ | 0.9999 | 3.07 | 630 | 0.0067 | 0.9872 |
+ | 0.9983 | 3.22 | 660 | 0.0067 | 0.9872 |
+ | 0.9973 | 3.37 | 690 | 0.0067 | 0.9872 |
+ | 0.9987 | 3.51 | 720 | 0.0067 | 0.9872 |
+ | 0.9915 | 3.66 | 750 | 0.0067 | 0.9872 |
+ | 0.9984 | 3.8 | 780 | 0.0067 | 0.9872 |
+ | 0.9992 | 3.95 | 810 | 0.0067 | 0.9872 |
+ | 0.9993 | 4.1 | 840 | 0.0067 | 0.9872 |
+ | 1.0 | 4.24 | 870 | 0.0067 | 0.9872 |
+ | 0.9998 | 4.39 | 900 | 0.0067 | 0.9872 |
+ | 0.9999 | 4.54 | 930 | 0.0067 | 0.9872 |
+ | 0.9995 | 4.68 | 960 | 0.0067 | 0.9872 |
+ | 0.998 | 4.83 | 990 | 0.0067 | 0.9872 |
+ | 0.9989 | 4.98 | 1020 | 0.0067 | 0.9872 |
+ | 0.9975 | 5.12 | 1050 | 0.0067 | 0.9872 |
+ | 0.9993 | 5.27 | 1080 | 0.0067 | 0.9872 |
+ | 0.9971 | 5.41 | 1110 | 0.0067 | 0.9872 |
+ | 0.9944 | 5.56 | 1140 | 0.0067 | 0.9872 |
+ | 0.9967 | 5.71 | 1170 | 0.0067 | 0.9872 |
+ | 0.9986 | 5.85 | 1200 | 0.0067 | 0.9872 |
+ | 0.9994 | 6.0 | 1230 | 0.0067 | 0.9872 |
+ | 0.9997 | 6.15 | 1260 | 0.0067 | 0.9872 |
+ | 0.9998 | 6.29 | 1290 | 0.0067 | 0.9872 |
+ | 0.999 | 6.44 | 1320 | 0.0067 | 0.9872 |
+ | 0.9996 | 6.59 | 1350 | 0.0067 | 0.9872 |
+ | 1.0 | 6.73 | 1380 | 0.0067 | 0.9872 |
+ | 0.9999 | 6.88 | 1410 | 0.0067 | 0.9872 |
+ | 0.9933 | 7.02 | 1440 | 0.0067 | 0.9872 |
+ | 0.998 | 7.17 | 1470 | 0.0067 | 0.9872 |
+ | 0.9968 | 7.32 | 1500 | 0.0067 | 0.9872 |
+ | 0.997 | 7.46 | 1530 | 0.0067 | 0.9872 |
+ | 0.9981 | 7.61 | 1560 | 0.0067 | 0.9872 |
+ | 0.9992 | 7.76 | 1590 | 0.0067 | 0.9872 |
+ | 0.9999 | 7.9 | 1620 | 0.0067 | 0.9872 |
+ | 0.9964 | 8.05 | 1650 | 0.0067 | 0.9872 |
+ | 0.9999 | 8.2 | 1680 | 0.0067 | 0.9872 |
+ | 0.9941 | 8.34 | 1710 | 0.0067 | 0.9872 |
+ | 0.9963 | 8.49 | 1740 | 0.0067 | 0.9872 |
+ | 0.998 | 8.63 | 1770 | 0.0067 | 0.9872 |
+ | 0.9989 | 8.78 | 1800 | 0.0067 | 0.9872 |
+ | 1.0 | 8.93 | 1830 | 0.0067 | 0.9872 |
+ | 1.0 | 9.07 | 1860 | 0.0067 | 0.9872 |
+ | 0.9974 | 9.22 | 1890 | 0.0067 | 0.9872 |
+ | 0.9989 | 9.37 | 1920 | 0.0067 | 0.9872 |
+ | 0.9989 | 9.51 | 1950 | 0.0067 | 0.9872 |
+ | 0.996 | 9.66 | 1980 | 0.0067 | 0.9872 |
+ | 0.9995 | 9.8 | 2010 | 0.0067 | 0.9872 |
+ | 0.9973 | 9.95 | 2040 | 0.0067 | 0.9872 |
+ | 0.9957 | 10.1 | 2070 | 0.0067 | 0.9872 |
+ | 0.9996 | 10.24 | 2100 | 0.0067 | 0.9872 |
+ | 1.0 | 10.39 | 2130 | 0.0067 | 0.9872 |
+ | 0.9967 | 10.54 | 2160 | 0.0067 | 0.9872 |
+ | 0.9989 | 10.68 | 2190 | 0.0067 | 0.9872 |
+ | 0.9989 | 10.83 | 2220 | 0.0067 | 0.9872 |
+ | 0.9994 | 10.98 | 2250 | 0.0067 | 0.9872 |
+ | 0.9992 | 11.12 | 2280 | 0.0067 | 0.9872 |
+ | 0.9973 | 11.27 | 2310 | 0.0067 | 0.9872 |
+ | 0.9993 | 11.41 | 2340 | 0.0067 | 0.9872 |
+ | 0.9973 | 11.56 | 2370 | 0.0067 | 0.9872 |
+ | 0.9996 | 11.71 | 2400 | 0.0067 | 0.9872 |
+ | 1.0 | 11.85 | 2430 | 0.0067 | 0.9872 |
+ | 0.9989 | 12.0 | 2460 | 0.0067 | 0.9872 |
+ | 1.0 | 12.15 | 2490 | 0.0067 | 0.9872 |
+ | 0.9987 | 12.29 | 2520 | 0.0067 | 0.9872 |
+ | 0.9914 | 12.44 | 2550 | 0.0067 | 0.9872 |
+ | 0.9974 | 12.59 | 2580 | 0.0067 | 0.9872 |
+ | 1.0 | 12.73 | 2610 | 0.0067 | 0.9872 |
+ | 0.999 | 12.88 | 2640 | 0.0067 | 0.9872 |
+ | 1.0 | 13.02 | 2670 | 0.0067 | 0.9872 |
+ | 0.9991 | 13.17 | 2700 | 0.0067 | 0.9872 |
+ | 0.9979 | 13.32 | 2730 | 0.0067 | 0.9872 |
+ | 1.0 | 13.46 | 2760 | 0.0067 | 0.9872 |
+ | 0.9973 | 13.61 | 2790 | 0.0067 | 0.9872 |
+ | 0.9995 | 13.76 | 2820 | 0.0067 | 0.9872 |
+ | 0.9973 | 13.9 | 2850 | 0.0067 | 0.9872 |
+ | 0.9961 | 14.05 | 2880 | 0.0067 | 0.9872 |
+ | 0.9907 | 14.2 | 2910 | 0.0067 | 0.9872 |
+ | 0.9984 | 14.34 | 2940 | 0.0067 | 0.9872 |
+ | 0.9986 | 14.49 | 2970 | 0.0067 | 0.9872 |
+ | 0.9935 | 14.63 | 3000 | 0.0067 | 0.9872 |
+ | 0.998 | 14.78 | 3030 | 0.0067 | 0.9872 |
+ | 0.9982 | 14.93 | 3060 | 0.0067 | 0.9872 |
+ | 0.9956 | 15.07 | 3090 | 0.0067 | 0.9872 |
+ | 0.9991 | 15.22 | 3120 | 0.0067 | 0.9872 |
+ | 0.9985 | 15.37 | 3150 | 0.0067 | 0.9872 |
+ | 0.9958 | 15.51 | 3180 | 0.0067 | 0.9872 |
+ | 0.9998 | 15.66 | 3210 | 0.0067 | 0.9872 |
+ | 0.9972 | 15.8 | 3240 | 0.0067 | 0.9872 |
+ | 0.9996 | 15.95 | 3270 | 0.0067 | 0.9872 |
+ | 0.9965 | 16.1 | 3300 | 0.0067 | 0.9872 |
+ | 0.9983 | 16.24 | 3330 | 0.0067 | 0.9872 |
+ | 0.9993 | 16.39 | 3360 | 0.0067 | 0.9872 |
+ | 0.9962 | 16.54 | 3390 | 0.0067 | 0.9872 |
+ | 0.9985 | 16.68 | 3420 | 0.0067 | 0.9872 |
+ | 0.9998 | 16.83 | 3450 | 0.0067 | 0.9872 |
+ | 0.9993 | 16.98 | 3480 | 0.0067 | 0.9872 |
+ | 0.9993 | 17.12 | 3510 | 0.0067 | 0.9872 |
+ | 0.9998 | 17.27 | 3540 | 0.0067 | 0.9872 |
+ | 1.0 | 17.41 | 3570 | 0.0067 | 0.9872 |
+ | 0.9999 | 17.56 | 3600 | 0.0067 | 0.9872 |
+ | 0.9993 | 17.71 | 3630 | 0.0067 | 0.9872 |
+ | 0.999 | 17.85 | 3660 | 0.0067 | 0.9872 |
+ | 0.9975 | 18.0 | 3690 | 0.0067 | 0.9872 |
+ | 0.9993 | 18.15 | 3720 | 0.0067 | 0.9872 |
+ | 1.0 | 18.29 | 3750 | 0.0067 | 0.9872 |
+ | 0.9983 | 18.44 | 3780 | 0.0067 | 0.9872 |
+ | 0.9994 | 18.59 | 3810 | 0.0067 | 0.9872 |
+ | 0.9993 | 18.73 | 3840 | 0.0067 | 0.9872 |
+ | 0.9982 | 18.88 | 3870 | 0.0067 | 0.9872 |
+ | 0.9997 | 19.02 | 3900 | 0.0067 | 0.9872 |
+ | 0.9955 | 19.17 | 3930 | 0.0067 | 0.9872 |
+ | 0.9992 | 19.32 | 3960 | 0.0067 | 0.9872 |
+ | 0.9592 | 19.46 | 3990 | 0.0067 | 0.9872 |
+ | 0.9897 | 19.61 | 4020 | 0.0067 | 0.9872 |
+ | 0.9994 | 19.76 | 4050 | 0.0067 | 0.9872 |
+ | 0.9989 | 19.9 | 4080 | 0.0067 | 0.9872 |
+ | 0.9995 | 20.05 | 4110 | 0.0067 | 0.9872 |
+ | 0.9995 | 20.2 | 4140 | 0.0067 | 0.9872 |
+ | 0.9938 | 20.34 | 4170 | 0.0067 | 0.9872 |
+ | 0.9987 | 20.49 | 4200 | 0.0067 | 0.9872 |
+ | 0.9999 | 20.63 | 4230 | 0.0067 | 0.9872 |
+ | 0.9994 | 20.78 | 4260 | 0.0067 | 0.9872 |
+ | 0.9954 | 20.93 | 4290 | 0.0067 | 0.9872 |
+ | 0.9975 | 21.07 | 4320 | 0.0067 | 0.9872 |
+ | 0.9997 | 21.22 | 4350 | 0.0067 | 0.9872 |
+ | 0.9978 | 21.37 | 4380 | 0.0067 | 0.9872 |
+ | 0.9994 | 21.51 | 4410 | 0.0067 | 0.9872 |
+ | 0.9985 | 21.66 | 4440 | 0.0067 | 0.9872 |
+ | 0.9998 | 21.8 | 4470 | 0.0067 | 0.9872 |
+ | 0.998 | 21.95 | 4500 | 0.0067 | 0.9872 |
+ | 0.9983 | 22.1 | 4530 | 0.0067 | 0.9872 |
+ | 0.9989 | 22.24 | 4560 | 0.0067 | 0.9872 |
+ | 0.9973 | 22.39 | 4590 | 0.0067 | 0.9872 |
+ | 0.9961 | 22.54 | 4620 | 0.0067 | 0.9872 |
+ | 0.9984 | 22.68 | 4650 | 0.0067 | 0.9872 |
+ | 1.0 | 22.83 | 4680 | 0.0067 | 0.9872 |
+ | 0.9949 | 22.98 | 4710 | 0.0067 | 0.9872 |
+ | 0.9989 | 23.12 | 4740 | 0.0067 | 0.9872 |
+ | 0.9998 | 23.27 | 4770 | 0.0067 | 0.9872 |
+ | 0.9999 | 23.41 | 4800 | 0.0067 | 0.9872 |
+ | 0.9996 | 23.56 | 4830 | 0.0067 | 0.9872 |
+ | 0.9974 | 23.71 | 4860 | 0.0067 | 0.9872 |
+ | 0.9997 | 23.85 | 4890 | 0.0067 | 0.9872 |
+ | 0.9999 | 24.0 | 4920 | 0.0067 | 0.9872 |
+ | 0.9962 | 24.15 | 4950 | 0.0067 | 0.9872 |
+ | 0.9996 | 24.29 | 4980 | 0.0067 | 0.9872 |
+ | 0.9999 | 24.44 | 5010 | 0.0067 | 0.9872 |
+ | 0.9973 | 24.59 | 5040 | 0.0067 | 0.9872 |
+ | 0.9996 | 24.73 | 5070 | 0.0067 | 0.9872 |
+ | 0.9995 | 24.88 | 5100 | 0.0067 | 0.9872 |
+ | 0.9999 | 25.02 | 5130 | 0.0067 | 0.9872 |
+ | 0.9988 | 25.17 | 5160 | 0.0067 | 0.9872 |
+ | 1.0 | 25.32 | 5190 | 0.0067 | 0.9872 |
+ | 1.0 | 25.46 | 5220 | 0.0067 | 0.9872 |
+ | 0.9996 | 25.61 | 5250 | 0.0067 | 0.9872 |
+ | 0.9965 | 25.76 | 5280 | 0.0067 | 0.9872 |
+ | 0.9976 | 25.9 | 5310 | 0.0067 | 0.9872 |
+ | 1.0 | 26.05 | 5340 | 0.0067 | 0.9872 |
+ | 0.9989 | 26.2 | 5370 | 0.0067 | 0.9872 |
+ | 0.9996 | 26.34 | 5400 | 0.0067 | 0.9872 |
+ | 0.9998 | 26.49 | 5430 | 0.0067 | 0.9872 |
+ | 1.0 | 26.63 | 5460 | 0.0067 | 0.9872 |
+ | 0.9996 | 26.78 | 5490 | 0.0067 | 0.9872 |
+ | 0.9972 | 26.93 | 5520 | 0.0067 | 0.9872 |
+ | 0.9984 | 27.07 | 5550 | 0.0067 | 0.9872 |
+ | 0.9961 | 27.22 | 5580 | 0.0067 | 0.9872 |
+ | 1.0 | 27.37 | 5610 | 0.0067 | 0.9872 |
+ | 0.9977 | 27.51 | 5640 | 0.0067 | 0.9872 |
+ | 0.9969 | 27.66 | 5670 | 0.0067 | 0.9872 |
+ | 0.9971 | 27.8 | 5700 | 0.0067 | 0.9872 |
+ | 0.9986 | 27.95 | 5730 | 0.0067 | 0.9872 |
+ | 0.9995 | 28.1 | 5760 | 0.0067 | 0.9872 |
+ | 0.9992 | 28.24 | 5790 | 0.0067 | 0.9872 |
+ | 0.9976 | 28.39 | 5820 | 0.0067 | 0.9872 |
+ | 0.9994 | 28.54 | 5850 | 0.0067 | 0.9872 |
+ | 0.998 | 28.68 | 5880 | 0.0067 | 0.9872 |
+ | 0.9952 | 28.83 | 5910 | 0.0067 | 0.9872 |
+ | 0.9998 | 28.98 | 5940 | 0.0067 | 0.9872 |
+ | 0.9937 | 29.12 | 5970 | 0.0067 | 0.9872 |
+ | 0.9989 | 29.27 | 6000 | 0.0067 | 0.9872 |
+ | 0.9993 | 29.41 | 6030 | 0.0067 | 0.9872 |
+ | 0.9989 | 29.56 | 6060 | 0.0067 | 0.9872 |
+ | 0.999 | 29.71 | 6090 | 0.0067 | 0.9872 |
+ | 0.9939 | 29.85 | 6120 | 0.0067 | 0.9872 |
+ | 1.0 | 30.0 | 6150 | 0.0067 | 0.9872 |
+ | 0.9996 | 30.15 | 6180 | 0.0067 | 0.9872 |
+ | 0.9994 | 30.29 | 6210 | 0.0067 | 0.9872 |
+ | 0.999 | 30.44 | 6240 | 0.0067 | 0.9872 |
+ | 1.0 | 30.59 | 6270 | 0.0067 | 0.9872 |
+ | 0.9956 | 30.73 | 6300 | 0.0067 | 0.9872 |
+ | 0.9971 | 30.88 | 6330 | 0.0067 | 0.9872 |
+ | 0.9985 | 31.02 | 6360 | 0.0067 | 0.9872 |
+ | 1.0 | 31.17 | 6390 | 0.0067 | 0.9872 |
+ | 0.9987 | 31.32 | 6420 | 0.0067 | 0.9872 |
+ | 0.9992 | 31.46 | 6450 | 0.0067 | 0.9872 |
+ | 0.9996 | 31.61 | 6480 | 0.0067 | 0.9872 |
+ | 0.9998 | 31.76 | 6510 | 0.0067 | 0.9872 |
+ | 0.9989 | 31.9 | 6540 | 0.0067 | 0.9872 |
+ | 1.0 | 32.05 | 6570 | 0.0067 | 0.9872 |
+ | 0.9966 | 32.2 | 6600 | 0.0067 | 0.9872 |
+ | 0.9994 | 32.34 | 6630 | 0.0067 | 0.9872 |
+ | 0.9987 | 32.49 | 6660 | 0.0067 | 0.9872 |
+ | 0.9993 | 32.63 | 6690 | 0.0067 | 0.9872 |
+ | 0.9971 | 32.78 | 6720 | 0.0067 | 0.9872 |
+ | 0.9971 | 32.93 | 6750 | 0.0067 | 0.9872 |
+ | 0.9929 | 33.07 | 6780 | 0.0067 | 0.9872 |
+ | 0.9997 | 33.22 | 6810 | 0.0067 | 0.9872 |
+ | 0.9978 | 33.37 | 6840 | 0.0067 | 0.9872 |
+ | 1.0 | 33.51 | 6870 | 0.0067 | 0.9872 |
+ | 0.9991 | 33.66 | 6900 | 0.0067 | 0.9872 |
+ | 0.9971 | 33.8 | 6930 | 0.0067 | 0.9872 |
+ | 0.9999 | 33.95 | 6960 | 0.0067 | 0.9872 |
+ | 0.9999 | 34.1 | 6990 | 0.0067 | 0.9872 |
+ | 0.9997 | 34.24 | 7020 | 0.0067 | 0.9872 |
+ | 1.0 | 34.39 | 7050 | 0.0067 | 0.9872 |
+ | 0.9986 | 34.54 | 7080 | 0.0067 | 0.9872 |
+ | 0.9996 | 34.68 | 7110 | 0.0067 | 0.9872 |
+ | 0.9994 | 34.83 | 7140 | 0.0067 | 0.9872 |
+ | 0.9997 | 34.98 | 7170 | 0.0067 | 0.9872 |
+ | 0.9999 | 35.12 | 7200 | 0.0067 | 0.9872 |
+ | 0.9969 | 35.27 | 7230 | 0.0067 | 0.9872 |
+ | 1.0 | 35.41 | 7260 | 0.0067 | 0.9872 |
+ | 0.9984 | 35.56 | 7290 | 0.0067 | 0.9872 |
+ | 0.9961 | 35.71 | 7320 | 0.0067 | 0.9872 |
+ | 0.9988 | 35.85 | 7350 | 0.0067 | 0.9872 |
+ | 0.9985 | 36.0 | 7380 | 0.0067 | 0.9872 |
+ | 0.9997 | 36.15 | 7410 | 0.0067 | 0.9872 |
+ | 1.0 | 36.29 | 7440 | 0.0067 | 0.9872 |
+ | 0.9987 | 36.44 | 7470 | 0.0067 | 0.9872 |
+ | 0.9991 | 36.59 | 7500 | 0.0067 | 0.9872 |
+ | 0.9992 | 36.73 | 7530 | 0.0067 | 0.9872 |
+ | 0.9999 | 36.88 | 7560 | 0.0067 | 0.9872 |
+ | 0.9996 | 37.02 | 7590 | 0.0067 | 0.9872 |
+ | 0.9995 | 37.17 | 7620 | 0.0067 | 0.9872 |
+ | 0.9998 | 37.32 | 7650 | 0.0067 | 0.9872 |
+ | 0.9969 | 37.46 | 7680 | 0.0067 | 0.9872 |
+ | 0.9989 | 37.61 | 7710 | 0.0067 | 0.9872 |
+ | 0.9992 | 37.76 | 7740 | 0.0067 | 0.9872 |
+ | 0.9959 | 37.9 | 7770 | 0.0067 | 0.9872 |
+ | 0.9987 | 38.05 | 7800 | 0.0067 | 0.9872 |
+ | 0.998 | 38.2 | 7830 | 0.0067 | 0.9872 |
+ | 0.9992 | 38.34 | 7860 | 0.0067 | 0.9872 |
+ | 0.9992 | 38.49 | 7890 | 0.0067 | 0.9872 |
+ | 0.9993 | 38.63 | 7920 | 0.0067 | 0.9872 |
+ | 0.9997 | 38.78 | 7950 | 0.0067 | 0.9872 |
+ | 0.9976 | 38.93 | 7980 | 0.0067 | 0.9872 |
+ | 1.0 | 39.07 | 8010 | 0.0067 | 0.9872 |
+ | 0.9959 | 39.22 | 8040 | 0.0067 | 0.9872 |
+ | 0.9973 | 39.37 | 8070 | 0.0067 | 0.9872 |
+ | 0.9996 | 39.51 | 8100 | 0.0067 | 0.9872 |
+ | 1.0 | 39.66 | 8130 | 0.0067 | 0.9872 |
+ | 0.9986 | 39.8 | 8160 | 0.0067 | 0.9872 |
+ | 0.9999 | 39.95 | 8190 | 0.0067 | 0.9872 |
+
+
+ ### Framework versions
+
+ - Transformers 4.37.1
+ - Pytorch 2.1.2
+ - Datasets 2.16.1
+ - Tokenizers 0.15.1
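The `Iou Kelp` metric reported above is per-class intersection-over-union for the kelp class. A minimal reference sketch over flat binary label lists (illustrative only, not the exact evaluation code used in training):

```python
def iou(pred, target, cls=1):
    """Intersection-over-union of one class over flat label sequences."""
    inter = sum(1 for p, t in zip(pred, target) if p == cls and t == cls)
    union = sum(1 for p, t in zip(pred, target) if p == cls or t == cls)
    return inter / union if union else 0.0

pred   = [0, 1, 1, 0, 1]
target = [0, 1, 0, 0, 1]
print(iou(pred, target))  # 2 overlapping pixels / 3 in the union -> 0.6666666666666666
```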
config.json ADDED
@@ -0,0 +1,69 @@
+ {
+   "architectures": [
+     "SegformerForKelpSemanticSegmentation"
+   ],
+   "attention_probs_dropout_prob": 0.0,
+   "classifier_dropout_prob": 0.1,
+   "decoder_hidden_size": 256,
+   "depths": [
+     2,
+     2,
+     2,
+     2
+   ],
+   "downsampling_rates": [
+     1,
+     4,
+     8,
+     16
+   ],
+   "drop_path_rate": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.0,
+   "hidden_sizes": [
+     64,
+     128,
+     320,
+     512
+   ],
+   "image_size": 350,
+   "initializer_range": 0.02,
+   "layer_norm_eps": 1e-06,
+   "mlp_ratios": [
+     4,
+     4,
+     4,
+     4
+   ],
+   "model_type": "segformer",
+   "num_attention_heads": [
+     1,
+     2,
+     5,
+     8
+   ],
+   "num_channels": 3,
+   "num_encoder_blocks": 4,
+   "patch_sizes": [
+     7,
+     3,
+     3,
+     3
+   ],
+   "reshape_last_stage": true,
+   "semantic_loss_ignore_index": 255,
+   "sr_ratios": [
+     8,
+     4,
+     2,
+     1
+   ],
+   "strides": [
+     4,
+     2,
+     2,
+     2
+   ],
+   "torch_dtype": "float32",
+   "transformers_version": "4.37.1"
+ }
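As a sanity check on the encoder geometry implied by `image_size`, `patch_sizes`, and `strides`, the per-stage feature-map sizes follow the usual strided-convolution formula. A sketch assuming the overlapping patch embeddings pad by `patch_size // 2`, as in the standard SegFormer implementation:

```python
def stage_sizes(image_size, patch_sizes, strides):
    """Spatial size after each encoder stage: floor((h + 2*pad - k) / s) + 1."""
    h = image_size
    sizes = []
    for k, s in zip(patch_sizes, strides):
        h = (h + 2 * (k // 2) - k) // s + 1
        sizes.append(h)
    return sizes

# For this config: 350 px input, patch sizes [7, 3, 3, 3], strides [4, 2, 2, 2].
print(stage_sizes(350, [7, 3, 3, 3], [4, 2, 2, 2]))  # [88, 44, 22, 11]
```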
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6b84152bdd8ba38414e64c3bec20b7ebd52127d8fc5fd8ef4124cd6181a63ff5
+ size 54737376
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bc503e055debf899e065b3abe00f21eccc5e337f419796ed4761cb8504d5e28f
+ size 4792