---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-sidewalk-2
  results: []
---

# segformer-b0-finetuned-segments-sidewalk-2

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5568
- Mean Iou: 0.1429
- Mean Accuracy: 0.1909
- Overall Accuracy: 0.7302
- Per Category Iou: [nan, 0.4939249651377763, 0.7719350693388762, 0.0, 0.03491527266588522, 0.0007851043658245269, nan, 0.0, 0.0, 0.0, 0.5957492947164502, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5578896542272563, 0.0, 8.731498772678703e-06, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.763823679694206, 0.5627622811191442, 0.7914659567091414, 0.0, 0.0, 3.4412391213828277e-06, 0.0]
- Per Category Accuracy: [nan, 0.8311952182095418, 0.9317484161484766, 0.0, 0.03491984657702897, 0.0007870032398194515, nan, 0.0, 0.0, 0.0, 0.8874888349993965, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8733272803927883, 0.0, 8.732022947756307e-06, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.9141159811565875, 0.788047139121296, 0.8472913943123015, 0.0, 0.0, 3.4413693897075525e-06, 0.0]

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
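The `generated_from_trainer` tag indicates the run used the Hugging Face `Trainer`. The sketch below shows how the hyperparameters above would map onto a `TrainingArguments` configuration; the label mapping, evaluation cadence, and output directory are illustrative assumptions, not values recovered from the original run.

```python
from transformers import SegformerForSemanticSegmentation, TrainingArguments

# Placeholder label mapping -- the per-category metric arrays above have 35
# entries, so a 35-class label set is assumed here purely for illustration.
id2label = {i: f"class_{i}" for i in range(35)}
label2id = {name: idx for idx, name in id2label.items()}

# Load the nvidia/mit-b0 encoder with a freshly initialized segmentation head.
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=len(id2label),
    id2label=id2label,
    label2id=label2id,
)

# Mirrors the hyperparameters listed above; the Adam betas/epsilon and the
# linear schedule in the card match the Trainer defaults.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-sidewalk-2",  # assumed path
    learning_rate=6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    evaluation_strategy="steps",  # assumption: eval every 100 steps, as in the results table
    eval_steps=100,
    logging_steps=100,
)
```

With prepared train/eval datasets and a mean-IoU `compute_metrics` function, these arguments would be passed to `Trainer(...)` and `trainer.train()` would produce a schedule like the one in the results table below.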
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 2.1501        | 0.5   | 100  | 1.7902          | 0.1355   | 0.1841        | 0.7104           | [nan, 0.45744119291698754, 0.7571272493181429, 0.0, 0.00033640367932780314, 0.0003751385454855486, nan, 1.1301520807983395e-05, 0.0, 0.0, 0.603134432234208, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5277902723725074, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.7525759481063863, 0.5102277252814887, 0.7259971515863731, 0.0, 0.0, 0.0, 0.0] | [nan, 0.8342415185752861, 0.8927152239695044, 0.0, 0.00033640367932780314, 0.00037609004380752546, nan, 1.1301659177747787e-05, 0.0, 0.0, 0.8649657094576059, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.9071099090067092, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.9070157155961565, 0.719362898131699, 0.7640401968695343, 0.0, 0.0, 0.0, 0.0] |
| 1.5623        | 1.0   | 200  | 1.5568          | 0.1429   | 0.1909        | 0.7302           | [nan, 0.4939249651377763, 0.7719350693388762, 0.0, 0.03491527266588522, 0.0007851043658245269, nan, 0.0, 0.0, 0.0, 0.5957492947164502, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5578896542272563, 0.0, 8.731498772678703e-06, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.763823679694206, 0.5627622811191442, 0.7914659567091414, 0.0, 0.0, 3.4412391213828277e-06, 0.0] | [nan, 0.8311952182095418, 0.9317484161484766, 0.0, 0.03491984657702897, 0.0007870032398194515, nan, 0.0, 0.0, 0.0, 0.8874888349993965, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8733272803927883, 0.0, 8.732022947756307e-06, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.9141159811565875, 0.788047139121296, 0.8472913943123015, 0.0, 0.0, 3.4413693897075525e-06, 0.0] |

### Framework versions

- Transformers 4.21.1
- Pytorch 1.12.1
- Datasets 2.4.0
- Tokenizers 0.12.1
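### How to use

A minimal inference sketch is shown below. The repository id and input image are placeholders, and it is assumed the preprocessing config was pushed alongside the weights; `SegformerFeatureExtractor` matches the Transformers 4.21.1 API listed above (newer releases expose the same functionality as `SegformerImageProcessor`).

```python
import torch
from PIL import Image
from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation

# Assumed Hub repository id for this checkpoint; replace with the actual repo.
checkpoint = "segformer-b0-finetuned-segments-sidewalk-2"

feature_extractor = SegformerFeatureExtractor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("sidewalk_scene.jpg").convert("RGB")  # hypothetical input image
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Upsample the logits to the original resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_seg = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class indices
```

Upsampling the quarter-resolution logits back to the input size before the argmax is the standard SegFormer post-processing step.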