---
license: apache-2.0
base_model: bert-base-multilingual-uncased
tags:
- generated_from_trainer
metrics:
- recall
- accuracy
model-index:
- name: multibert1110_lrate10b16
  results: []
---

# multibert1110_lrate10b16

This model is a fine-tuned version of [bert-base-multilingual-uncased](https://huggingface.co/bert-base-multilingual-uncased) on an unknown dataset.
It achieves the following results on the evaluation set (corresponding to epoch 7 in the results table below):
- Loss: 0.5469
- Precision: 0.8548
- Recall: 0.8081
- F-measure: 0.8287
- Accuracy: 0.9073

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 14

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F-measure | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:---------:|:--------:|
| 0.6155        | 1.0   | 236  | 0.4117          | 0.8520    | 0.6645 | 0.6887    | 0.8660   |
| 0.3587        | 2.0   | 472  | 0.3608          | 0.7877    | 0.7387 | 0.7562    | 0.8864   |
| 0.2315        | 3.0   | 708  | 0.3620          | 0.8962    | 0.7550 | 0.7918    | 0.8977   |
| 0.1551        | 4.0   | 944  | 0.4640          | 0.8523    | 0.7478 | 0.7834    | 0.8931   |
| 0.1117        | 5.0   | 1180 | 0.4567          | 0.8269    | 0.7425 | 0.7712    | 0.8958   |
| 0.0885        | 6.0   | 1416 | 0.4916          | 0.8679    | 0.7882 | 0.8206    | 0.9037   |
| 0.0646        | 7.0   | 1652 | 0.5469          | 0.8548    | 0.8081 | 0.8287    | 0.9073   |
| 0.0385        | 8.0   | 1888 | 0.5638          | 0.8665    | 0.7813 | 0.8064    | 0.8999   |
| 0.0262        | 9.0   | 2124 | 0.5864          | 0.8872    | 0.7415 | 0.7881    | 0.9045   |
| 0.0231        | 10.0  | 2360 | 0.5984          | 0.8577    | 0.7582 | 0.7966    | 0.9017   |
| 0.0114        | 11.0  | 2596 | 0.6513          | 0.8594    | 0.7532 | 0.7930    | 0.9032   |
| 0.0119        | 12.0  | 2832 | 0.6270          | 0.8717    | 0.7591 | 0.8007    | 0.9034   |
| 0.0060        | 13.0  | 3068 | 0.6814          | 0.8733    | 0.7411 | 0.7864    | 0.9034   |
| 0.0041        | 14.0  | 3304 | 0.6782          | 0.8722    | 0.7505 | 0.7943    | 0.9040   |

### Framework versions

- Transformers 4.34.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
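
The card lists the hyperparameters but not the training script. A minimal sketch of a `TrainingArguments` configuration matching those values follows; it is a reconstruction, not the author's original code. The Adam betas and epsilon listed above are the Transformers defaults, so they need no overrides, and the per-epoch evaluation is inferred from the results table.

```python
# Sketch of a TrainingArguments setup matching the hyperparameters above.
# The dataset, model head, and metric computation are not part of this card
# and are omitted here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="multibert1110_lrate10b16",
    learning_rate=1e-4,                  # 0.0001, as listed above
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=14,
    evaluation_strategy="epoch",         # assumption: one evaluation per epoch, per the table
    # optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08 are the
    # Transformers defaults, so no optimizer arguments are set explicitly.
)
```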
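
The task is not documented, but the precision/recall/F-measure metrics alongside token-level accuracy are typical of token classification. Assuming that head, a hypothetical inference sketch with the `pipeline` API (the model id is a placeholder for the actual Hub id or checkpoint path):

```python
# Hypothetical usage sketch; the token-classification task is an assumption.
from transformers import pipeline

classifier = pipeline(
    "token-classification",
    model="multibert1110_lrate10b16",   # placeholder: actual Hub id or local path
    aggregation_strategy="simple",      # merge word pieces into word-level predictions
)
print(classifier("Example input sentence for the fine-tuned model."))
```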