results

This model is a fine-tuned version of HooshvareLab/bert-fa-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.7347
  • Precision: 0.5347
  • Recall: 0.4718
  • F1: 0.4704
  • Accuracy: 0.4718
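
No usage example is documented. As a starting point, the checkpoint can be loaded like any Transformers model; the sketch below assumes the hub repository id Rasooli/results and a token-classification head, neither of which is confirmed by this card (swap in AutoModelForSequenceClassification if the model was trained for sentence-level labels).

```python
# Hypothetical loading sketch; the repository id and the task head are assumptions.
from transformers import AutoModelForTokenClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Rasooli/results")
model = AutoModelForTokenClassification.from_pretrained("Rasooli/results")

# Persian example input ("This is a test sentence."), since the base model
# HooshvareLab/bert-fa-base-uncased is a Persian BERT.
inputs = tokenizer("این یک جمله آزمایشی است.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch_size, sequence_length, num_labels)
```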

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 3
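
The training script itself is not included with this card; the sketch below shows one way these values map onto the Hugging Face TrainingArguments API. The output_dir and the 10-step evaluation/logging cadence are assumptions inferred from the card name and the log below, not documented settings.

```python
# Hypothetical reconstruction of the listed hyperparameters; only the values
# named above come from the card, everything else is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="results",          # matches the card name; assumed
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=3,
    eval_strategy="steps",         # the log below evaluates every 10 steps
    eval_steps=10,
    logging_steps=10,
)
```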

Training results

Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy
2.0985 0.0261 10 2.0359 0.1658 0.0730 0.0372 0.0730
2.0739 0.0522 20 1.9996 0.1472 0.0904 0.0635 0.0904
2.0404 0.0783 30 1.9585 0.1803 0.1434 0.1169 0.1434
1.9715 0.1044 40 1.9330 0.2206 0.1798 0.1338 0.1798
1.8596 0.1305 50 1.9552 0.2684 0.1738 0.0824 0.1738
1.8302 0.1567 60 2.0219 0.3429 0.1685 0.0516 0.1685
1.8838 0.1828 70 2.0038 0.1478 0.1677 0.0502 0.1677
1.9153 0.2089 80 1.9334 0.1546 0.1764 0.0823 0.1764
1.839 0.2350 90 1.9126 0.2046 0.1842 0.1002 0.1842
1.8358 0.2611 100 1.8918 0.2365 0.1972 0.1159 0.1972
1.8559 0.2872 110 1.8925 0.2209 0.2068 0.1269 0.2068
1.7707 0.3133 120 1.8970 0.2445 0.1833 0.1001 0.1833
1.7514 0.3394 130 1.9215 0.3953 0.1825 0.0943 0.1825
1.7569 0.3655 140 1.9472 0.2027 0.1746 0.0708 0.1746
1.7906 0.3916 150 1.8767 0.4575 0.2320 0.1791 0.2320
1.6752 0.4178 160 1.9244 0.4945 0.1885 0.0895 0.1885
1.7293 0.4439 170 1.8418 0.3536 0.2606 0.2013 0.2606
1.6713 0.4700 180 1.7744 0.4128 0.2702 0.2311 0.2702
1.5645 0.4961 190 1.7981 0.3822 0.2407 0.1775 0.2407
1.6074 0.5222 200 1.7513 0.4290 0.2789 0.2311 0.2789
1.4986 0.5483 210 1.7598 0.5202 0.2424 0.1861 0.2424
1.6157 0.5744 220 1.7453 0.4631 0.2798 0.2366 0.2798
1.4205 0.6005 230 1.6524 0.4198 0.3527 0.3373 0.3527
1.4854 0.6266 240 1.6375 0.4522 0.3484 0.3230 0.3484
1.4207 0.6527 250 1.6410 0.4348 0.3579 0.3279 0.3579
1.2455 0.6789 260 1.6365 0.4472 0.3588 0.3092 0.3588
1.3996 0.7050 270 1.5261 0.5027 0.4275 0.4212 0.4275
1.3084 0.7311 280 1.5914 0.4964 0.3831 0.3707 0.3831
1.3386 0.7572 290 1.5884 0.4888 0.3858 0.3633 0.3858
1.4334 0.7833 300 1.5438 0.4418 0.4231 0.4170 0.4231
1.3354 0.8094 310 1.6510 0.5115 0.3788 0.3471 0.3788
1.364 0.8355 320 1.6162 0.4985 0.3805 0.3747 0.3805
1.2291 0.8616 330 1.5523 0.4596 0.4057 0.4056 0.4057
1.2571 0.8877 340 1.5834 0.5378 0.4014 0.3990 0.4014
1.392 0.9138 350 1.4810 0.5012 0.4448 0.4413 0.4448
1.3909 0.9399 360 1.5218 0.5046 0.4301 0.4271 0.4301
1.2083 0.9661 370 1.5714 0.5127 0.4101 0.4013 0.4101
1.1827 0.9922 380 1.5607 0.5365 0.4196 0.4181 0.4196
1.2544 1.0183 390 1.4977 0.4942 0.4440 0.4392 0.4440
1.0718 1.0444 400 1.5737 0.5124 0.4257 0.4239 0.4257
1.1034 1.0705 410 1.5629 0.5218 0.4162 0.4128 0.4162
1.1171 1.0966 420 1.5049 0.4958 0.4718 0.4702 0.4718
1.1174 1.1227 430 1.5840 0.5175 0.4057 0.4019 0.4057
1.2966 1.1488 440 1.5740 0.5178 0.4214 0.4214 0.4214
1.0597 1.1749 450 1.7422 0.5221 0.3944 0.3808 0.3944
1.027 1.2010 460 1.5282 0.4853 0.4509 0.4457 0.4509
1.0327 1.2272 470 1.6277 0.4810 0.4005 0.3922 0.4005
1.127 1.2533 480 1.6321 0.4847 0.4275 0.4238 0.4275
1.1265 1.2794 490 1.6081 0.4854 0.4257 0.4148 0.4257
1.0853 1.3055 500 1.7379 0.4871 0.3884 0.3697 0.3884
1.1961 1.3316 510 1.6069 0.5028 0.4361 0.4182 0.4361
1.0534 1.3577 520 1.4849 0.5123 0.4831 0.4745 0.4831
1.1954 1.3838 530 1.6723 0.5260 0.4205 0.4078 0.4205
1.28 1.4099 540 1.8150 0.5381 0.3614 0.3311 0.3614
1.122 1.4360 550 1.4803 0.5268 0.4761 0.4738 0.4761
1.1675 1.4621 560 1.6255 0.5431 0.4170 0.4105 0.4170
1.1381 1.4883 570 1.5229 0.5410 0.4500 0.4285 0.4500
1.1103 1.5144 580 1.5931 0.5449 0.4526 0.4387 0.4526
1.0581 1.5405 590 1.5439 0.5312 0.4596 0.4504 0.4596
0.9962 1.5666 600 1.5441 0.5339 0.4579 0.4452 0.4579
1.0863 1.5927 610 1.5504 0.5364 0.4761 0.4578 0.4761
1.0893 1.6188 620 1.5631 0.5224 0.4770 0.4606 0.4770
1.1396 1.6449 630 1.5557 0.5045 0.4500 0.4469 0.4500
1.0648 1.6710 640 1.6417 0.5462 0.4431 0.4336 0.4431
1.2972 1.6971 650 1.6543 0.5509 0.4431 0.4206 0.4431
1.1413 1.7232 660 1.5779 0.5438 0.4440 0.4400 0.4440
1.076 1.7493 670 1.4805 0.5208 0.4666 0.4682 0.4666
1.1984 1.7755 680 1.5434 0.5126 0.4518 0.4482 0.4518
0.9841 1.8016 690 1.4483 0.5229 0.4865 0.4869 0.4865
1.235 1.8277 700 1.4452 0.5239 0.4935 0.4935 0.4935
1.0239 1.8538 710 1.5506 0.5414 0.4466 0.4395 0.4466
0.9993 1.8799 720 1.5191 0.5388 0.4579 0.4521 0.4579
0.8789 1.9060 730 1.5620 0.5662 0.4509 0.4497 0.4509
0.9412 1.9321 740 1.4985 0.5489 0.4726 0.4623 0.4726
1.0592 1.9582 750 1.5027 0.5366 0.4700 0.4609 0.4700
0.9971 1.9843 760 1.4782 0.5427 0.4726 0.4591 0.4726
0.9067 2.0104 770 1.4520 0.5386 0.4831 0.4790 0.4831
0.7288 2.0366 780 1.6074 0.5414 0.4474 0.4518 0.4474
0.7942 2.0627 790 1.4652 0.5256 0.4961 0.4964 0.4961
0.56 2.0888 800 1.4838 0.5312 0.4996 0.5013 0.4996
0.6195 2.1149 810 1.6563 0.5676 0.4692 0.4506 0.4692
0.6324 2.1410 820 1.7346 0.5614 0.4657 0.4666 0.4657
0.5347 2.1671 830 1.5751 0.5405 0.5065 0.5045 0.5065
0.5954 2.1932 840 1.6409 0.5521 0.4900 0.4878 0.4900
0.5179 2.2193 850 1.6171 0.5450 0.5004 0.4995 0.5004
0.5723 2.2454 860 1.6798 0.5494 0.4874 0.4861 0.4874
0.6294 2.2715 870 1.6615 0.5341 0.4857 0.4872 0.4857
0.6877 2.2977 880 1.6713 0.5305 0.4839 0.4837 0.4839
0.6666 2.3238 890 1.7254 0.5381 0.4744 0.4715 0.4744
0.6233 2.3499 900 1.6712 0.5264 0.4831 0.4805 0.4831
0.545 2.3760 910 1.6675 0.5309 0.4839 0.4808 0.4839
0.6514 2.4021 920 1.7287 0.5382 0.4692 0.4695 0.4692
0.6389 2.4282 930 1.6598 0.5237 0.4761 0.4724 0.4761
0.6108 2.4543 940 1.6726 0.5232 0.4761 0.4678 0.4761
0.6409 2.4804 950 1.6736 0.5368 0.4848 0.4782 0.4848
0.4708 2.5065 960 1.7309 0.5504 0.4787 0.4760 0.4787
0.6782 2.5326 970 1.6217 0.5280 0.4805 0.4760 0.4805
0.514 2.5587 980 1.6088 0.5196 0.4839 0.4825 0.4839
0.5716 2.5849 990 1.6967 0.5361 0.4787 0.4780 0.4787
0.5028 2.6110 1000 1.7347 0.5347 0.4718 0.4704 0.4718
0.487 2.6371 1010 1.7448 0.5275 0.4666 0.4562 0.4666
0.5283 2.6632 1020 1.7680 0.5380 0.4709 0.4567 0.4709
0.467 2.6893 1030 1.7712 0.5476 0.4735 0.4638 0.4735
0.6161 2.7154 1040 1.6711 0.5423 0.4952 0.4901 0.4952
0.5924 2.7415 1050 1.5968 0.5343 0.5056 0.5035 0.5056
0.5925 2.7676 1060 1.6077 0.5273 0.4909 0.4867 0.4909
0.5044 2.7937 1070 1.6327 0.5390 0.4917 0.4889 0.4917
0.5258 2.8198 1080 1.6310 0.5353 0.4909 0.4882 0.4909
0.6329 2.8460 1090 1.6199 0.5271 0.4865 0.4837 0.4865
0.5266 2.8721 1100 1.6065 0.5215 0.4865 0.4848 0.4865
0.5093 2.8982 1110 1.6174 0.5232 0.4874 0.4854 0.4874
0.6284 2.9243 1120 1.6325 0.5271 0.4874 0.4851 0.4874
0.4167 2.9504 1130 1.6336 0.5274 0.4865 0.4846 0.4865
0.4789 2.9765 1140 1.6295 0.5266 0.4857 0.4836 0.4857
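
Recall and Accuracy coincide in every evaluation row above, which is the signature of weighted-average recall (weighted recall reduces algebraically to overall accuracy). The exact metric computation is not documented on this card; the sketch below only illustrates weighted-average metrics with scikit-learn, on placeholder labels.

```python
# Illustration only: weighted-average precision/recall/F1 plus accuracy.
# The labels here are placeholders, not data from the card's evaluation set.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

y_true = [0, 0, 1, 2, 2, 2]
y_pred = [0, 1, 1, 2, 2, 0]

precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted", zero_division=0
)
accuracy = accuracy_score(y_true, y_pred)

# Weighted recall equals accuracy by construction, matching the table above.
print(f"P={precision:.4f} R={recall:.4f} F1={f1:.4f} Acc={accuracy:.4f}")
```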

Framework versions

  • Transformers 4.41.2
  • PyTorch 2.3.0+cu121
  • Tokenizers 0.19.1
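
As an optional convenience (not part of the original card), the installed versions can be checked against the list above:

```python
# Optional environment check against the framework versions listed above.
import tokenizers
import torch
import transformers

print(transformers.__version__)  # expected: 4.41.2
print(torch.__version__)         # expected: 2.3.0+cu121
print(tokenizers.__version__)    # expected: 0.19.1
```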

Model size: 163M params (Safetensors, F32)