---
library_name: peft
tags:
- generated_from_trainer
base_model: slplab/polyglot-ko-1.3b-pretrained-asd
model-index:
- name: pretrain-asd_w-cot_w-asd_text-features
  results: []
---

# pretrain-asd_w-cot_w-asd_text-features

This model is a PEFT fine-tuned version of [slplab/polyglot-ko-1.3b-pretrained-asd](https://huggingface.co/slplab/polyglot-ko-1.3b-pretrained-asd) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2919

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reconstructing them appears at the end of this card):
- learning_rate: 3e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 0.5331        | 0.1725 | 1000  | 0.3109          |
| 0.3076        | 0.3450 | 2000  | 0.3025          |
| 0.2963        | 0.5174 | 3000  | 0.3007          |
| 0.3008        | 0.6899 | 4000  | 0.2992          |
| 0.2969        | 0.8624 | 5000  | 0.2984          |
| 0.2956        | 1.0349 | 6000  | 0.2977          |
| 0.2943        | 1.2074 | 7000  | 0.3000          |
| 0.2973        | 1.3798 | 8000  | 0.2968          |
| 0.2927        | 1.5523 | 9000  | 0.2953          |
| 0.2949        | 1.7248 | 10000 | 0.2943          |
| 0.2915        | 1.8973 | 11000 | 0.2931          |
| 0.2897        | 2.0698 | 12000 | 0.2937          |
| 0.2885        | 2.2422 | 13000 | 0.2926          |
| 0.2945        | 2.4147 | 14000 | 0.2928          |
| 0.2907        | 2.5872 | 15000 | 0.2923          |
| 0.292         | 2.7597 | 16000 | 0.2922          |
| 0.2899        | 2.9322 | 17000 | 0.2919          |

### Framework versions

- PEFT 0.11.1
- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1
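For reference, here is a minimal sketch of how the hyperparameters above map onto `transformers.TrainingArguments`, consistent with this card's `generated_from_trainer` tag. The `output_dir` is a placeholder, and the real training script may have set additional options not recorded here.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the reported hyperparameters; the actual
# training script is not included with this card.
training_args = TrainingArguments(
    output_dir="pretrain-asd_w-cot_w-asd_text-features",  # placeholder path
    learning_rate=3e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=4,  # 2 per device x 4 steps = total batch size 8
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    fp16=True,  # "Native AMP" mixed-precision training
)
```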
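Because this repository contains a PEFT adapter rather than standalone weights, inference typically means attaching the adapter to the base model. A minimal loading sketch, assuming the adapter is published under a repo id matching this card's name (substitute the actual path):

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "slplab/polyglot-ko-1.3b-pretrained-asd"
# Hypothetical adapter repo id, inferred from this card's title; adjust as needed.
adapter_id = "slplab/pretrain-asd_w-cot_w-asd_text-features"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)

# Attach the trained adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()
```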