sanchit-gandhi (HF staff) committed on
Commit 31566ec • 1 Parent(s): 64b847e

End of training

README.md CHANGED
@@ -1,12 +1,16 @@
 ---
 base_model: sanchit-gandhi/Mistral-7B-v0.1-6-layer
 tags:
+- alignment-handbook
+- trl
+- sft
+- generated_from_trainer
 - trl
 - sft
 - alignment-handbook
 - generated_from_trainer
 datasets:
-- generator
+- stingning/ultrachat
 model-index:
 - name: sanchit-gandhi/Mistral-7B-v0.1-6-layer
   results: []
@@ -17,7 +21,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 # sanchit-gandhi/Mistral-7B-v0.1-6-layer
 
-This model is a fine-tuned version of [sanchit-gandhi/Mistral-7B-v0.1-6-layer](https://huggingface.co/sanchit-gandhi/Mistral-7B-v0.1-6-layer) on the generator dataset.
+This model is a fine-tuned version of [sanchit-gandhi/Mistral-7B-v0.1-6-layer](https://huggingface.co/sanchit-gandhi/Mistral-7B-v0.1-6-layer) on the stingning/ultrachat dataset.
 It achieves the following results on the evaluation set:
 - Loss: 1.0042
 
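The updated model card points the dataset metadata at stingning/ultrachat and reports an evaluation loss of 1.0042. As a minimal sketch (not part of this commit), a checkpoint like this can typically be loaded with the standard transformers API; the repo id is taken from the model-index entry above, and the prompt and generation settings are placeholder assumptions:

```python
# Illustrative only -- not part of this commit. Assumes the checkpoint is public
# and loadable with AutoModelForCausalLM (repo id taken from the model-index above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "sanchit-gandhi/Mistral-7B-v0.1-6-layer"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("Question: What is knowledge distillation?\nAnswer:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```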
config.json CHANGED
@@ -21,6 +21,6 @@
   "tie_word_embeddings": false,
   "torch_dtype": "bfloat16",
   "transformers_version": "4.40.1",
-  "use_cache": false,
+  "use_cache": true,
   "vocab_size": 32000
 }
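For context (my reading, not stated in the diff): `use_cache` is commonly set to `false` during training (e.g. with gradient checkpointing) and switched back to `true` for inference, which is consistent with flipping it at the end of training. A small sketch of how the flag shows up at generation time, with the repo id assumed from the model card above:

```python
# Sketch only (not part of the commit): inspect the use_cache flag this change
# sets back to True, and override it per generate() call if needed.
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_id = "sanchit-gandhi/Mistral-7B-v0.1-6-layer"  # assumed from the model-index above
config = AutoConfig.from_pretrained(model_id)
print(config.use_cache)  # True after this commit's config.json change

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Hello", return_tensors="pt")
# With use_cache=True, generate() reuses past key/value states during decoding;
# passing use_cache=False forces attention to be recomputed at every step.
out = model.generate(**inputs, max_new_tokens=32, use_cache=True)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```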
wandb/debug-internal.log CHANGED
@@ -87,3 +87,9 @@
 2024-04-25 23:50:58,539 DEBUG SenderThread:213806 [sender.py:send_request():406] send_request: stop_status
 2024-04-25 23:50:58,539 DEBUG HandlerThread:213806 [handler.py:handle_request():146] handle_request: internal_messages
 2024-04-25 23:50:59,774 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log
+2024-04-25 23:51:01,776 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log
+2024-04-25 23:51:02,336 DEBUG HandlerThread:213806 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:51:03,778 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log
+2024-04-25 23:51:04,779 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log
+2024-04-25 23:51:07,341 DEBUG HandlerThread:213806 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:51:07,782 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log
wandb/run-20240425_235040-eo45cto5/files/output.log CHANGED
@@ -37,3 +37,20 @@ Training completed. Do not forget to share your model on huggingface.co/models =
 [INFO|tokenization_utils_base.py:2488] 2024-04-25 23:50:57,950 >> tokenizer config file saved in ./tokenizer_config.json
 [INFO|tokenization_utils_base.py:2497] 2024-04-25 23:50:57,951 >> Special tokens file saved in ./special_tokens_map.json
 [INFO|modelcard.py:450] 2024-04-25 23:50:57,994 >> Dropping the following result as it does not have all the necessary fields:
+{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}, 'dataset': {'name': 'generator', 'type': 'generator', 'config': 'default', 'split': 'train', 'args': 'default'}}
+events.out.tfevents.1714089039.ip-26-0-167-177.213140.0: 100%|██████████| 5.07k/5.07k [00:00<00:00, 33.9kB/s]
+training_args.bin: 100%|██████████| 4.98k/4.98k [00:00<00:00, 30.1kB/s] | 0.00/5.07k [00:00<?, ?B/s]
+events.out.tfevents.1714089047.ip-26-0-167-177.213140.1: 100%|██████████| 364/364 [00:00<00:00, 2.09kB/s]
+Upload 3 LFS files: 100%|██████████| 3/3 [00:00<00:00, 7.88it/s]███████| 364/364 [00:00<00:00, 3.36kB/s]
+[INFO|modelcard.py:450] 2024-04-25 23:51:02,356 >> Dropping the following result as it does not have all the necessary fields:
+{'task': {'name': 'Causal Language Modeling', 'type': 'text-generation'}, 'dataset': {'name': 'stingning/ultrachat', 'type': 'stingning/ultrachat', 'config': 'default', 'split': 'train', 'args': 'default'}}
+[INFO|configuration_utils.py:471] 2024-04-25 23:51:02,360 >> Configuration saved in ./config.json
+[INFO|trainer.py:3305] 2024-04-25 23:51:02,361 >> Saving model checkpoint to ./
+[INFO|configuration_utils.py:471] 2024-04-25 23:51:02,362 >> Configuration saved in ./config.json
+[INFO|configuration_utils.py:697] 2024-04-25 23:51:02,364 >> Configuration saved in ./generation_config.json
+2024-04-25 23:51:02 - INFO - __main__ - Model saved to ./
+2024-04-25 23:51:02 - INFO - __main__ - Pushing to hub...
+[INFO|modeling_utils.py:2590] 2024-04-25 23:51:07,339 >> Model weights saved in ./model.safetensors
+[INFO|tokenization_utils_base.py:2488] 2024-04-25 23:51:07,342 >> tokenizer config file saved in ./tokenizer_config.json
+[INFO|tokenization_utils_base.py:2497] 2024-04-25 23:51:07,344 >> Special tokens file saved in ./special_tokens_map.json
+[INFO|modelcard.py:450] 2024-04-25 23:51:07,388 >> Dropping the following result as it does not have all the necessary fields:
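The tail of output.log above shows the end-of-training sequence: the configuration, weights, and tokenizer are saved locally, then the checkpoint is pushed to the Hub. As a rough illustration of what a training script typically does at this point (an assumption on my part; the actual script is not part of this commit), using the standard `transformers.Trainer` API:

```python
# Illustrative sketch only -- the actual training script is not in this commit.
# Assumes a `trainer` (transformers.Trainer or trl.SFTTrainer) has already been built and run.
trainer.save_model("./")                  # writes config.json, model.safetensors, generation_config.json
trainer.tokenizer.save_pretrained("./")   # writes tokenizer_config.json, special_tokens_map.json
trainer.push_to_hub()                     # uploads the checkpoint and an auto-generated model card
```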
wandb/run-20240425_235040-eo45cto5/logs/debug-internal.log CHANGED
@@ -87,3 +87,9 @@
 2024-04-25 23:50:58,539 DEBUG SenderThread:213806 [sender.py:send_request():406] send_request: stop_status
 2024-04-25 23:50:58,539 DEBUG HandlerThread:213806 [handler.py:handle_request():146] handle_request: internal_messages
 2024-04-25 23:50:59,774 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log
+2024-04-25 23:51:01,776 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log
+2024-04-25 23:51:02,336 DEBUG HandlerThread:213806 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:51:03,778 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log
+2024-04-25 23:51:04,779 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log
+2024-04-25 23:51:07,341 DEBUG HandlerThread:213806 [handler.py:handle_request():146] handle_request: status_report
+2024-04-25 23:51:07,782 INFO Thread-12 :213806 [dir_watcher.py:_on_file_modified():288] file/dir modified: /fsx/sanchit/distil-zephyr-1.5b-ssft-ultrachat/wandb/run-20240425_235040-eo45cto5/files/output.log