v1
- .gitattributes +1 -0
- act_dict.json +3 -0
- config.json +3 -0
- default_qcfg.json +3 -0
- generation_config.json +3 -0
- log_rank0_1724147667.txt +66 -0
- merges.txt +0 -0
- pytorch_model-00001-of-00002.bin +3 -0
- pytorch_model-00002-of-00002.bin +3 -0
- pytorch_model.bin.index.json +3 -0
- results.json +3 -0
- special_tokens_map.json +3 -0
- tokenizer_config.json +3 -0
- vocab.json +3 -0
.gitattributes
CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+*.json filter=lfs diff=lfs merge=lfs -text
act_dict.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f0d9cdc4255ff80e9088b357ee939b34554d3c49e7dcd16cf5c2df581749a60f
+size 58017
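Because of the `*.json` rule added to .gitattributes above, these JSON files are checked in as Git LFS pointer files with the three-line `version` / `oid` / `size` format shown. A minimal sketch of parsing such a pointer (the function name is my own; it assumes exactly the format shown above):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields.

    Expects the three-line format:
        version <spec-url>
        oid sha256:<hex digest>
        size <bytes>
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    # Split the oid into algorithm and hex digest for convenience.
    algo, _, digest = fields["oid"].partition(":")
    fields["oid_algo"], fields["oid_hex"] = algo, digest
    fields["size"] = int(fields["size"])
    return fields

# The act_dict.json pointer from this commit:
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:f0d9cdc4255ff80e9088b357ee939b34554d3c49e7dcd16cf5c2df581749a60f
size 58017
"""
info = parse_lfs_pointer(pointer)
print(info["oid_algo"], info["size"])
```

The `size` field is the byte length of the real object, so the pointers below tell you the actual file sizes (e.g. the two weight shards are ~5.0 GB and ~1.6 GB) without downloading them.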
config.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:21db0c5d6ee36ab9fc2518d9e19d5e23fb1af96b49680271f0b74abc7d4fbaf6
+size 1297
default_qcfg.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:023c671e427b83b2e34f8958bd5a0c7ffccf89bd178b652115f8ef337029796d
+size 153506
generation_config.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:8eb4f4781119e7385a3c609ae370dbd37f5c205061c617038319b84a3e121f7a
+size 121
log_rank0_1724147667.txt
ADDED
@@ -0,0 +1,66 @@
+[2024-08-20 09:54:27 root] (mobilequant.py 132): INFO Namespace(hf_path='checkpoints/hfmodels/stablelm-2-1_6b', dtype='float32', output_dir='results/stablelm-2-1_6b-e2e-w4a8-s1024-e60', cache_dir='./cache', resume=None, calib_dataset='pile', nsamples=1024, seqlen=2048, act_dict_path='checkpoints/hfmodels/stablelm-2-1_6b/act_dict.json', override_qcfg_path='checkpoints/hfmodels/stablelm-2-1_6b/default_qcfg.json', weight_bitwidth=4, weight_group_size=-1, weight_is_per_channel=True, weight_is_symmetric=False, weight_is_dynamic=False, act_bitwidth=8, act_group_size=-1, act_is_per_channel=False, act_is_symmetric=False, act_is_dynamic=False, let=True, lwc=True, lrl=True, let_lr=0.001, lwc_lr=0.005, lrl_lr=1e-06, let_min_lr=0.0005, lwc_min_lr=0.001, lrl_min_lr=1e-07, wd=0, epochs=60, warmup_epochs=0, use_shift=False, aug_loss=False, deactive_amp=True, batch_size=1, num_fewshot=0, tasks='wikitext', mode='e2e', original_omniquant=False, cache_in_gpu=False, use_8bit_softmax_input=False, use_8bit_softmax_output=False, model_family='stablelm')
+[2024-08-20 09:54:41 root] (mobilequant.py 218): INFO === start quantization ===
+[2024-08-20 09:54:51 root] (algorithm.py 588): INFO Starting ...
+[2024-08-20 11:20:56 root] (algorithm.py 759): INFO Epoch 0 loss:16.338321685791016 norm:51798.05078125 max memory_allocated 20622.3701171875
+[2024-08-20 12:36:04 root] (algorithm.py 759): INFO Epoch 1 loss:10.52186393737793 norm:1675.00537109375 max memory_allocated 20622.5283203125
+[2024-08-20 13:51:03 root] (algorithm.py 759): INFO Epoch 2 loss:8.82116413116455 norm:705.8970947265625 max memory_allocated 20622.5283203125
+[2024-08-20 15:05:58 root] (algorithm.py 759): INFO Epoch 3 loss:7.890827655792236 norm:326.4710693359375 max memory_allocated 20622.5283203125
+[2024-08-20 16:20:58 root] (algorithm.py 759): INFO Epoch 4 loss:7.310484886169434 norm:229.57003784179688 max memory_allocated 20622.5283203125
+[2024-08-20 17:35:57 root] (algorithm.py 759): INFO Epoch 5 loss:6.838724136352539 norm:193.25379943847656 max memory_allocated 20622.5283203125
+[2024-08-20 18:50:53 root] (algorithm.py 759): INFO Epoch 6 loss:6.561842918395996 norm:207.46063232421875 max memory_allocated 20622.5283203125
+[2024-08-20 20:05:46 root] (algorithm.py 759): INFO Epoch 7 loss:6.334963798522949 norm:203.60311889648438 max memory_allocated 20622.5283203125
+[2024-08-20 21:20:41 root] (algorithm.py 759): INFO Epoch 8 loss:6.098388671875 norm:219.23358154296875 max memory_allocated 20622.5283203125
+[2024-08-20 22:35:35 root] (algorithm.py 759): INFO Epoch 9 loss:5.938597202301025 norm:198.63034057617188 max memory_allocated 20622.5283203125
+[2024-08-20 23:50:31 root] (algorithm.py 759): INFO Epoch 10 loss:5.798902988433838 norm:214.7950897216797 max memory_allocated 20622.5283203125
+[2024-08-21 01:05:25 root] (algorithm.py 759): INFO Epoch 11 loss:5.690804958343506 norm:285.5950622558594 max memory_allocated 20622.5283203125
+[2024-08-21 02:20:23 root] (algorithm.py 759): INFO Epoch 12 loss:5.543519496917725 norm:330.52496337890625 max memory_allocated 20622.5283203125
+[2024-08-21 03:35:18 root] (algorithm.py 759): INFO Epoch 13 loss:5.482985973358154 norm:288.53997802734375 max memory_allocated 20622.5283203125
+[2024-08-21 04:50:15 root] (algorithm.py 759): INFO Epoch 14 loss:5.3748273849487305 norm:256.6081848144531 max memory_allocated 20622.5283203125
+[2024-08-21 06:05:15 root] (algorithm.py 759): INFO Epoch 15 loss:5.294487953186035 norm:257.7286376953125 max memory_allocated 20622.5283203125
+[2024-08-21 07:20:13 root] (algorithm.py 759): INFO Epoch 16 loss:5.192619323730469 norm:256.7176208496094 max memory_allocated 20622.5283203125
+[2024-08-21 08:35:09 root] (algorithm.py 759): INFO Epoch 17 loss:5.146603584289551 norm:269.418701171875 max memory_allocated 20622.5283203125
+[2024-08-21 09:50:05 root] (algorithm.py 759): INFO Epoch 18 loss:5.106363773345947 norm:398.9476623535156 max memory_allocated 20622.5283203125
+[2024-08-21 11:05:02 root] (algorithm.py 759): INFO Epoch 19 loss:5.064048767089844 norm:260.1558837890625 max memory_allocated 20622.5283203125
+[2024-08-21 12:19:57 root] (algorithm.py 759): INFO Epoch 20 loss:5.037179946899414 norm:253.26937866210938 max memory_allocated 20622.5283203125
+[2024-08-21 13:34:53 root] (algorithm.py 759): INFO Epoch 21 loss:4.980165004730225 norm:463.13140869140625 max memory_allocated 20622.5283203125
+[2024-08-21 14:49:48 root] (algorithm.py 759): INFO Epoch 22 loss:4.960757255554199 norm:311.7716979980469 max memory_allocated 20622.5283203125
+[2024-08-21 16:04:46 root] (algorithm.py 759): INFO Epoch 23 loss:4.9068498611450195 norm:259.2088623046875 max memory_allocated 20622.5283203125
+[2024-08-21 17:19:42 root] (algorithm.py 759): INFO Epoch 24 loss:4.883824348449707 norm:248.73025512695312 max memory_allocated 20622.5283203125
+[2024-08-21 18:34:37 root] (algorithm.py 759): INFO Epoch 25 loss:4.853567600250244 norm:239.97923278808594 max memory_allocated 20622.5283203125
+[2024-08-21 19:49:41 root] (algorithm.py 759): INFO Epoch 26 loss:4.816624641418457 norm:256.3294677734375 max memory_allocated 20622.5283203125
+[2024-08-21 21:04:38 root] (algorithm.py 759): INFO Epoch 27 loss:4.791512489318848 norm:252.91900634765625 max memory_allocated 20622.5283203125
+[2024-08-21 22:19:35 root] (algorithm.py 759): INFO Epoch 28 loss:4.753429412841797 norm:254.8272705078125 max memory_allocated 20622.5283203125
+[2024-08-21 23:34:31 root] (algorithm.py 759): INFO Epoch 29 loss:4.717947006225586 norm:251.19081115722656 max memory_allocated 20622.5283203125
+[2024-08-22 00:49:28 root] (algorithm.py 759): INFO Epoch 30 loss:4.684601783752441 norm:266.73040771484375 max memory_allocated 20622.5283203125
+[2024-08-22 02:04:24 root] (algorithm.py 759): INFO Epoch 31 loss:4.684401988983154 norm:300.70233154296875 max memory_allocated 20622.5283203125
+[2024-08-22 03:19:20 root] (algorithm.py 759): INFO Epoch 32 loss:4.659708023071289 norm:288.23919677734375 max memory_allocated 20622.5283203125
+[2024-08-22 04:34:18 root] (algorithm.py 759): INFO Epoch 33 loss:4.6803059577941895 norm:282.48272705078125 max memory_allocated 20622.5283203125
+[2024-08-22 05:49:14 root] (algorithm.py 759): INFO Epoch 34 loss:4.659721374511719 norm:304.5974426269531 max memory_allocated 20622.5283203125
+[2024-08-22 07:04:10 root] (algorithm.py 759): INFO Epoch 35 loss:4.620914936065674 norm:322.1010437011719 max memory_allocated 20622.5283203125
+[2024-08-22 08:19:09 root] (algorithm.py 759): INFO Epoch 36 loss:4.597235679626465 norm:324.0323791503906 max memory_allocated 20622.5283203125
+[2024-08-22 09:34:06 root] (algorithm.py 759): INFO Epoch 37 loss:4.583020210266113 norm:360.5958251953125 max memory_allocated 20622.5283203125
+[2024-08-22 10:49:06 root] (algorithm.py 759): INFO Epoch 38 loss:4.5779595375061035 norm:306.8477783203125 max memory_allocated 20622.5283203125
+[2024-08-22 12:04:03 root] (algorithm.py 759): INFO Epoch 39 loss:4.5875349044799805 norm:347.96331787109375 max memory_allocated 20622.5283203125
+[2024-08-22 13:19:00 root] (algorithm.py 759): INFO Epoch 40 loss:4.5766520500183105 norm:318.9112548828125 max memory_allocated 20622.5283203125
+[2024-08-22 14:33:56 root] (algorithm.py 759): INFO Epoch 41 loss:4.554592609405518 norm:333.951904296875 max memory_allocated 20622.5283203125
+[2024-08-22 15:48:51 root] (algorithm.py 759): INFO Epoch 42 loss:4.543224334716797 norm:326.85931396484375 max memory_allocated 20622.5283203125
+[2024-08-22 17:03:46 root] (algorithm.py 759): INFO Epoch 43 loss:4.5428314208984375 norm:443.6500549316406 max memory_allocated 20622.5283203125
+[2024-08-22 18:18:43 root] (algorithm.py 759): INFO Epoch 44 loss:4.520886421203613 norm:340.5072021484375 max memory_allocated 20622.5283203125
+[2024-08-22 19:33:39 root] (algorithm.py 759): INFO Epoch 45 loss:4.519357204437256 norm:346.5722961425781 max memory_allocated 20622.5283203125
+[2024-08-22 20:48:37 root] (algorithm.py 759): INFO Epoch 46 loss:4.516678810119629 norm:354.92498779296875 max memory_allocated 20622.5283203125
+[2024-08-22 22:03:34 root] (algorithm.py 759): INFO Epoch 47 loss:4.5128278732299805 norm:312.1018981933594 max memory_allocated 20622.5283203125
+[2024-08-22 23:18:29 root] (algorithm.py 759): INFO Epoch 48 loss:4.495702266693115 norm:314.8588562011719 max memory_allocated 20622.5283203125
+[2024-08-23 00:33:32 root] (algorithm.py 759): INFO Epoch 49 loss:4.470466136932373 norm:272.0577392578125 max memory_allocated 20622.5283203125
+[2024-08-23 01:48:29 root] (algorithm.py 759): INFO Epoch 50 loss:4.4552998542785645 norm:254.8807830810547 max memory_allocated 20622.5283203125
+[2024-08-23 03:03:25 root] (algorithm.py 759): INFO Epoch 51 loss:4.4674153327941895 norm:265.9458923339844 max memory_allocated 20622.5283203125
+[2024-08-23 04:18:24 root] (algorithm.py 759): INFO Epoch 52 loss:4.474529266357422 norm:315.57940673828125 max memory_allocated 20622.5283203125
+[2024-08-23 05:33:25 root] (algorithm.py 759): INFO Epoch 53 loss:4.463663578033447 norm:268.48394775390625 max memory_allocated 20622.5283203125
+[2024-08-23 06:48:22 root] (algorithm.py 759): INFO Epoch 54 loss:4.44440221786499 norm:257.33636474609375 max memory_allocated 20622.5283203125
+[2024-08-23 08:03:28 root] (algorithm.py 759): INFO Epoch 55 loss:4.440710067749023 norm:262.2379150390625 max memory_allocated 20622.5283203125
+[2024-08-23 09:18:24 root] (algorithm.py 759): INFO Epoch 56 loss:4.4348883628845215 norm:262.80181884765625 max memory_allocated 20622.5283203125
+[2024-08-23 10:33:21 root] (algorithm.py 759): INFO Epoch 57 loss:4.438338756561279 norm:269.5920104980469 max memory_allocated 20622.5283203125
+[2024-08-23 11:48:19 root] (algorithm.py 759): INFO Epoch 58 loss:4.458076477050781 norm:964.7642211914062 max memory_allocated 20622.5283203125
+[2024-08-23 13:03:15 root] (algorithm.py 759): INFO Epoch 59 loss:4.440016746520996 norm:293.9288635253906 max memory_allocated 20622.5283203125
+[2024-08-23 13:03:17 root] (mobilequant.py 233): INFO 270515.80507063866
+[2024-08-23 13:03:18 huggingface_hub.repocard] (repocard.py 107): WARNING Repo card metadata block was not found. Setting CardData to empty.
+[2024-08-23 13:08:27 root] (mobilequant.py 110): INFO {'results': {'wikitext': {'word_perplexity': 33.547321024106495, 'byte_perplexity': 1.9288819242711195, 'bits_per_byte': 0.9477648320950163}}, 'versions': {'wikitext': 1}, 'config': {'model': None, 'model_args': None, 'num_fewshot': 0, 'batch_size': 1, 'batch_sizes': [], 'device': None, 'no_cache': True, 'limit': None, 'bootstrap_iters': 100000, 'description_dict': None}}
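The Namespace in the log describes the weight-quantization setup of this W4A8 run: `weight_bitwidth=4`, `weight_is_per_channel=True`, `weight_is_symmetric=False`, `weight_is_dynamic=False`. The following NumPy sketch illustrates what asymmetric per-channel 4-bit fake quantization of a weight matrix means in general; it is an illustration of the scheme those flags name, not MobileQuant's actual implementation:

```python
import numpy as np

def fake_quant_weight(w: np.ndarray, bits: int = 4) -> np.ndarray:
    """Asymmetric, static, per-channel (per output row) fake quantization.

    Each row gets its own scale and zero point derived from its min/max,
    values are rounded to the 4-bit integer grid [0, 15], then dequantized.
    """
    qmax = 2 ** bits - 1  # 15 levels above zero for 4-bit
    w_min = w.min(axis=1, keepdims=True)
    w_max = w.max(axis=1, keepdims=True)
    scale = np.maximum(w_max - w_min, 1e-8) / qmax   # asymmetric range
    zero_point = np.round(-w_min / scale)            # integer zero point
    q = np.clip(np.round(w / scale + zero_point), 0, qmax)
    return (q - zero_point) * scale                  # dequantized weights

rng = np.random.default_rng(0)
w = rng.normal(size=(8, 16)).astype(np.float32)
w_dq = fake_quant_weight(w)
print(float(np.abs(w - w_dq).max()))  # per-row error bounded by its scale
```

Per-channel asymmetric quantization is what makes 4-bit weights workable here: each output channel's scale adapts to that channel's own range instead of the whole tensor's.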
merges.txt
ADDED
The diff for this file is too large to render.
pytorch_model-00001-of-00002.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d1934287496fd712797f2477e399a00e17eb7dbad64be145af7a5db79f5e71dd
+size 4985361128
pytorch_model-00002-of-00002.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3e43dd3866f5c76574fb2add23a362165c436b060e932c54c87412877042e0b0
+size 1594326057
pytorch_model.bin.index.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0f9e8912b07638ca134ef64aac3496db0a310d543464d2448143870234282f04
+size 34462
results.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1c6ba0c86ef23d4165f58d8fffc0a1a61296a202bb77313063603135a4c04eec
+size 579
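results.json itself is checked in as an LFS pointer, but the final line of the training log above prints the same WikiText evaluation: word_perplexity ≈ 33.55, byte_perplexity ≈ 1.929, bits_per_byte ≈ 0.9478. Assuming the evaluation harness defines bits-per-byte as log2 of byte perplexity (i.e. ppl_byte = 2^bpb), the two logged numbers are consistent with each other, which is a quick sanity check on the run:

```python
import math

# Values from the final evaluation line in the log.
bits_per_byte = 0.9477648320950163
byte_perplexity = 1.9288819242711195

# Assumed relationship: byte_perplexity = 2 ** bits_per_byte
print(2 ** bits_per_byte)
print(math.log2(byte_perplexity))
```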
special_tokens_map.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:375fd2a9148d90e6532169431041f29501b8dd6dd27ff0b7553964e306d34099
+size 1127
tokenizer_config.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7825f431748d9f0a28e6a5510140cf684ce1420fa21295b78153871f49f1efd7
+size 6964
vocab.json
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f676e1596f8af5f6e33d35adacd6d5546b0135670c2cd87bcb569ba67074c23e
+size 2012402