[2024-08-20 10:20:55 root] (mobilequant.py 132): INFO Namespace(hf_path='checkpoints/hfmodels/stablelm-2-1_6b', dtype='float32', output_dir='results/stablelm-2-1_6b-e2e-w4a8-s1024-e60-sym', cache_dir='./cache', resume=None, calib_dataset='pile', nsamples=1024, seqlen=2048, act_dict_path='checkpoints/hfmodels/stablelm-2-1_6b/act_dict.json', override_qcfg_path='checkpoints/hfmodels/stablelm-2-1_6b/default_qcfg.json', weight_bitwidth=4, weight_group_size=-1, weight_is_per_channel=True, weight_is_symmetric=True, weight_is_dynamic=False, act_bitwidth=8, act_group_size=-1, act_is_per_channel=False, act_is_symmetric=False, act_is_dynamic=False, let=True, lwc=True, lrl=True, let_lr=0.001, lwc_lr=0.005, lrl_lr=1e-08, let_min_lr=0.0005, lwc_min_lr=0.001, lrl_min_lr=1e-08, wd=0, epochs=60, warmup_epochs=0, use_shift=False, aug_loss=False, deactive_amp=True, batch_size=1, num_fewshot=0, tasks='wikitext', mode='e2e', original_omniquant=False, cache_in_gpu=False, use_8bit_softmax_input=False, use_8bit_softmax_output=False, model_family='stablelm')
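(For readability: the run above quantizes stablelm-2-1_6b to W4A8 -- 4-bit per-channel symmetric static weights and 8-bit per-tensor asymmetric static activations -- calibrated end-to-end on 1024 Pile samples of length 2048 for 60 epochs with LET, LWC and LRL enabled. A minimal sketch of those settings as a Python dict follows; the field names are copied from the Namespace dump verbatim and are not necessarily the exact mobilequant.py CLI flags or qcfg JSON keys.)

# Hedged sketch: key w4a8 settings from the Namespace dump above, collected
# into a plain dict. Names follow the log; the real mobilequant.py CLI flags
# or qcfg keys may differ.
qcfg = {
    # weights: 4-bit, per-channel (group_size=-1), symmetric, static
    "weight_bitwidth": 4, "weight_group_size": -1,
    "weight_is_per_channel": True, "weight_is_symmetric": True,
    "weight_is_dynamic": False,
    # activations: 8-bit, per-tensor, asymmetric, static
    "act_bitwidth": 8, "act_is_per_channel": False,
    "act_is_symmetric": False, "act_is_dynamic": False,
    # LET / LWC / LRL enabled, each with its own (min) learning rate
    "let": True, "let_lr": 1e-3, "let_min_lr": 5e-4,
    "lwc": True, "lwc_lr": 5e-3, "lwc_min_lr": 1e-3,
    "lrl": True, "lrl_lr": 1e-8, "lrl_min_lr": 1e-8,
    # calibration: 1024 Pile samples, seqlen 2048, 60 epochs, batch size 1
    "calib_dataset": "pile", "nsamples": 1024, "seqlen": 2048,
    "epochs": 60, "batch_size": 1,
}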
[2024-08-20 10:21:10 root] (mobilequant.py 218): INFO === start quantization ===
[2024-08-20 10:21:19 root] (algorithm.py 588): INFO Starting ...
[2024-08-20 11:47:42 root] (algorithm.py 759): INFO Epoch 0 loss:26.730098724365234 norm:154457.15625 max memory_allocated 20623.3544921875
[2024-08-20 13:02:58 root] (algorithm.py 759): INFO Epoch 1 loss:14.59599494934082 norm:31563.0703125 max memory_allocated 20623.5126953125
[2024-08-20 14:18:13 root] (algorithm.py 759): INFO Epoch 2 loss:11.569694519042969 norm:20945.201171875 max memory_allocated 20623.5126953125
[2024-08-20 15:33:29 root] (algorithm.py 759): INFO Epoch 3 loss:9.921379089355469 norm:15760.3720703125 max memory_allocated 20623.5126953125
[2024-08-20 16:48:47 root] (algorithm.py 759): INFO Epoch 4 loss:8.987866401672363 norm:10227.7607421875 max memory_allocated 20623.5126953125
[2024-08-20 18:04:07 root] (algorithm.py 759): INFO Epoch 5 loss:8.282011985778809 norm:7816.171875 max memory_allocated 20623.5126953125
[2024-08-20 19:19:25 root] (algorithm.py 759): INFO Epoch 6 loss:7.791882038116455 norm:8762.71484375 max memory_allocated 20623.5126953125
[2024-08-20 20:34:42 root] (algorithm.py 759): INFO Epoch 7 loss:7.361362457275391 norm:7639.37890625 max memory_allocated 20623.5126953125
[2024-08-20 21:50:02 root] (algorithm.py 759): INFO Epoch 8 loss:7.068630695343018 norm:7230.576171875 max memory_allocated 20623.5126953125
[2024-08-20 23:05:21 root] (algorithm.py 759): INFO Epoch 9 loss:6.86011266708374 norm:11696.0732421875 max memory_allocated 20623.5126953125
[2024-08-21 00:20:40 root] (algorithm.py 759): INFO Epoch 10 loss:6.662522315979004 norm:8324.439453125 max memory_allocated 20623.5126953125
[2024-08-21 01:35:59 root] (algorithm.py 759): INFO Epoch 11 loss:6.517264366149902 norm:8439.5390625 max memory_allocated 20623.5126953125
[2024-08-21 02:51:14 root] (algorithm.py 759): INFO Epoch 12 loss:6.40146017074585 norm:7479.6396484375 max memory_allocated 20623.5126953125
[2024-08-21 04:06:30 root] (algorithm.py 759): INFO Epoch 13 loss:6.278970718383789 norm:7454.52783203125 max memory_allocated 20623.5126953125
[2024-08-21 05:21:47 root] (algorithm.py 759): INFO Epoch 14 loss:6.14576530456543 norm:6593.73779296875 max memory_allocated 20623.5126953125
[2024-08-21 06:37:02 root] (algorithm.py 759): INFO Epoch 15 loss:6.038482189178467 norm:6197.99267578125 max memory_allocated 20623.5126953125
[2024-08-21 07:52:21 root] (algorithm.py 759): INFO Epoch 16 loss:5.959897518157959 norm:6284.556640625 max memory_allocated 20623.5126953125
[2024-08-21 09:07:36 root] (algorithm.py 759): INFO Epoch 17 loss:5.932336330413818 norm:7253.76513671875 max memory_allocated 20623.5126953125
[2024-08-21 10:22:53 root] (algorithm.py 759): INFO Epoch 18 loss:5.8416948318481445 norm:5788.38623046875 max memory_allocated 20623.5126953125
[2024-08-21 11:38:09 root] (algorithm.py 759): INFO Epoch 19 loss:5.810154914855957 norm:6902.89111328125 max memory_allocated 20623.5126953125
[2024-08-21 12:53:24 root] (algorithm.py 759): INFO Epoch 20 loss:5.747200965881348 norm:7247.01025390625 max memory_allocated 20623.5126953125
[2024-08-21 14:08:42 root] (algorithm.py 759): INFO Epoch 21 loss:5.694100379943848 norm:7314.0712890625 max memory_allocated 20623.5126953125
[2024-08-21 15:24:00 root] (algorithm.py 759): INFO Epoch 22 loss:5.682767868041992 norm:7056.49951171875 max memory_allocated 20623.5126953125
[2024-08-21 16:39:16 root] (algorithm.py 759): INFO Epoch 23 loss:5.591974258422852 norm:7202.66845703125 max memory_allocated 20623.5126953125
[2024-08-21 17:54:33 root] (algorithm.py 759): INFO Epoch 24 loss:5.532505512237549 norm:6134.50244140625 max memory_allocated 20623.5126953125
[2024-08-21 19:09:51 root] (algorithm.py 759): INFO Epoch 25 loss:5.522594928741455 norm:6979.7685546875 max memory_allocated 20623.5126953125
[2024-08-21 20:25:09 root] (algorithm.py 759): INFO Epoch 26 loss:5.5915350914001465 norm:8239.1484375 max memory_allocated 20623.5126953125
[2024-08-21 21:40:27 root] (algorithm.py 759): INFO Epoch 27 loss:5.470322608947754 norm:6033.7578125 max memory_allocated 20623.5126953125
[2024-08-21 22:55:45 root] (algorithm.py 759): INFO Epoch 28 loss:5.444573879241943 norm:6152.4833984375 max memory_allocated 20623.5126953125
[2024-08-22 00:11:05 root] (algorithm.py 759): INFO Epoch 29 loss:5.38990592956543 norm:5839.91845703125 max memory_allocated 20623.5126953125
[2024-08-22 01:26:22 root] (algorithm.py 759): INFO Epoch 30 loss:5.363168239593506 norm:5975.125 max memory_allocated 20623.5126953125
[2024-08-22 02:41:39 root] (algorithm.py 759): INFO Epoch 31 loss:5.356024265289307 norm:5470.513671875 max memory_allocated 20623.5126953125
[2024-08-22 03:56:58 root] (algorithm.py 759): INFO Epoch 32 loss:5.276673793792725 norm:4915.10791015625 max memory_allocated 20623.5126953125
[2024-08-22 05:12:16 root] (algorithm.py 759): INFO Epoch 33 loss:5.250897407531738 norm:6178.38818359375 max memory_allocated 20623.5126953125
[2024-08-22 06:27:35 root] (algorithm.py 759): INFO Epoch 34 loss:5.25920295715332 norm:5816.39697265625 max memory_allocated 20623.5126953125
[2024-08-22 07:42:56 root] (algorithm.py 759): INFO Epoch 35 loss:5.2382307052612305 norm:6143.68505859375 max memory_allocated 20623.5126953125
[2024-08-22 08:58:13 root] (algorithm.py 759): INFO Epoch 36 loss:5.241503715515137 norm:6338.25244140625 max memory_allocated 20623.5126953125
[2024-08-22 10:13:31 root] (algorithm.py 759): INFO Epoch 37 loss:5.201595306396484 norm:6519.982421875 max memory_allocated 20623.5126953125
[2024-08-22 11:28:48 root] (algorithm.py 759): INFO Epoch 38 loss:5.194555282592773 norm:6877.61376953125 max memory_allocated 20623.5126953125
[2024-08-22 12:44:06 root] (algorithm.py 759): INFO Epoch 39 loss:5.250302314758301 norm:6530.8330078125 max memory_allocated 20623.5126953125
[2024-08-22 13:59:24 root] (algorithm.py 759): INFO Epoch 40 loss:5.178772449493408 norm:5352.7294921875 max memory_allocated 20623.5126953125
[2024-08-22 15:14:40 root] (algorithm.py 759): INFO Epoch 41 loss:5.153307914733887 norm:6240.927734375 max memory_allocated 20623.5126953125
[2024-08-22 16:29:58 root] (algorithm.py 759): INFO Epoch 42 loss:5.161324977874756 norm:6242.61962890625 max memory_allocated 20623.5126953125
[2024-08-22 17:45:14 root] (algorithm.py 759): INFO Epoch 43 loss:5.111675262451172 norm:4540.1044921875 max memory_allocated 20623.5126953125
[2024-08-22 19:00:31 root] (algorithm.py 759): INFO Epoch 44 loss:5.112933158874512 norm:6280.0361328125 max memory_allocated 20623.5126953125
[2024-08-22 20:15:46 root] (algorithm.py 759): INFO Epoch 45 loss:5.0947265625 norm:6525.57080078125 max memory_allocated 20623.5126953125
[2024-08-22 21:31:03 root] (algorithm.py 759): INFO Epoch 46 loss:5.081794261932373 norm:5266.62109375 max memory_allocated 20623.5126953125
[2024-08-22 22:46:19 root] (algorithm.py 759): INFO Epoch 47 loss:5.082479953765869 norm:5125.6845703125 max memory_allocated 20623.5126953125
[2024-08-23 00:01:38 root] (algorithm.py 759): INFO Epoch 48 loss:5.058719158172607 norm:6544.69921875 max memory_allocated 20623.5126953125
[2024-08-23 01:16:57 root] (algorithm.py 759): INFO Epoch 49 loss:5.053481578826904 norm:5061.23828125 max memory_allocated 20623.5126953125
[2024-08-23 02:32:12 root] (algorithm.py 759): INFO Epoch 50 loss:5.040632247924805 norm:5612.8369140625 max memory_allocated 20623.5126953125
[2024-08-23 03:47:28 root] (algorithm.py 759): INFO Epoch 51 loss:5.017019271850586 norm:5763.54833984375 max memory_allocated 20623.5126953125
[2024-08-23 05:02:44 root] (algorithm.py 759): INFO Epoch 52 loss:5.024693965911865 norm:5505.98486328125 max memory_allocated 20623.5126953125
[2024-08-23 06:18:00 root] (algorithm.py 759): INFO Epoch 53 loss:5.040638446807861 norm:5578.37890625 max memory_allocated 20623.5126953125
[2024-08-23 07:33:16 root] (algorithm.py 759): INFO Epoch 54 loss:5.020227432250977 norm:5826.90380859375 max memory_allocated 20623.5126953125
[2024-08-23 08:48:31 root] (algorithm.py 759): INFO Epoch 55 loss:4.999330043792725 norm:6163.14794921875 max memory_allocated 20623.5126953125
[2024-08-23 10:03:50 root] (algorithm.py 759): INFO Epoch 56 loss:4.990512847900391 norm:6249.76953125 max memory_allocated 20623.5126953125
[2024-08-23 11:19:13 root] (algorithm.py 759): INFO Epoch 57 loss:4.992715835571289 norm:5693.6689453125 max memory_allocated 20623.5126953125
[2024-08-23 12:34:31 root] (algorithm.py 759): INFO Epoch 58 loss:4.979547023773193 norm:5596.0205078125 max memory_allocated 20623.5126953125
[2024-08-23 13:49:51 root] (algorithm.py 759): INFO Epoch 59 loss:4.991926193237305 norm:18962.552734375 max memory_allocated 20623.5126953125
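(The training loss falls from 26.73 at epoch 0 to about 4.99 by epoch 59. To plot that curve, a minimal sketch that scrapes the per-epoch lines above could look like the following; it assumes exactly the "Epoch N loss:X norm:Y" format shown in this log, and "quant.log" is a hypothetical file name.)

import re

# Hedged sketch: extract (epoch, loss, grad norm) from per-epoch lines in
# the format shown above, e.g.
#   "... INFO Epoch 0 loss:26.730098724365234 norm:154457.15625 ..."
EPOCH_RE = re.compile(r"Epoch (\d+) loss:([\d.]+) norm:([\d.]+)")

def parse_epochs(log_path):
    rows = []
    with open(log_path) as f:
        for line in f:
            m = EPOCH_RE.search(line)
            if m:
                rows.append((int(m.group(1)), float(m.group(2)), float(m.group(3))))
    return rows

# usage (hypothetical path):
# rows = parse_epochs("quant.log")
# losses = [loss for _, loss, _ in rows]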
[2024-08-23 13:49:53 root] (mobilequant.py 233): INFO 271722.98099660873
[2024-08-23 13:49:55 huggingface_hub.repocard] (repocard.py 107): WARNING Repo card metadata block was not found. Setting CardData to empty.
[2024-08-23 13:55:21 root] (mobilequant.py 110): INFO {'results': {'wikitext': {'word_perplexity': 36.434425337393264, 'byte_perplexity': 1.9588921928789669, 'bits_per_byte': 0.9700380014448762}}, 'versions': {'wikitext': 1}, 'config': {'model': None, 'model_args': None, 'num_fewshot': 0, 'batch_size': 1, 'batch_sizes': [], 'device': None, 'no_cache': True, 'limit': None, 'bootstrap_iters': 100000, 'description_dict': None}}
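(Two quick notes on the tail of the log. The bare 271722.98 printed at mobilequant.py:233 matches the wall-clock time elapsed between the "=== start quantization ===" line and the end of epoch 59, so it is presumably the total quantization time in seconds, roughly 75.5 hours. The wikitext numbers are also internally consistent: under the usual lm-eval definitions, bits_per_byte is log2 of byte_perplexity, which the sketch below checks with values copied from the results dict above.)

import math

# Consistency check on the reported wikitext metrics (values copied from
# the results dict above). Under the usual lm-eval definitions,
# bits_per_byte = log2(byte_perplexity).
word_perplexity = 36.434425337393264
byte_perplexity = 1.9588921928789669
bits_per_byte   = 0.9700380014448762

assert abs(math.log2(byte_perplexity) - bits_per_byte) < 1e-6

# The word/byte perplexity ratio gives the average word length in bytes,
# since word_ppl = byte_ppl ** (bytes per word):
bytes_per_word = math.log(word_perplexity) / math.log(byte_perplexity)
print(f"~{bytes_per_word:.2f} bytes per word on the wikitext eval set")  # ~5.35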