yixinsong committed on
Commit 83af9c0
1 Parent(s): 71acee4

Update README.md

Files changed (1)
  1. README.md +4 -3
README.md CHANGED
@@ -59,11 +59,12 @@ We evaluate the model on the datasets of [Open LLM Leaderboard](https://huggingf
 
 ### Inference Tool
 
-We utilize [PowerInfer](https://github.com/SJTU-IPADS/PowerInfer) for pure CPU inference, here we list the inference speed of pure CPU inference with fp16 precision.
+We utilize [PowerInfer](https://github.com/SJTU-IPADS/PowerInfer) for inference; here we present the speeds of pure CPU-based inference with fp16 precision.
+The CPU configuration includes an Intel i9-13900K processor (eight performance cores at 5.4 GHz) and 192 GB of host memory (with a memory bandwidth of 67.2 GB/s).
 
-Dense Inference: 0.85 tokens/s
+Dense Inference: 5.17 tokens/s
 
-Sparse Inference: 2.26 tokens/s
+Sparse Inference: 8.21 tokens/s
 
 ### License Disclaimer:
 
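For reference, PowerInfer exposes a llama.cpp-style CLI, so a pure-CPU fp16 run like the one benchmarked above can be launched as in the minimal sketch below; the binary path, model file, thread count, and prompt are illustrative assumptions, not values taken from this commit.

```python
import subprocess

# Minimal sketch of a pure-CPU PowerInfer run; flags follow llama.cpp conventions.
# The paths, model file, thread count, and prompt are placeholders -- adjust them
# to your local PowerInfer build and converted model.
POWERINFER_MAIN = "./build/bin/main"            # hypothetical path to the PowerInfer CLI binary
MODEL_PATH = "./models/model.powerinfer.gguf"   # hypothetical fp16 model converted for PowerInfer

subprocess.run(
    [
        POWERINFER_MAIN,
        "-m", MODEL_PATH,          # model to load
        "-t", "8",                 # CPU threads, e.g. one per performance core on an i9-13900K
        "-n", "128",               # number of tokens to generate
        "-p", "Once upon a time",  # prompt
    ],
    check=True,
)
```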