willhe-xverse committed
Commit 128f3de
Parent: 75872ca


Update README.md

Files changed (1)
  1. README.md (+2, -2)
README.md CHANGED
@@ -39,7 +39,7 @@ We advise you to clone [`vllm`](https://github.com/vllm-project/vllm.git) and in
 
 ## 使用方法
 
-我们演示了如何使用 `vllm` 来运行XVERSE-7B-Chat-GPTQ-Int4量化模型:
+我们演示了如何使用 vLLM 来运行XVERSE-7B-Chat-GPTQ-Int4量化模型:
 
 ```python
 from vllm import LLM, SamplingParams
@@ -67,7 +67,7 @@ for output in outputs:
 
 ## Usage
 
-We demonstrated how to use 'vllm' to run the XVERSE-7B-Chat-GPTQ-Int4 quantization model:
+We demonstrated how to use vLLM to run the XVERSE-7B-Chat-GPTQ-Int4 quantization model:
 
 ```python
 from vllm import LLM, SamplingParams
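
The hunks above cut off after the first line of the README's Python example. For orientation, here is a minimal sketch of how such a vLLM call typically looks; the model path `xverse/XVERSE-7B-Chat-GPTQ-Int4`, the prompt, and the sampling settings below are illustrative assumptions, not the README's exact code.

```python
from vllm import LLM, SamplingParams

# Illustrative sampling settings; the README's actual values are not visible in this diff.
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=256)

# Load the GPTQ Int4 quantized chat model. `trust_remote_code=True` is needed because
# XVERSE ships custom model code; `quantization="gptq"` selects the Int4 GPTQ weights
# (recent vLLM versions can also infer this from the model config).
llm = LLM(
    model="xverse/XVERSE-7B-Chat-GPTQ-Int4",  # assumed Hugging Face model id
    quantization="gptq",
    trust_remote_code=True,
)

prompts = ["Tell me about the XVERSE model family."]
outputs = llm.generate(prompts, sampling_params)

# Mirrors the `for output in outputs:` loop visible in the second hunk's context line.
for output in outputs:
    prompt = output.prompt
    generated_text = output.outputs[0].text
    print(f"Prompt: {prompt!r}, Generated: {generated_text!r}")
```

This sketch only illustrates the shape of the snippet the commit touches; the repository's README remains the authoritative version of the example.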