trillionmonster committed
Commit
4d3469c
1 Parent(s): 8fe0aed

Update README.md

Files changed (1)
  1. README.md +2 -17
README.md CHANGED
@@ -29,22 +29,7 @@ Baichuan-13B-Chat is the aligned version in the Baichuan-13B series of models, a
  4. **Open-source, free, and commercially usable**: Baichuan-13B is not only fully open to academic research, but developers can also use it for free commercially after applying for and receiving official commercial permission via email.


- ## Usage
-
- Below is an example of a conversation with Baichuan-13B-Chat. The correct output is: "Qogir. The world's second-highest peak, Qogir, which Western climbers call K2, is 8,611 meters high and lies on the China-Pakistan border in the Karakoram range."
- ```python
- import torch
- from transformers import AutoModelForCausalLM, AutoTokenizer
- from transformers.generation.utils import GenerationConfig
- tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-13B-Chat", use_fast=False, trust_remote_code=True)
- model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-13B-Chat", device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)
- model.generation_config = GenerationConfig.from_pretrained("baichuan-inc/Baichuan-13B-Chat")
- messages = []
- messages.append({"role": "user", "content": "世界上第二高的山峰是哪座"})
- response = model.chat(tokenizer, messages)
- print(response)
- ```
- ## int8 Quantized Deployment
+ ## Usage (int8)

  ```python
  import torch
@@ -54,7 +39,7 @@ tokenizer = AutoTokenizer.from_pretrained("trillionmonster/Baichuan-13B-Chat-8bi
  model = AutoModelForCausalLM.from_pretrained("trillionmonster/Baichuan-13B-Chat-8bit", device_map="auto", trust_remote_code=True)
  model.generation_config = GenerationConfig.from_pretrained("trillionmonster/Baichuan-13B-Chat-8bit")
  messages = []
- messages.append({"role": "user", "content": "Which moutain is the second highest one in the world?"})
+ messages.append({"role": "user", "content": "世界上第二高的山峰是哪座"})
  response = model.chat(tokenizer, messages)
  print(response)
  ```
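
After this change, the README keeps a single usage example based on the pre-quantized 8-bit checkpoint. For convenience, here is that example assembled from the two hunks above into one runnable block. The import and tokenizer lines that fall between the hunks are not shown in the diff (the tokenizer call is truncated in the second hunk header), so they are reconstructed by analogy with the removed fp16 example; in particular, `use_fast=False` is an assumption.

```python
# Reconstructed from the diff above: the full int8 usage example after this commit.
# The tokenizer arguments mirror the removed fp16 example and are therefore assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

# Load the pre-quantized 8-bit checkpoint; trust_remote_code pulls in Baichuan's custom chat() helper.
tokenizer = AutoTokenizer.from_pretrained("trillionmonster/Baichuan-13B-Chat-8bit", use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("trillionmonster/Baichuan-13B-Chat-8bit", device_map="auto", trust_remote_code=True)
model.generation_config = GenerationConfig.from_pretrained("trillionmonster/Baichuan-13B-Chat-8bit")

# Demo question: "Which is the second-highest mountain in the world?"
messages = []
messages.append({"role": "user", "content": "世界上第二高的山峰是哪座"})
response = model.chat(tokenizer, messages)
print(response)
```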
 
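
If the pre-quantized repository is not desired, an equivalent setup is to quantize the original checkpoint at load time. The sketch below is not part of the README; it assumes `bitsandbytes` and `accelerate` are installed and uses the standard `load_in_8bit` flag of `from_pretrained` against the upstream `baichuan-inc/Baichuan-13B-Chat` weights.

```python
# Hedged alternative (not from the original README): on-the-fly int8 quantization with bitsandbytes,
# instead of downloading the pre-quantized trillionmonster/Baichuan-13B-Chat-8bit weights.
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

tokenizer = AutoTokenizer.from_pretrained(
    "baichuan-inc/Baichuan-13B-Chat", use_fast=False, trust_remote_code=True
)
model = AutoModelForCausalLM.from_pretrained(
    "baichuan-inc/Baichuan-13B-Chat",
    device_map="auto",
    load_in_8bit=True,  # quantize to int8 while loading (requires bitsandbytes + accelerate)
    trust_remote_code=True,
)
model.generation_config = GenerationConfig.from_pretrained("baichuan-inc/Baichuan-13B-Chat")

# Same chat interface as the snippets above.
messages = [{"role": "user", "content": "世界上第二高的山峰是哪座"}]
print(model.chat(tokenizer, messages))
```

The trade-off: the pre-quantized repository avoids downloading the full fp16 weights, while on-the-fly quantization uses the upstream checkpoint directly at the cost of a larger download and a slower first load.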