Update README.md
Baichuan-13B is an open-source, commercially available large-scale language model.
## How to Get Started with the Model

Below is an example of a conversation with Baichuan-13B-Chat. The correct output is "乔戈里峰。世界第二高峰———乔戈里峰西方登山者称其为k2峰,海拔高度是8611米,位于喀喇昆仑山脉的中巴边境上"

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-13B-Chat", use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-13B-Chat", device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)
model.generation_config = GenerationConfig.from_pretrained("baichuan-inc/Baichuan-13B-Chat")
messages = []
messages.append({"role": "user", "content": "世界上第二高的山峰是哪座"})
response = model.chat(tokenizer, messages)
print(response)
```

Here is an example of a conversation using Baichuan-13B-Chat. The correct output is "K2. The world's second highest peak - K2, also known as Mount Godwin-Austen or Chhogori, with an altitude of 8611 meters, is located on the China-Pakistan border in the Karakoram Range."

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation.utils import GenerationConfig

tokenizer = AutoTokenizer.from_pretrained("baichuan-inc/Baichuan-13B-Chat", use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("baichuan-inc/Baichuan-13B-Chat", device_map="auto", torch_dtype=torch.float16, trust_remote_code=True)
model.generation_config = GenerationConfig.from_pretrained("baichuan-inc/Baichuan-13B-Chat")
messages = []
messages.append({"role": "user", "content": "Which is the second highest mountain in the world?"})
response = model.chat(tokenizer, messages)
print(response)
```
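The `messages` list in the examples above carries the whole dialogue history, so a follow-up question is asked by appending the previous reply before the next user turn. A minimal sketch of that bookkeeping, assuming the tokenizer and model were loaded as shown above; the `"assistant"` role name is an assumption mirroring the `"user"` role in the examples, and the reply string here is a stand-in for real model output:

```python
# Multi-turn use of the `messages` history (sketch; assumes `model` and
# `tokenizer` were set up as in the examples above).
messages = []

# First turn: ask the question.
messages.append({"role": "user", "content": "Which is the second highest mountain in the world?"})
# response = model.chat(tokenizer, messages)
response = "K2, at 8611 meters in the Karakoram Range."  # stand-in for the model's reply

# Record the reply so the next call sees the full history
# (the "assistant" role name is an assumption, mirroring "user" above).
messages.append({"role": "assistant", "content": response})

# Second turn: a follow-up that only makes sense with the history kept.
messages.append({"role": "user", "content": "Which two countries does it sit between?"})
# response = model.chat(tokenizer, messages)

print([m["role"] for m in messages])  # ['user', 'assistant', 'user']
```

Each `model.chat` call consumes the entire list, so nothing besides appending turns is needed to keep context.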

## Model Details