TheBloke committed
Commit 694ef9c
1 Parent(s): 3fb765b

Update README.md

Files changed (1):
  1. README.md +4 -2
README.md CHANGED
@@ -45,12 +45,12 @@ To use these files you need:
 
 Example command:
 ```
-/workspace/git/llama.cpp/main -m llama-2-70b-chat/ggml/llama-2-70b-chat.ggmlv3.q4_0.bin -gqa 8 -t 13 -p "[INST] <<SYS>>You are a helpful assistant<</SYS>>Write a story about llamas[/INST]"
+./main -m llama-2-70b/ggml/llama-2-70b.ggmlv3.q4_0.bin -gqa 8 -t 13 -p "Llamas are"
 ```
 
 There is no CUDA support at this time, but it should be coming soon.
 
-There is no support in third-party UIs or Python libraries (llama-cpp-python, ctransformers) yet. That will come in due course.
+There is no support in third-party UIs (e.g. text-generation-webui, KoboldCpp) or Python libraries (llama-cpp-python, ctransformers) yet. That will come in due course.
 
 ## Repositories available
 
@@ -64,6 +64,8 @@ There is no support in third-party UIs or Python libraries (llama-cpp-python, ct
 {prompt}
 ```
 
+**Remember that this is a foundation model, not a fine-tuned one. It may not be good at answering questions or following instructions.**
+
 <!-- compatibility_ggml start -->
 ## Compatibility
 
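For reference, the updated example command assumes a local llama.cpp build. The sketch below is an illustration rather than part of the commit: the repository URL, the `make` step, and the `-n` token-count flag are assumptions about a then-current llama.cpp checkout, while the model path and the `-m`, `-gqa 8`, `-t`, and `-p` arguments are taken directly from the diff.

```
# Sketch only: build a recent llama.cpp (older builds lack the -gqa flag
# required for the 70B model's grouped-query attention).
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Run the quantized GGML file with a plain completion prompt; this is a
# foundation model, not a chat/instruct model, so no [INST] template is used.
# -n (number of tokens to generate) is added here for illustration.
./main -m llama-2-70b/ggml/llama-2-70b.ggmlv3.q4_0.bin \
  -gqa 8 -t 13 -n 128 -p "Llamas are"
```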