ThomasBaruzier committed
Commit ae54722
1 Parent(s): a4e4946

Update README.md

Files changed (1): README.md +13 -0
README.md CHANGED

@@ -190,6 +190,19 @@ extra_gated_description: The information you provide will be collected, stored,
 extra_gated_button_content: Submit
 ---
 
+# Llama.cpp imatrix quantizations of meta-llama/Meta-Llama-3.1-405B-Instruct
+
+<!-- Better pic, but I would like to talk about my quants on LinkedIn, so yeah: <img src="https://cdn-uploads.huggingface.co/production/uploads/646410e04bf9122922289dc7/xlkSJli8IQ9KoTAuTKOF2.png" alt="llama" width="30%"/> -->
+<img src="https://cdn-uploads.huggingface.co/production/uploads/646410e04bf9122922289dc7/LQUL7YII8okA8CG54mQSI.jpeg" alt="llama" width="60%"/>
+
+Using llama.cpp commit [cfac111](https://github.com/ggerganov/llama.cpp/commit/cfac111e2b3953cdb6b0126e67a2487687646971) for quantization.
+
+Original model: https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-Instruct
+
+All quants were made using the imatrix option and Bartowski's [calibration file](https://gist.github.com/bartowski1182/eb213dccb3571f863da82e99418f81e8).
+
+<hr>
+
 ## Model Information
 
 The Meta Llama 3.1 collection of multilingual large language models (LLMs) is a collection of pretrained and instruction tuned generative models in 8B, 70B and 405B sizes (text in/text out). The Llama 3.1 instruction tuned text only models (8B, 70B, 405B) are optimized for multilingual dialogue use cases and outperform many of the available open source and closed chat models on common industry benchmarks.
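
The added README lines describe the quantization setup only at a high level. For reference, here is a minimal sketch of how imatrix quants like these are typically produced with llama.cpp's CLI tools; the build steps, local file names (the F16 GGUF, `calibration_datav3.txt`), and the `Q4_K_M` target type are illustrative assumptions, not details taken from this commit.

```bash
# Minimal sketch of an imatrix quantization workflow with llama.cpp.
# ASSUMPTIONS: file names, paths, and the Q4_K_M target are illustrative only.

# Build llama.cpp at the commit referenced in the README
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
git checkout cfac111e2b3953cdb6b0126e67a2487687646971
make -j

# Compute an importance matrix from a calibration text file
# (here assumed to be Bartowski's calibration file saved locally)
./llama-imatrix \
  -m Meta-Llama-3.1-405B-Instruct-F16.gguf \
  -f calibration_datav3.txt \
  -o imatrix.dat

# Quantize the F16 GGUF using the importance matrix
./llama-quantize \
  --imatrix imatrix.dat \
  Meta-Llama-3.1-405B-Instruct-F16.gguf \
  Meta-Llama-3.1-405B-Instruct-Q4_K_M.gguf \
  Q4_K_M
```

The importance matrix weights each tensor's quantization error by how strongly its values are activated on the calibration data, which is why low-bit quants made with `--imatrix` generally preserve quality better than plain quants of the same type.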