---
datasets:
- togethercomputer/RedPajama-Data-1T-Sample
tags:
- llama2
- llama
---
A second model merge by [chargoddard](https://huggingface.co/chargoddard). A GGML conversion of the previous merge can be found [here](https://huggingface.co/IHaveNoClueAndIMustPost/Llama-2-22B-GGML).<br>
I have no idea what I'm doing, so if something doesn't work as it should, or doesn't work at all, that's likely on me, not the models themselves.<br><br>
Description copied from the [original repo](https://huggingface.co/chargoddard/llama2-22b-blocktriangular) below.

<i>
Similar to llama2-22b, but with BLOCK_DIAGONAL=false in the merge and twice the fine-tuning tokens. 

Again, not intended for direct use - meant as a base for further tuning and merging.</i>
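
If you do want to pick it up as a base for further tuning, a minimal loading sketch follows. The repo ID and the standard `transformers` usage here are my assumptions, not instructions from the original card:

```python
# Minimal sketch: load the merged 22B checkpoint as a base for further tuning.
# Assumptions: the upstream repo ID below and standard `transformers` APIs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "chargoddard/llama2-22b-blocktriangular"  # upstream repo linked in this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # requires `accelerate`; shards weights across available devices
    torch_dtype="auto",  # load in the checkpoint's native precision
)

# From here, plug `model` into your fine-tuning setup of choice
# (e.g. a Trainer or PEFT/LoRA pipeline) rather than using it for inference directly.
```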