---
base_model:
- Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
library_name: transformers
tags:
- mergekit
- merge
---
![download.png](https://raw.githubusercontent.com/Fischherboot/Aculi/main/watermark-no-bg.png)
# InternLM2-chat-20B-ToxicRP-QLORA-Merged
This model was fine-tuned by me, using compute provided by g4rg.
Big thanks to everyone who helped me.
Do whatever you want with this model, just don't do anything illegal.
GGUF version: [Aculi/InternLM2-Chat-20B-ToxicRP-GGUF](https://huggingface.co/Aculi/InternLM2-Chat-20B-ToxicRP-GGUF)
### Have fun
### This model uses ChatML, by the way.
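Since the model expects ChatML-formatted prompts, here is a minimal sketch of that turn format. The helper function is illustrative only and not part of this repository; it just shows how ChatML wraps each message in `<|im_start|>`/`<|im_end|>` markers:

```python
def chatml_prompt(system: str, user: str) -> str:
    """Build a ChatML prompt for a single user turn.

    ChatML wraps every message as:
        <|im_start|>{role}\n{content}<|im_end|>
    and ends with an open assistant turn for the model to complete.
    """
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )


# Example: pass the resulting string to your inference backend as the raw prompt.
prompt = chatml_prompt("You are a helpful assistant.", "Hello!")
```

If you load the model through `transformers`, the tokenizer's built-in chat template (when present) should produce an equivalent string via `tokenizer.apply_chat_template`.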
### Merge Method
This model was merged using the passthrough merge method.
### Models Merged
The following models were included in the merge:
* output/intervitens_internlm2-limarp-chat-20b-2 + [Fischerboot/InternLM2-ToxicRP-QLORA-4Bit](https://huggingface.co/Fischerboot/InternLM2-ToxicRP-QLORA-4Bit)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
- layer_range: [0, 48]
model: output/intervitens_internlm2-limarp-chat-20b-2+Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
```
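Assuming the standard mergekit CLI, a merge like this can be reproduced by saving the YAML above as `config.yaml` and running something along these lines (the output directory name is illustrative):

```shell
# Install mergekit (assumption: current PyPI package name).
pip install mergekit

# Run the passthrough merge described by the YAML config above.
mergekit-yaml config.yaml ./merged-model
```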