---
base_model:
- Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
library_name: transformers
tags:
- mergekit
- merge
language:
- en
---
# More quants will follow over time!
![1715297915105.png](https://cdn-lfs-us-1.huggingface.co/repos/57/1b/571b2c4e2b91d34b081519e5dc8b22423094e6f347c2219f4a22b591ba74f736/ff228772368e0545fd0529a2b78eecb6314cc81d0c3783a45ccb34b3b2d921f1)
# InternLM2-chat-20B-ToxicRP-QLORA-Merged
This model was fine-tuned by me, using compute provided by g4rg.
Big thanks to everyone who helped me.
Do whatever you want with this model, just don't do anything illegal.
The non-quantized model is available here: [Fischerboot/InternLM2-Chat-20B-ToxicRP-QLORA-Merged](https://huggingface.co/Fischerboot/InternLM2-Chat-20B-ToxicRP-QLORA-Merged)
### Have fun
### Merge Method
This model was merged using the passthrough merge method, which copies the selected layer range through unchanged rather than interpolating weights between models.
### Models Merged
The following models were included in the merge:
* output/intervitens_internlm2-limarp-chat-20b-2 + [Fischerboot/InternLM2-ToxicRP-QLORA-4Bit](https://huggingface.co/Fischerboot/InternLM2-ToxicRP-QLORA-4Bit)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
merge_method: passthrough
slices:
  - sources:
      - layer_range: [0, 48]
        model: output/intervitens_internlm2-limarp-chat-20b-2+Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
```
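As a rough usage sketch, the non-quantized merged checkpoint can be loaded with 🤗 Transformers. The repo id below is taken from this card; the `load` helper is just an illustrative name, and `trust_remote_code=True` is assumed because the InternLM2 architecture ships custom modeling code.

```python
# Hypothetical loading sketch for the merged (non-quantized) model.
REPO_ID = "Fischerboot/InternLM2-Chat-20B-ToxicRP-QLORA-Merged"

def load(repo_id: str = REPO_ID):
    """Download and return (tokenizer, model) for the merged checkpoint."""
    # Imported lazily so the sketch can be read/parsed without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        torch_dtype="auto",        # keep the checkpoint's bfloat16 weights
        trust_remote_code=True,    # InternLM2 uses custom modeling code
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load()
```

Note that a 20B-parameter model in bfloat16 needs roughly 40 GB of memory, so quantized variants may be more practical on consumer hardware.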