---
base_model:
- Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
library_name: transformers
tags:
- mergekit
- merge
language:
- en
---
![download.png](https://raw.githubusercontent.com/Fischherboot/Aculi/main/watermark-no-bg.png)

# InternLM2-chat-20B-ToxicRP-QLORA-Merged

This model was finetuned by me, using the machine power of g4rg. Big thanks to everyone who helped.

Do whatever you want with this model, just don't do anything illegal.

The non-quantized version is available here: `Fischerboot/InternLM2-Chat-20B-ToxicRP-QLORA-Merged`

### Have fun

This model uses the ChatML prompt format, by the way.

### Merge Method

This model was merged using the passthrough merge method.

### Models Merged

The following models were included in the merge:
* output/intervitens_internlm2-limarp-chat-20b-2 + [Fischerboot/InternLM2-ToxicRP-QLORA-4Bit](https://huggingface.co/Fischerboot/InternLM2-ToxicRP-QLORA-4Bit)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 48]
    model: output/intervitens_internlm2-limarp-chat-20b-2+Fischerboot/InternLM2-ToxicRP-QLORA-4Bit
```
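
### Usage

A minimal inference sketch with `transformers`, assuming the non-quantized repo id mentioned above is correct and that the tokenizer ships a ChatML chat template (the system/user messages here are placeholders):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id (the non-quantized merge referenced in this card).
model_id = "Fischerboot/InternLM2-Chat-20B-ToxicRP-QLORA-Merged"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
    trust_remote_code=True,  # InternLM2-based models use custom modeling code
)

# ChatML-style conversation; apply_chat_template renders the
# <|im_start|>/<|im_end|> markers from the tokenizer's chat template.
messages = [
    {"role": "system", "content": "You are a helpful roleplay assistant."},
    {"role": "user", "content": "Hello, who are you?"},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```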