---
tags:
  - moe
  - merge
license: apache-2.0
---

What is it? A 2x7B Mixture of Experts (MoE) model for roleplay.

You may occasionally get GPT-like (assistant-flavored) responses; just skip them and reroll (gacha time). Overall, I think it's good enough for roleplaying.
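If you want to try it with plain `transformers`, here is a minimal sketch. It assumes the merged weights live at `Alsebay/RainyMotip-2x7B` (inferred from the GGUF link below) and load with the standard causal-LM classes:

```python
# Minimal sketch, assuming the repo id Alsebay/RainyMotip-2x7B and
# standard transformers causal-LM loading.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Alsebay/RainyMotip-2x7B"  # assumption: inferred from the GGUF repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 2x7B MoE; needs a high-memory GPU in bf16
    device_map="auto",
)

prompt = "You are a fantasy innkeeper. A hooded traveler walks in. Respond in character."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.9)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```

Rerolling is just resampling: with `do_sample=True`, calling `generate` again gives you a different take.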

You may want to see this: https://huggingface.co/Alsebay/My_LLMs_Leaderboard

This model is a Mixture of Experts (MoE) made with the following models (a hypothetical merge-config sketch follows the list):

- udkai/Turdus
- Kquant03/Samlagast-7B-laser-bf16
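The card doesn't say how the merge was made. For reference, 2xNB MoE merges like this are commonly built with mergekit's `mergekit-moe` mode; the config below is a hypothetical sketch, not the actual recipe (gate mode, dtype, and positive prompts are all illustrative assumptions):

```yaml
# Hypothetical mergekit-moe config sketch -- not the published recipe.
base_model: udkai/Turdus
gate_mode: hidden            # assumption: default hidden-state routing
dtype: bfloat16
experts:
  - source_model: udkai/Turdus
    positive_prompts:
      - "roleplay"           # assumption: illustrative routing prompt
  - source_model: Kquant03/Samlagast-7B-laser-bf16
    positive_prompts:
      - "chat"               # assumption: illustrative routing prompt
```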

If you use it, please let me know whether it's good or not. Thank you :)

GGUF version here: https://huggingface.co/Alsebay/RainyMotip-2x7B-GGUF
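To run a GGUF build locally, here's a minimal sketch with `llama-cpp-python`; the quant filename is hypothetical, substitute whichever `.gguf` file you download:

```python
# Minimal sketch using llama-cpp-python; the filename is a hypothetical
# example quant -- use the actual file you downloaded from the GGUF repo.
from llama_cpp import Llama

llm = Llama(
    model_path="RainyMotip-2x7B.Q4_K_M.gguf",  # assumption: example quant name
    n_ctx=4096,       # context window
    n_gpu_layers=-1,  # offload all layers if built with GPU support
)

out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a creative roleplay partner."},
        {"role": "user", "content": "Describe the tavern as I push open the door."},
    ],
    max_tokens=256,
    temperature=0.9,
)
print(out["choices"][0]["message"]["content"])
```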

Want more quantizations?

mradermacher made a full set of them. Check it out :) https://huggingface.co/mradermacher/RainyMotip-2x7B-GGUF