
DaringMaid-20B-V1.1

What's New?

This is an updated version of DaringMaid-20B. It is largely the same recipe, but it uses Noromaid-13b v0.3 instead of v0.1.1 and gives Noromaid a slightly higher weight.

I used v0.3 since it was the last version to use the Alpaca format, so as not to break anything.

Quants

EXL2: 6bpw, 3.5bpw, 3bpw

GGUF: Q3_K_M - Q4_K_M - Q5_K_M - Q6_K

FP16
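
If you want to try one of the GGUF quants locally, here is a minimal sketch using llama-cpp-python. The file name and generation settings are assumptions, not values from this card; point it at whichever quant you downloaded, and see the prompt template section below for the expected Alpaca format.

from llama_cpp import Llama

# Load a downloaded GGUF quant (file name is an example, use the file you actually grabbed).
llm = Llama(
    model_path="DaringMaid-20B-V1.1.Q4_K_M.gguf",
    n_ctx=4096,        # context window, adjust to your hardware
    n_gpu_layers=-1,   # offload all layers to GPU if possible, 0 for CPU-only
)

# Simple completion call; the sampling settings here are placeholders, not recommendations.
out = llm(
    "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n"
    "### Instruction:\nWrite a short greeting.\n### Response:\n",
    max_tokens=128,
    temperature=0.8,
)
print(out["choices"][0]["text"])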

Recipe:

Prompt template:

I have been using Undi and Ikari's SillyTavern presets for Noromaid: Context template, Instruct template.

Alpaca:

Below is an instruction that describes a task. Write a response that appropriately completes the request.
### Instruction:
{prompt}
### Input:
{input}
### Response:
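
For reference, a small sketch of filling this template in Python. The instruction and input strings are placeholders, and dropping the ### Input: block when there is no input is a common Alpaca convention rather than something stated on this card.

ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    "### Instruction:\n{prompt}\n"
    "### Input:\n{input}\n"
    "### Response:\n"
)

def build_prompt(prompt: str, input_text: str = "") -> str:
    # Omit the Input block entirely when no extra context is supplied.
    if not input_text:
        return (
            "Below is an instruction that describes a task. "
            "Write a response that appropriately completes the request.\n"
            f"### Instruction:\n{prompt}\n### Response:\n"
        )
    return ALPACA_TEMPLATE.format(prompt=prompt, input=input_text)

print(build_prompt("Summarize the plot of Hamlet in two sentences."))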

Contact

Kooten on Discord.
