---
license: llama2
language:
- en
---
|
|
|
# Daddy Dave's stamp of approval 👍 |
|
|
|
4-bit GPTQ quants of [Sao10K](https://huggingface.co/Sao10K)'s fantastic [SthenoWriter](https://huggingface.co/collections/Sao10K/stheno-6536a20823c9d18c09288fb1) model, the writing-focused variant of Stheno (the link goes to the Stheno model collection).
|
|
|
The main branch contains a 4-bit quant with a group size of 128 and no act_order.
|
|
|
The other branches contain group sizes of 128, 64, and 32, all with act_order.
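If you want to try one of the quants, something like the sketch below should work through transformers' GPTQ integration (it needs the `optimum` and `auto-gptq` packages). The repo id and branch name used here are placeholders, so check this repo's actual branch list for the real names.

```python
# Minimal sketch, assuming the quants load via transformers' GPTQ integration
# (requires the optimum and auto-gptq packages). The repo id and branch name
# below are placeholders; check this repo's "Files and versions" tab for the real ones.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-username/SthenoWriter-L2-13B-GPTQ"  # placeholder repo id

# main branch: 4-bit, group size 128, no act_order
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(repo_id)

# one of the other branches, e.g. group size 32 with act_order (branch name is illustrative):
# model = AutoModelForCausalLM.from_pretrained(
#     repo_id, device_map="auto", revision="4bit-32g-actorder"
# )

prompt = "Once upon a time,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```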
|
|
|
## **⬇︎** Original card **⬇︎** |
|
|
|
|
|
|
|
<img src="https://c4.wallpaperflare.com/wallpaper/309/535/658/anime-anime-girls-fate-series-fate-grand-order-stheno-fate-grand-order-hd-wallpaper-preview.jpg" style="width: 70%; min-width: 300px; display: block; margin: auto;"> |
|
|
|
A Stheno-1.8 variant focused on writing.
|
|
|
Stheno-1.8 + Storywriter, mixed with the Holodeck + Spring Dragon qLoRA. The end result is then mixed with one more experimental literature-based LoRA.
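(For readers unfamiliar with the workflow: folding a (q)LoRA into a merged base model, as described above, usually looks roughly like the sketch below using `peft`. Both model ids are placeholders, not the exact checkpoints used for SthenoWriter.)

```python
# Rough illustration of merging a (q)LoRA adapter into a base model with peft.
# Both ids are placeholders, not the actual artifacts used for SthenoWriter.
import torch
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained(
    "placeholder/stheno-storywriter-merge",  # placeholder: Stheno-1.8 + Storywriter merge
    torch_dtype=torch.float16,
)
with_lora = PeftModel.from_pretrained(base, "placeholder/literature-lora")  # placeholder adapter
merged = with_lora.merge_and_unload()  # bake the LoRA weights into the base model
merged.save_pretrained("sthenowriter-candidate")
```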
|
|
|
Re-Reviewed... it's not bad, honestly. |
|
|
|
Support me [here](https://ko-fi.com/sao10k) :) |
|
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard) |
|
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__SthenoWriter-L2-13B) |
|
|
|
| Metric               | Value |
|----------------------|-------|
| Avg.                 | 48.35 |
| ARC (25-shot)        | 62.29 |
| HellaSwag (10-shot)  | 83.28 |
| MMLU (5-shot)        | 56.14 |
| TruthfulQA (0-shot)  | 44.72 |
| Winogrande (5-shot)  | 74.35 |
| GSM8K (5-shot)       | 11.22 |
| DROP (3-shot)        | 6.48  |
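Scores like these typically come from EleutherAI's lm-evaluation-harness; below is a rough sketch of re-running a couple of the benchmarks locally. The task names and harness version are assumptions, and the leaderboard's exact configuration may differ.

```python
# Rough reproduction sketch with lm-eval (EleutherAI lm-evaluation-harness >= 0.4).
# Task names and the exact leaderboard/harness configuration are assumptions.
import lm_eval

for task, shots in [("arc_challenge", 25), ("hellaswag", 10)]:
    results = lm_eval.simple_evaluate(
        model="hf",
        model_args="pretrained=Sao10K/SthenoWriter-L2-13B,dtype=float16",
        tasks=[task],
        num_fewshot=shots,
    )
    print(task, results["results"][task])
```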