Apply for community grant: Academic project (gpu)

#1
by GGLS - opened
FuseAI org

FuseChat-7B-VaRM is a fusion of three prominent chat LLMs with diverse architectures and scales, namely NH2-Mixtral-8x7B, NH2-Solar-10.7B, and OpenChat-3.5-7B, using the approach proposed in "FuseChat: Knowledge Fusion of Chat Models". FuseChat-7B-VaRM achieves state-of-the-art performance among 7B LLMs on MT-Bench, outperforming powerful chat LLMs such as Starling-7B, Yi-34B-Chat, and Tulu-2-DPO-70B, surpassing GPT-3.5 (March) and Claude-2.1, and approaching Mixtral-8x7B-Instruct.
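For reference, a minimal sketch of how the model behind this Space could be loaded with transformers; the repo ID FuseAI/FuseChat-7B-VaRM and the chat-template usage are assumptions based on the model card, not details from this thread:

```python
# Minimal sketch (assumed repo ID and chat template), not the Space's actual app code.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FuseAI/FuseChat-7B-VaRM"  # assumed Hub repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build a single-turn prompt with the model's own chat template.
messages = [{"role": "user", "content": "Explain knowledge fusion of chat models in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```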

Hi @GGLS , we have assigned a GPU to this Space. Note that GPU grants are provided temporarily and might be removed after some time if usage is very low.

To learn more about GPUs in Spaces, please check out https://huggingface.co/docs/hub/spaces-gpus

FuseAI org

Many thanks to you and the HF team for your help!👏
