🐘 Gaja

Gaja is a Hindi/Hinglish chat model, initially trained on SarvamAI's OpenHathi model and further fine-tuned for conversational interactions.

Additional Information

  • It outperforms Airavata, AI4Bharat's chat model, on the Hugging Face Open LLM Leaderboard benchmark suite.
  • It was fine-tuned on only 5k samples.

Inference

Thanks to Bhabha AI, you can now try the model directly.

Additional Information

  • The inference code is available on GitHub.

💬 Prompt template

<|im_start|>user
{}<|im_end|> 
<|im_start|>assistant
{}<|im_end|> 
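The template above can be applied programmatically. Below is a minimal sketch of a helper that wraps a user message in this ChatML-style format; the `build_prompt` function name is illustrative and not part of the model card.

```python
def build_prompt(user_message: str) -> str:
    """Wrap a user message in Gaja's <|im_start|>/<|im_end|> chat template,
    leaving the assistant turn open for the model to complete."""
    return (
        f"<|im_start|>user\n{user_message}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

prompt = build_prompt("Namaste! Aap kaise hain?")
print(prompt)
```

The resulting string can be passed to the tokenizer as the generation prompt, with generation stopped at the `<|im_end|>` token.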

😎 Features:

  • Language Support: Gaja is designed to understand and generate responses in both Hindi and Hinglish, catering to a diverse range of users.
  • Base Model: Built upon SarvamAI's OpenHathi model, Gaja inherits its foundational capabilities while being optimized for conversational tasks.
  • Fine-tuning: Gaja has undergone fine-tuning specifically for chat-based interactions, enhancing its ability to engage in meaningful conversations with users.
  • Experimental Platform: With its flexibility and adaptability, Gaja serves as a valuable platform for conducting experiments and exploring innovative approaches to chatbot development.

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric                              | Value |
|-------------------------------------|-------|
| Avg.                                | 46.98 |
| AI2 Reasoning Challenge (25-Shot)   | 51.79 |
| HellaSwag (10-Shot)                 | 75.79 |
| MMLU (5-Shot)                       | 40.69 |
| TruthfulQA (0-shot)                 | 41.50 |
| Winogrande (5-shot)                 | 71.90 |
| GSM8k (5-shot)                      |  0.23 |
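The reported average is the unweighted mean of the six benchmark scores, which a quick arithmetic check confirms:

```python
# Unweighted mean of the six Open LLM Leaderboard scores reported above.
scores = {
    "ARC (25-shot)": 51.79,
    "HellaSwag (10-shot)": 75.79,
    "MMLU (5-shot)": 40.69,
    "TruthfulQA (0-shot)": 41.50,
    "Winogrande (5-shot)": 71.90,
    "GSM8k (5-shot)": 0.23,
}
avg = sum(scores.values()) / len(scores)
print(round(avg, 2))  # 46.98
```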
Model size: 6.87B parameters (FP16, Safetensors)
