etri-ones-solar

Model Details

Model Developers

  • The model is fine-tuned on an open instruction dataset.

Model Architecture

  • This model is an auto-regressive language model based on the SOLAR transformer architecture.

Base Model

Training Dataset


Model comparisons 1

Coming soon

| Model | Average | Ko-ARC | Ko-HellaSwag | Ko-MMLU | Ko-TruthfulQA | Ko-CommonGen V2 |
| --- | --- | --- | --- | --- | --- | --- |
| [...your_model_name...] | NaN | NaN | NaN | NaN | NaN | NaN |

Model comparisons 2

AI-Harness evaluation; link

| Model | Copa (0-shot) | Copa (5-shot) | HellaSwag (0-shot) | HellaSwag (5-shot) | BoolQ (0-shot) | BoolQ (5-shot) | Sentineg (0-shot) | Sentineg (5-shot) |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| [...your_model_name...] | NaN | NaN | NaN | NaN | NaN | NaN | NaN | NaN |
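The scores above could in principle be reproduced with EleutherAI's lm-evaluation-harness (the "AI-Harness"). A minimal sketch follows; the KoBEST task names and the placeholder repo id are assumptions, since the card does not pin a harness version or task list:

```shell
# Install the evaluation harness (assumption: a recent lm-eval release)
pip install lm-eval

# Run the KoBEST tasks shown in the table, 0-shot.
# Replace [...your_model_repo...] with the actual Hub repo id.
lm_eval --model hf \
  --model_args pretrained=[...your_model_repo...],dtype=float16 \
  --tasks kobest_copa,kobest_hellaswag,kobest_boolq,kobest_sentineg \
  --num_fewshot 0 \
  --batch_size 8
```

Re-running with `--num_fewshot 5` would fill the 5-shot columns.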

Implementation Code

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load the model and tokenizer from the Hugging Face Hub
repo = "[...your_model_repo...]"
model = AutoModelForCausalLM.from_pretrained(
    repo,
    return_dict=True,
    torch_dtype=torch.float16,  # half precision to reduce memory use
    device_map="auto",          # spread layers across available devices
)
tokenizer = AutoTokenizer.from_pretrained(repo)
```
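Once the model and tokenizer are loaded, inference needs a prompt. The card does not document the template used during instruction tuning, so the Alpaca-style format below is an assumption, shown only as a sketch:

```python
# Hypothetical prompt template (assumption: the card does not specify
# the instruction format used in fine-tuning).
def build_prompt(instruction: str) -> str:
    return (
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_prompt("한국의 수도는 어디인가요?")
```

The resulting string would then be tokenized (e.g. `tokenizer(prompt, return_tensors="pt")`) and passed to the model's `generate` method.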
