---
license: mit
---
# Model Card

## Model Description

This is a Large Language Model (LLM) trained on the DIBT_10k_prompts dataset. It is a test model to verify that I can publish models.

## Evaluation Results

### Hellaswag

|  Tasks  |Version|Filter|n-shot| Metric |   |Value |   |Stderr|
|---------|------:|------|-----:|--------|---|-----:|---|-----:|
|hellaswag|      1|none  |     0|acc     |↑  |0.2882|±  |0.0048|
|         |       |none  |     0|acc_norm|↑  |0.3082|±  |0.0047|
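The table above matches the output format of EleutherAI's lm-evaluation-harness. A minimal sketch of reproducing a zero-shot Hellaswag run with its Python API is below; the repo id is a placeholder, since this card does not name one.

```python
# Sketch of a zero-shot Hellaswag evaluation with lm-evaluation-harness.
# "your-username/your-model" is a placeholder repo id, not this model's real id.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=your-username/your-model",
    tasks=["hellaswag"],
    num_fewshot=0,
)
print(results["results"]["hellaswag"])  # acc / acc_norm with stderr
```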


## How to Use This Model
Download the checkpoint and load it into your preferred deep-learning framework. 
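For example, assuming the checkpoint is a standard Hugging Face transformers causal-LM (the repo id below is a placeholder):

```python
# Minimal sketch of loading the checkpoint and generating text with transformers.
# "your-username/your-model" is a placeholder repo id, not this model's real id.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("your-username/your-model")
model = AutoModelForCausalLM.from_pretrained("your-username/your-model")

inputs = tokenizer("Hello, world!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```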

Don't use this model; use EleutherAI/pythia-160m instead.