Model Overview

OLMo-1B-Base-Shakespeare is a fine-tuned version of the allenai/OLMo-1B-0724-hf model, further trained on the complete works of William Shakespeare. The model aims to generate text in the style of Shakespeare's writing and has been optimized to capture the linguistic and stylistic nuances of the original texts.
Model Details
- Model Type: Base model
- Base Model: allenai/OLMo-1B-0724-hf
- Training Dataset: The complete works of William Shakespeare
- GPU VRAM Requirements: 25 GB (see the reduced-precision loading sketch after this list)
Intended Use Cases:
- Creative writing assistance
- Educational purposes for studying literary styles
- Text generation in the style of William Shakespeare
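If 25 GB of VRAM is not available, loading the weights in half precision is a common way to roughly halve memory use. A minimal sketch using the standard transformers API; the float16 choice is an illustrative assumption, not a setting from the original card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "sartajbhuvaji/OLMo-1B-Base-Shakespeare"

tokenizer = AutoTokenizer.from_pretrained(model_name)

# Load weights in float16 to roughly halve GPU memory use compared to
# float32. bfloat16 is an equally reasonable choice on recent GPUs.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # assumption: half precision fits your GPU
    device_map="cuda",
)
```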
Installation

Ensure you have the transformers library installed:

```bash
pip install transformers
```
Inference

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Fix the seed so the sampled output below is reproducible.
torch.random.manual_seed(0)

model_name = "sartajbhuvaji/OLMo-1B-Base-Shakespeare"

tokenizer = AutoTokenizer.from_pretrained(model_name)

# device_map="cuda" already places the weights on the GPU, so a separate
# model.to("cuda") call is not needed.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="cuda",
    torch_dtype="auto",
    trust_remote_code=True,
)

input_text = "Hello how are you?"
input_ids = tokenizer.encode(input_text, return_tensors="pt").to("cuda")

output = model.generate(
    input_ids,
    max_length=100,
    num_return_sequences=1,
    no_repeat_ngram_size=2,  # forbid repeating any 2-gram in the output
)

generated_text = tokenizer.decode(output[0], skip_special_tokens=True)
print(generated_text)
```
Example output:

```
Hello how are you?
SECOND GENTLEMAN. I am a gentleman.
The Duke, my lord, and all the court are yours.
Enter a MESSENGER
THIRD GENTSLE MAN. Here's a messenger. What news? What's the news,
sir? How doth your lady? Is she well? Or is she
hears'd, beaten, or slain? The news is, sir
```
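The settings above decode deterministically (apart from the n-gram constraint); sampling is a common way to get more varied Shakespeare-style text. A minimal sketch reusing the model and tokenizer loaded above; the temperature and top_p values are illustrative assumptions, not settings from the original card:

```python
# Sampling-based generation for more varied output.
output = model.generate(
    input_ids,
    max_length=100,
    do_sample=True,    # sample from the distribution instead of greedy decoding
    temperature=0.8,   # assumption: mild flattening of the distribution
    top_p=0.95,        # assumption: nucleus sampling cutoff
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```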
Finetuning Details
- Global Step: 4656
- Train Runtime: 2710.0517 seconds
- Train Samples per Second: 13.742
- Train Steps per Second: 1.718
- Epochs: 3.0
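These figures are mutually consistent: 1.718 steps/s × 2710.05 s ≈ 4656 steps, and 13.742 samples/s ÷ 1.718 steps/s ≈ 8, which suggests an effective batch size of about 8 samples per optimizer step (an inference from the logged numbers, not a figure stated in the card).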
Training Curve