---
title: Mmmm
emoji: 🚀
colorFrom: red
colorTo: indigo
sdk: docker
sdk_version: 5.4.0
app_file: app.py
pinned: false
license: bigscience-openrail-m
duplicated_from: ysharma/ChatGPT4
disable_embedding: true
datasets:
  - allenai/WildChat-1M
  - allenai/WildChat-1M-Full
  - allenai/WildChat
models:
  - allenai/WildLlama-7b-user-assistant
  - allenai/WildLlama-7b-assistant-only
short_description: nbb
---

Use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="allenai/WildLlama-7b-assistant-only")
```
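A minimal usage sketch with the pipeline above; the prompt string and `max_new_tokens` value are illustrative, not prescribed by the model card:

```python
# Generate a short completion; the pipeline returns a list of dicts
# with the generated text under "generated_text".
output = pipe("What is the capital of France?", max_new_tokens=50)
print(output[0]["generated_text"])
```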

Load the model directly:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("allenai/WildLlama-7b-assistant-only")
model = AutoModelForCausalLM.from_pretrained("allenai/WildLlama-7b-assistant-only")
```
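A minimal generation sketch with the directly loaded model, assuming a plain-text prompt; the prompt and generation parameters are illustrative:

```python
# Tokenize an illustrative prompt, generate a continuation, and decode it.
inputs = tokenizer("What is the capital of France?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```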