tokyotech-llm / Swallow-7b-plus-hf
Text Generation · Transformers · PyTorch · English · Japanese · llama · text-generation-inference · Inference Endpoints
Papers: arXiv:2404.17790, arXiv:2404.17733
License: llama2
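The tags above describe a llama-architecture causal language model served through the Transformers library in PyTorch. Below is a minimal loading sketch assuming the standard AutoTokenizer/AutoModelForCausalLM API and the repository id shown on this page; the dtype, device placement, and prompt are illustrative choices, not settings taken from the model card.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repository id as listed on this page.
model_id = "tokyotech-llm/Swallow-7b-plus-hf"

# bfloat16 and device_map="auto" (requires `accelerate`) are illustrative,
# not values prescribed by the model card.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Plain text generation; the model is tagged for English and Japanese.
prompt = "東京工業大学の主なキャンパスは、"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))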
Swallow-7b-plus-hf · 2 contributors · History: 17 commits
Latest commit c8f9eda (verified, 8 months ago) by SFconvertbot: Adding `safetensors` variant of this model
File | Size | Last commit | Updated
.gitattributes | 1.56 kB | Upload logo.png | 9 months ago
LICENSE.txt | 7.02 kB | Upload LICENSE.txt | 9 months ago
README.md | 10.2 kB | Upload README.md | 9 months ago
config.json | 689 Bytes | Update config.json | 9 months ago
generation_config.json | 203 Bytes | Upload generation_config.json | 9 months ago
logo.png | 1.91 MB (LFS) | Upload logo.png | 9 months ago
model-00001-of-00002.safetensors | 9.98 GB (LFS) | Adding `safetensors` variant of this model | 8 months ago
model-00002-of-00002.safetensors | 3.68 GB (LFS) | Adding `safetensors` variant of this model | 8 months ago
model.safetensors.index.json | 28.1 kB | Adding `safetensors` variant of this model | 8 months ago
pytorch_model-00001-of-00002.bin | 9.98 GB (LFS, pickle) | Upload pytorch_model-00001-of-00002.bin | 9 months ago
pytorch_model-00002-of-00002.bin | 3.68 GB (LFS, pickle) | Upload pytorch_model-00002-of-00002.bin | 9 months ago
pytorch_model.bin.index.json | 26.8 kB | Upload pytorch_model.bin.index.json | 9 months ago
requirements.txt | 53 Bytes | Upload requirements.txt | 9 months ago
special_tokens_map.json | 457 Bytes | Upload special_tokens_map.json | 9 months ago
tokenizer.model | 914 kB (LFS) | Upload tokenizer.model | 9 months ago
tokenizer_config.json | 773 Bytes | Upload tokenizer_config.json | 9 months ago

All files are flagged Safe by the Hub's file scanner. Both pytorch_model-*.bin shards are pickle files; the detected pickle imports are "collections.OrderedDict", "torch._utils._rebuild_tensor_v2", and "torch.BFloat16Storage".
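Since the repository ships both pickle-based pytorch_model-*.bin shards and a safetensors variant, a loader can ask Transformers to prefer the safetensors weights. The sketch below uses the use_safetensors flag as a hedged illustration of that preference; recent Transformers releases typically pick the safetensors files on their own when both variants exist, so the flag mainly documents intent.

from transformers import AutoModelForCausalLM

# Prefer the `safetensors` shards over the pickle-based .bin files.
model = AutoModelForCausalLM.from_pretrained(
    "tokyotech-llm/Swallow-7b-plus-hf",
    use_safetensors=True,
)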