LeroyDyer/SpydazWebAI_MultiModel_001_Project
Tags: Transformers, Safetensors, English, chemistry, biology, code, medical, Inference Endpoints
License: apache-2.0
Latest commit: f64e370 (verified) — "Update README.md" by LeroyDyer, 3 months ago. 1 contributor, 8 commits.
| Name                              | Size      | LFS | Last commit message              | Updated      |
|-----------------------------------|-----------|-----|----------------------------------|--------------|
| Image/                            | –         | –   | Upload folder using huggingface_hub | 8 months ago |
| Multi/                            | –         | –   | Update Multi/config.json         | 8 months ago |
| Sound/                            | –         | –   | Upload folder using huggingface_hub | 8 months ago |
| Video/                            | –         | –   | Upload folder using huggingface_hub | 8 months ago |
| .gitattributes                    | 1.52 kB   | –   | initial commit                   | 8 months ago |
| README.md                         | 5.46 kB   | –   | Update README.md                 | 3 months ago |
| config.json                       | 10.1 kB   | –   | Update config.json               | 7 months ago |
| model-00001-of-00008.safetensors  | 1.98 GB   | LFS | Upload folder using huggingface_hub | 8 months ago |
| model-00002-of-00008.safetensors  | 1.95 GB   | LFS | Upload folder using huggingface_hub | 8 months ago |
| model-00003-of-00008.safetensors  | 1.97 GB   | LFS | Upload folder using huggingface_hub | 8 months ago |
| model-00004-of-00008.safetensors  | 1.98 GB   | LFS | Upload folder using huggingface_hub | 8 months ago |
| model-00005-of-00008.safetensors  | 1.95 GB   | LFS | Upload folder using huggingface_hub | 8 months ago |
| model-00006-of-00008.safetensors  | 1.92 GB   | LFS | Upload folder using huggingface_hub | 8 months ago |
| model-00007-of-00008.safetensors  | 1.95 GB   | LFS | Upload folder using huggingface_hub | 8 months ago |
| model-00008-of-00008.safetensors  | 789 MB    | LFS | Upload folder using huggingface_hub | 8 months ago |
| model.safetensors.index.json      | 22.8 kB   | –   | Upload folder using huggingface_hub | 8 months ago |
| special_tokens_map.json           | 625 Bytes | –   | Upload folder using huggingface_hub | 8 months ago |
| tokenizer.json                    | 1.8 MB    | –   | Upload folder using huggingface_hub | 8 months ago |
| tokenizer.model                   | 493 kB    | LFS | Upload folder using huggingface_hub | 8 months ago |
| tokenizer_config.json             | 1.01 kB   | –   | Upload folder using huggingface_hub | 8 months ago |
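The checkpoint above is split into eight shards following the standard Hugging Face sharded-safetensors naming pattern (`model-XXXXX-of-00008.safetensors`, with `model.safetensors.index.json` mapping tensors to shards). As a minimal sketch — not code from this repo — the snippet below reconstructs the shard file names from that convention and sums the sizes listed on the page to estimate the total checkpoint size:

```python
# Shard sizes in GB as listed on the repo page (the last shard is 789 MB).
shard_sizes_gb = [1.98, 1.95, 1.97, 1.98, 1.95, 1.92, 1.95, 0.789]

# Shards follow the pattern model-XXXXX-of-00008.safetensors,
# with both indices zero-padded to five digits.
num_shards = len(shard_sizes_gb)
shard_names = [
    f"model-{i:05d}-of-{num_shards:05d}.safetensors"
    for i in range(1, num_shards + 1)
]

# Rough total on-disk size of the weights.
total_gb = sum(shard_sizes_gb)
print(shard_names[0])            # model-00001-of-00008.safetensors
print(f"{total_gb:.2f} GB total")
```

This puts the full set of weight shards at roughly 14.5 GB, which is worth knowing before cloning the repository, since every `LFS`-flagged file is downloaded in full by default.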