MarsupialAI/Celeste-12B-V1.6_iMatrix_GGUF
Tags: Transformers, GGUF, imatrix, conversational, English, Not-For-All-Audiences, Inference Endpoints
Datasets: nothingiisreal/c2-logs-cleaned, kalomaze/Opus_Instruct_25k, nothingiisreal/Reddit-Dirty-And-WritingPrompts
License: apache-2.0
Branch: main (1 contributor, history: 11 commits)
Latest commit: MarsupialAI, "Upload 4 files", 62bea93 (verified), 4 months ago
File | Size | Storage | Last commit | Uploaded
.gitattributes | 2.41 kB | - | Upload 4 files | 4 months ago
Celeste-12B-V1.6-imatrix.dat | 7.05 MB | LFS | Upload 2 files | 4 months ago
Celeste-12B-V1.6_Q3km.gguf | 6.08 GB | LFS | Upload Celeste-12B-V1.6_Q3km.gguf | 4 months ago
Celeste-12B-V1.6_Q4km.gguf | 7.48 GB | LFS | Upload Celeste-12B-V1.6_Q4km.gguf | 4 months ago
Celeste-12B-V1.6_Q4ks.gguf | 7.12 GB | LFS | Upload Celeste-12B-V1.6_Q4ks.gguf | 4 months ago
Celeste-12B-V1.6_Q5km.gguf | 8.73 GB | LFS | Upload 2 files | 4 months ago
Celeste-12B-V1.6_Q5ks.gguf | 8.52 GB | LFS | Upload Celeste-12B-V1.6_Q5ks.gguf | 4 months ago
Celeste-12B-V1.6_Q6k.gguf | 10.1 GB | LFS | Upload 2 files | 4 months ago
Celeste-12B-V1.6_fp16.gguf | 24.5 GB | LFS | Upload Celeste-12B-V1.6_fp16.gguf | 4 months ago
Celeste-12B-V1.6_iQ2m.gguf | 4.44 GB | LFS | Upload 2 files | 4 months ago
Celeste-12B-V1.6_iQ2xxs.gguf | 3.59 GB | LFS | Upload 2 files | 4 months ago
Celeste-12B-V1.6_iQ3m.gguf | 5.72 GB | LFS | Upload 4 files | 4 months ago
Celeste-12B-V1.6_iQ3xxs.gguf | 4.95 GB | LFS | Upload 4 files | 4 months ago
Celeste-12B-V1.6_iQ4nl.gguf | 7.1 GB | LFS | Upload 4 files | 4 months ago
Celeste-12B-V1.6_iQ4xs.gguf | 6.74 GB | LFS | Upload 4 files | 4 months ago
README.md | 344 Bytes | - | Create README.md | 4 months ago
groups_merged.txt | 201 kB | - | Upload 2 files | 4 months ago
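
For reference, a minimal sketch of fetching one of the quantized files listed above with huggingface_hub. The repo id and filename are taken directly from this listing; the choice of the Q4km quant and whatever GGUF runtime you load it with afterwards (for example llama.cpp) are assumptions about your local setup, not something this page specifies.

# Minimal sketch: download one GGUF file from this repository.
# Repo id and filename come from the file table above; the chosen quant
# (Q4km, 7.48 GB) is an arbitrary example.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="MarsupialAI/Celeste-12B-V1.6_iMatrix_GGUF",
    filename="Celeste-12B-V1.6_Q4km.gguf",
)
# model_path is the local cached path to the .gguf file, which can then be
# passed to a GGUF-compatible runtime of your choice.
print(model_path)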