flyingfishinwater / good_and_small_models

Tags: GGUF · Inference Endpoints · conversational
Files and versions (branch: main)
1 contributor · History: 107 commits
Latest commit: "Update README.md" (5e59e60, verified) by flyingfishinwater, 7 days ago
All files below are marked Safe by the Hugging Face file scanner.

| File | Size | LFS | Last commit | When |
|---|---|---|---|---|
| .gitattributes | 4.42 kB | — | Upload smollm2-1.7b-instruct-q4_k_m.gguf | 7 days ago |
| FinGPT-7B-Q3_K_M.gguf | 3.3 GB | LFS | Upload 2 files | 4 months ago |
| Llama-3.2-3B-Instruct-Q4_K_M.gguf | 2.02 GB | LFS | Upload Llama-3.2-3B-Instruct-Q4_K_M.gguf | about 1 month ago |
| Mistral-7B-Instruct-v0.3.Q3_K_M.gguf | 3.52 GB | LFS | Upload Mistral-7B-Instruct-v0.3.Q3_K_M.gguf | 6 months ago |
| Phi-3-mini-128k-instruct-LLaMAfied.Q4_K_S.gguf | 2.19 GB | LFS | Upload 2 files | 6 months ago |
| Phi-3-mini-128k-instruct.Q3_K_S.gguf | 1.68 GB | LFS | Upload 2 files | 6 months ago |
| Phi-3-mini-4k-instruct-q4.gguf | 2.32 GB | LFS | Upload Phi-3-mini-4k-instruct-q4.gguf | 6 months ago |
| Phi-3-mini-f16-mmproj.gguf | 608 MB | LFS | Upload Phi-3-mini-f16-mmproj.gguf | 6 months ago |
| README.md | 27.9 kB | — | Update README.md | 7 days ago |
| Yi-1.5-6B-Q3_K_M.gguf | 2.99 GB | LFS | Upload Yi-1.5-6B-Q3_K_M.gguf | 6 months ago |
| dolphin-2.9.2-qwen2-7b-Q3_K_S.gguf | 3.49 GB | LFS | Upload 2 files | 5 months ago |
| dolphin-2.9.4-gemma2-2b-Q4_K_M.gguf | 1.71 GB | LFS | Upload dolphin-2.9.4-gemma2-2b-Q4_K_M.gguf | 2 months ago |
| gemma-2-9b-it-Q3_K_L.gguf | 5.13 GB | LFS | Upload gemma-2-9b-it-Q3_K_L.gguf | 4 months ago |
| gemma-2b-it-q8_0.gguf | 2.67 GB | LFS | Upload gemma-2b-it-q8_0.gguf | 8 months ago |
| llama-3.1-whiterabbitneo-2-8b-q4_k_m.gguf | 4.92 GB | LFS | Upload llama-3.1-whiterabbitneo-2-8b-q4_k_m.gguf | about 1 month ago |
| openchat-3.6-8b-20240522-Q3_K_M.gguf | 4.02 GB | LFS | Upload openchat-3.6-8b-20240522-Q3_K_M.gguf | 6 months ago |
| qwen2.5-1.5b-instruct-q4_k_m.gguf | 1.12 GB | LFS | Upload qwen2.5-1.5b-instruct-q4_k_m.gguf | about 1 month ago |
| qwen2.5-3b-instruct-q4_k_m.gguf | 2.1 GB | LFS | Upload qwen2.5-3b-instruct-q4_k_m.gguf | about 2 months ago |
| reader-lm-1.5b-Q4_K_M.gguf | 986 MB | LFS | Upload reader-lm-1.5b-Q4_K_M.gguf | about 1 month ago |
| smollm2-1.7b-instruct-q4_k_m.gguf | 1.06 GB | LFS | Upload smollm2-1.7b-instruct-q4_k_m.gguf | 7 days ago |
| starcoder2-3b-instruct-gguf_Q8_0.gguf | 3.22 GB | LFS | Upload starcoder2-3b-instruct-gguf_Q8_0.gguf | 8 months ago |