How to use the model? (1)
#8 opened 2 months ago by AIer0107
How do I use this? Can't load the t5 GGUF with the clip-l safetensor. (10)
#6 opened 3 months ago by MANOFAi94
For the fastest inference on 12 GB VRAM, are the following GGUF models appropriate to use? (3)
#4 opened 3 months ago by ViratX
Comparisons to FP8 e4m3fn? (1)
#3 opened 3 months ago by NielsGx
Where do I put it? Which folder, please? (1)
#2 opened 3 months ago by Ashkacha
How do I load t5-v1_1-xxl-encoder-gguf? (11)
#1 opened 3 months ago by YuFeiLiu