AWQ quantization of Norquinal/Mistral-7B-claude-chat, calibrated on the pileval dataset.
{ "zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM" }
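The config above means weights are stored in 4 bits, quantized asymmetrically (with a zero-point) in groups of 128, packed for the GEMM kernel. As a rough illustration (not from the model card), the per-group scale and zero-point add a small overhead on top of the nominal 4 bits per weight; the sketch below estimates it, assuming one fp16 scale and one packed int4 zero-point per group:

```python
# Quantization settings for this AWQ export, as stated on the model card.
quant_config = {
    "zero_point": True,      # asymmetric quantization (zero-point offset per group)
    "q_group_size": 128,     # one scale/zero-point shared per 128 weights
    "w_bit": 4,              # 4-bit weight storage
    "version": "GEMM",       # packing layout for the AWQ GEMM kernel
}

# Rough effective bits per weight, ASSUMING each group of 128 int4 weights
# shares one fp16 scale (16 bits) and one int4 zero-point (4 bits).
overhead_bits = (16 + 4) / quant_config["q_group_size"]
bits_per_weight = quant_config["w_bit"] + overhead_bits
print(bits_per_weight)  # ~4.16 bits per weight
```

This back-of-the-envelope figure shows why AWQ checkpoints land close to, but slightly above, a pure 4-bit size.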