Unable to load with transformers library as config files are missing.
#4 opened by mlwithaarvy
The respective config files are missing.
This is a GGML version of the model, so I'm afraid you can't use it with transformers.
Try https://huggingface.co/eachadea/vicuna-7b-1.1 instead.
Is there any guide to turning one of these GGML models into an API?
I am using Flask and the Transformers library, but everything only works in the local terminal as described, not in Flask via the Transformers API. I have searched every possible source.
Sure, it's actually pretty simple to get started (you don't need transformers for that):
- Download the model of your choice in GGML format and place it inside a local folder
- Install llama-cpp-python, which provides Python bindings for llama.cpp
- Follow the llama-cpp-python instructions to call the model from Python, and make sure that this works on its own
- Then make the same call from inside the Flask application; a minimal sketch follows below
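Here is a minimal sketch of what that last step could look like. The model path, route name, and generation parameters are placeholders you would adapt to your own setup; it assumes llama-cpp-python and Flask are already installed.

```python
from flask import Flask, request, jsonify
from llama_cpp import Llama

app = Flask(__name__)

# Load the GGML model once at startup (the path is an example, point it at
# whatever file you downloaded into your local folder).
llm = Llama(model_path="./models/vicuna-7b-ggml.bin", n_ctx=2048)

@app.route("/generate", methods=["POST"])
def generate():
    # Expect a JSON body like {"prompt": "..."}
    prompt = request.json.get("prompt", "")
    # llama-cpp-python returns an OpenAI-style completion dict with a
    # "choices" list containing the generated text.
    output = llm(prompt, max_tokens=256, stop=["</s>"])
    return jsonify({"text": output["choices"][0]["text"]})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

You can then POST a JSON body with a `prompt` field to `/generate` (e.g. with curl or requests) and get the completion back as JSON. Keep in mind that loading the model inside the Flask process is what matters here; the transformers library never enters the picture for GGML files.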