Not working properly (Q5_K_M) but I am not very experienced
Using it both on the command line and in Ollama, it simply didn't answer anything. When I hit return again, I got either nothing or a strange response. I am going to download a different version and try that, but I'm putting this here in case others have the same issue.
@mlabonne Thanks for doing these!
what model are you running?
can you answer?
Welcome to the virtual bookshelf! I'm here to help you find a new book recommendation. Can you please tell me what type of books you're in the mood for? Would you like something based on a specific theme, genre, or author?
Did you try if this one works?
https://ollama.com/mannix/llama3.1-8b-abliterated:q5_k_m
How do I get that model? I cannot find a Hugging Face version of it, I cannot find a download button or option on the page you linked me to, and I cannot pull it with ollama pull from the command line. I know this is a terribly basic question, but I am still getting started with this.
You just enter this in the terminal:
ollama run mannix/llama3-uncensored
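For reference, ollama run downloads the model automatically if it isn't already on your machine, so a separate pull isn't strictly required. A minimal sketch of the two-step approach, assuming a standard Ollama install:

```shell
# Download the model weights without starting a chat session
ollama pull mannix/llama3-uncensored

# Start an interactive session with the downloaded model
ollama run mannix/llama3-uncensored
```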
@i4one
As suggested by @bhaswata08, you can use the run command in the terminal to pull the model, more precisely this one:
ollama run mannix/llama3.1-8b-abliterated:q5_k_m
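If that keeps failing, it may help to split the steps so you can see where it breaks. A rough sequence, assuming the default Ollama CLI:

```shell
# Pull the exact quantization tag first
ollama pull mannix/llama3.1-8b-abliterated:q5_k_m

# Check that the model shows up in the local list
ollama list

# Then start a chat with it
ollama run mannix/llama3.1-8b-abliterated:q5_k_m
```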
I had actually tried both of those commands before I posted, both pull and run, with no luck.
It starts with >pulling manifest
and then I get an error: >Error: Incorrect Function
There was an issue with this error raised on GitHub, and it looked like they pushed a fix. I did the full upgrade per the docs.openwebui instructions, but I still get the same error.
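One thing worth checking, in case the upgrade didn't actually take effect: the installed Ollama version and whether the server is reachable. A quick sanity check, assuming the standard CLI flag and the default port:

```shell
# Print the installed Ollama version
ollama --version

# The server should reply "Ollama is running" on its default port
curl http://localhost:11434
```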
@i4one
Try running Ollama with Docker then. Spin up an instance and pull the model there.
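In case it helps, here is a minimal sketch of what that could look like, following the Docker setup described in the Ollama README (the container name and volume name are just placeholders, and this is the CPU-only variant):

```shell
# Start an Ollama server in a container, persisting models in a named volume
docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# Pull and run the model inside the running container
docker exec -it ollama ollama run mannix/llama3.1-8b-abliterated:q5_k_m
```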