Please use the latest release of llama.cpp (after b2100), which fixed bugs in MiniCPM support. If you want to convert the model yourself, refer to this link.
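For reference, a typical llama.cpp conversion workflow looks roughly like the following. This is a minimal sketch, not the linked instructions; the model path and output filename are illustrative assumptions, and the exact steps for MiniCPM may differ:

```shell
# Build llama.cpp at a release newer than b2100 (which fixed MiniCPM bugs)
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make
pip install -r requirements.txt

# Convert a local Hugging Face checkout to GGUF
# (/path/to/MiniCPM and the output name are assumptions)
python convert-hf-to-gguf.py /path/to/MiniCPM --outfile minicpm.gguf

# Run the converted model
./main -m minicpm.gguf -p "Hello"
```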