Inference error in transformers 4.42.1
#58
by kang1 · opened
The related issue is: https://github.com/huggingface/transformers/issues/31678
GLM-4 now supports transformers 4.44.0; you can use it with our latest model, glm-4-9b-chat.
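A minimal sketch of what the upgrade path looks like, assuming the standard transformers `AutoModel` API and the `THUDM/glm-4-9b-chat` repo id (both are assumptions, not confirmed by this thread). The version check mirrors the fix: the error appears on 4.42.1, so we gate loading on 4.44.0 or newer.

```python
# Requires: pip install "transformers>=4.44.0"
# The inference error reported against 4.42.1 (see transformers issue #31678)
# is resolved as of 4.44.0, so check the installed version before loading.

def supports_glm4(ver: str) -> bool:
    """Return True if a transformers version string is >= 4.44.0."""
    parts = tuple(int(p) for p in ver.split(".")[:3])
    return parts >= (4, 44, 0)

# Loading sketch (downloads ~18 GB of weights; run only with enough GPU/RAM).
# Repo id and trust_remote_code usage are assumptions for illustration:
#
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tok = AutoTokenizer.from_pretrained("THUDM/glm-4-9b-chat",
#                                     trust_remote_code=True)
# model = AutoModelForCausalLM.from_pretrained("THUDM/glm-4-9b-chat",
#                                              trust_remote_code=True)
```

With this guard in place, 4.42.1 (the broken version from the title) is rejected while 4.44.0 and later pass.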
zRzRzRzRzRzRzR changed discussion status to closed