@miaogu The inference API widget sometimes cuts the response short; please check this issue for more details. If the widget truncates the output, you may want to run the model yourself instead (see the sketch below).
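A minimal local-inference sketch, assuming the model id `Norm/nougat-latex-base` and compatibility with transformers' generic `NougatProcessor`; the repo also ships its own `NougatLaTexProcessor`, which may be required instead (see the repo README). The image path is hypothetical.

```python
# Local inference sketch to avoid the widget's truncation.
# Assumptions: model id "Norm/nougat-latex-base" works with transformers'
# NougatProcessor; the repo's NougatLaTexProcessor may be needed instead.
import torch
from PIL import Image
from transformers import NougatProcessor, VisionEncoderDecoderModel

model_id = "Norm/nougat-latex-base"  # assumed Hugging Face model id
processor = NougatProcessor.from_pretrained(model_id)
model = VisionEncoderDecoderModel.from_pretrained(model_id)
model.eval()

image = Image.open("formula.png").convert("RGB")  # hypothetical input image
pixel_values = processor(images=image, return_tensors="pt").pixel_values

with torch.no_grad():
    # A generous max_new_tokens avoids the truncation seen in the widget.
    outputs = model.generate(pixel_values, max_new_tokens=512)

latex = processor.batch_decode(outputs, skip_special_tokens=True)[0]
print(latex)
```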
@Norm Hi, sorry... where is the link to the issue?
@miaogu Here: https://github.com/NormXU/nougat-latex-ocr/issues/2 Sorry for the confusion.