Demo not working
Can't load tokenizer using from_pretrained, please update its configuration: <class 'transformers.models.vision_encoder_decoder.configuration_vision_encoder_decoder.VisionEncoderDecoderConfig'>
Hello asifmian, I am not completely sure what you mean by "demo," but I will answer the two items I believe you are referencing. If these do not cover what you are looking for, feel free to respond with more details about the error.
First, if the demo you are referencing is the 'Hosted Inference API' widget on the upper right-hand side of the 'Model card' tab, that is not connected to my model.
Second, the tokenizer should be accessed through the processor, which is loaded from the base/untrained checkpoint, like this: `processor = TrOCRProcessor.from_pretrained("microsoft/trocr-base-printed")`.
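To make that concrete, here is a minimal sketch of loading the processor from the base checkpoint and grabbing the tokenizer off it (the fine-tuned repo id below is a placeholder — swap in your own model's id):

```python
from transformers import TrOCRProcessor, VisionEncoderDecoderModel

# The processor bundles both the image feature extractor and the tokenizer,
# so load it from the base/untrained checkpoint rather than the fine-tuned one.
processor = TrOCRProcessor.from_pretrained("microsoft/trocr-base-printed")
tokenizer = processor.tokenizer  # the tokenizer lives on the processor

# The fine-tuned weights load separately (placeholder repo id):
# model = VisionEncoderDecoderModel.from_pretrained("your-username/your-trocr-model")
```

This avoids the `Can't load tokenizer` error, since the fine-tuned repo may not ship its own tokenizer files.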
I hope this helps! And as I said above, if these answers do not target the error you are receiving, feel free to respond with more details about where and when it occurs; I will do my best to help.