How's T5 handling longer sequences?
#14 · opened by kelvinspire
I'm curious how T5 handles longer sequences behind the scenes. Does it chunk the inputs? Any ideas?
Don't trust me on this, but I think it can handle up to 512 tokens (and truncates after that). Unfortunately I can't remember where I saw this information, so I can't say whether it's correct :) 512 tokens is roughly 400 words.
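If it helps, here's a minimal sketch of what that truncation looks like with the Hugging Face `transformers` tokenizer (the checkpoint name `t5-small` is just an example). Note that T5 uses relative position embeddings, so 512 is the pretraining sequence length rather than a hard architectural cap; the tokenizer only truncates if you ask it to:

```python
# Minimal sketch using the Hugging Face transformers tokenizer.
# "t5-small" is just an illustrative checkpoint; other T5 variants behave the same.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")

# Deliberately build an input much longer than 512 tokens.
long_text = "summarize: " + "some sentence. " * 500

# Without truncation, the tokenizer emits more than 512 tokens
# (it may warn, since T5 was pretrained with 512-token sequences).
full = tokenizer(long_text)
print(len(full["input_ids"]))  # well over 512

# Explicitly truncating to the 512-token pretraining length:
truncated = tokenizer(long_text, max_length=512, truncation=True)
print(len(truncated["input_ids"]))  # 512
```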
@juusohugs I'll leave this here: https://github.com/google-research/FLAN/issues/36#issuecomment-1472282261. That thread has the answer to this question.