What's the input max token size for MPT-7B-Instruct? (Use case: Transcript Summarization)

#61
by vibhanu - opened

What's the input max token size for MPT-7B-Instruct?

Use case: customer service call transcript summarization
(Transcript length: ~5,000 words)
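
Not an official answer, but here is a minimal sketch of how to check this yourself with the Transformers library. It assumes the `max_seq_len` config attribute and the `EleutherAI/gpt-neox-20b` tokenizer named on the model card; the transcript file path is just a placeholder.

```python
from transformers import AutoConfig, AutoTokenizer

MODEL_ID = "mosaicml/mpt-7b-instruct"

# MPT ships custom model code, so trust_remote_code=True is required.
config = AutoConfig.from_pretrained(MODEL_ID, trust_remote_code=True)
print("Configured max sequence length:", config.max_seq_len)

# Per the model card, MPT-7B-Instruct uses the EleutherAI/gpt-neox-20b tokenizer.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# "call_transcript.txt" is a hypothetical file holding the ~5,000-word transcript.
with open("call_transcript.txt") as f:
    transcript = f.read()

n_tokens = len(tokenizer(transcript)["input_ids"])
print("Transcript tokens:", n_tokens)

if n_tokens > config.max_seq_len:
    print("Transcript exceeds the context window; chunk it or raise max_seq_len.")
```

If the transcript turns out to be longer than the configured window, the usual options are chunked (map-reduce) summarization, or setting a larger `config.max_seq_len` before calling `from_pretrained`, which the model card describes as possible because MPT uses ALiBi attention, though quality beyond the training length isn't guaranteed.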

Hi @vibhanu, were you able to use the model for customer service call transcript summarization?
