torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 92.99 GiB
#2 by chenrq2005
When using a GPU (A100-40GB) I encounter torch.cuda.OutOfMemoryError. Any suggestions?
Hi,
Sorry for the late response. In this case you may need to reduce the inference batch size and check your audio length. The input should be a single speech turn rather than an entire conversation. If you have a long audio file, you can first segment it with a VAD and then pass the segments to the model.
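A minimal sketch of that workflow, assuming Silero VAD from torch.hub for segmentation; the thread does not name the downstream speech model, so the per-segment inference call is left as a placeholder:

```python
import torch

SAMPLING_RATE = 16000

# Load Silero VAD and its helper utilities from torch.hub
vad_model, utils = torch.hub.load("snakers4/silero-vad", "silero_vad")
(get_speech_timestamps, save_audio, read_audio, VADIterator, collect_chunks) = utils

# Read the long recording and detect individual speech turns
wav = read_audio("long_conversation.wav", sampling_rate=SAMPLING_RATE)
speech_timestamps = get_speech_timestamps(wav, vad_model, sampling_rate=SAMPLING_RATE)

# Feed each segment to the model one at a time (batch size 1) to keep memory low
for ts in speech_timestamps:
    segment = wav[ts["start"]:ts["end"]]
    # Replace this with the actual inference call of the model you are using,
    # e.g. output = model(segment.unsqueeze(0))
    ...
```

Running the segments one by one (or in small batches) avoids allocating activations for an entire conversation at once, which is what triggers the 92.99 GiB allocation attempt on a 40 GB card.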
Thanks Yingzhi!