Runpod · 4mo ago
Taz

Long delay time

Hi, my serverless inference requests always have a long delay time of 40–50 seconds. What exactly is this delay time? My Docker image is quite big; would making it smaller reduce the delay time? Thank you.