Reducing ComfyUI cold start on RunPod Serverless
We’re using RunPod Serverless to deploy our ComfyUI workflow, and overall it runs beautifully. The only real challenge we’re facing is the startup time (even with FlashBoot enabled):
• Total worker execution time: 1 min 10 sec
• ComfyUI startup (module loading): 43 sec
• ComfyUI workflow runtime: ~27 sec
We’ve tried using network volumes as well as baking the models directly into the Docker image, but neither approach had any noticeable effect on startup duration. Since the workflow itself would finish in roughly 30 seconds, the initialization phase is by far the main bottleneck.
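To put a number on that, here’s a quick sanity check in Python using the timings above (the variable names are just ours for illustration):

```python
# Timings from the measurements above, in seconds.
total_s = 70.0     # total worker execution time (1 min 10 sec)
startup_s = 43.0   # ComfyUI startup (module loading)
workflow_s = 27.0  # workflow runtime

startup_share = startup_s / total_s
print(f"Startup is {startup_share:.0%} of total execution time")
# → Startup is 61% of total execution time
```

So for every invocation, well over half the billed time is pure initialization rather than actual inference.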
Does anyone have an idea how to speed up the startup process, cache model loading, or otherwise reduce ComfyUI’s initialization time?
