Serverless with network storage

Hi all,

I am trying to set up a serverless worker for ComfyUI (currently using a customized template based on https://github.com/blib-la/runpod-worker-comfy).
I have several large models that I would prefer not to bake into the image.
I see there is an option to mount network storage to a serverless worker, so I attached a network volume (containing the models required to run the workflow) to the serverless ComfyUI worker. However, when I send a request with the workflow, the worker logs show that it does not see any of the models on the mounted storage.
There is also no way to customize the network storage mount point for a serverless worker, so I am not even sure the paths are mounted where I expect.
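For reference, this is roughly the check I added to the worker handler to log what the container can actually see. The candidate paths are my assumptions (/runpod-volume is where the docs suggest serverless workers mount network volumes; /workspace is the pod-style path), not something I have confirmed:

```python
import os

# Candidate mount points to probe from inside the worker.
# ASSUMPTION: /runpod-volume is the documented serverless mount point,
# /workspace is the usual pod mount point -- both are guesses to verify.
CANDIDATES = ["/runpod-volume", "/workspace"]

def list_models(roots, exts=(".safetensors", ".ckpt", ".pt")):
    """Walk each existing root and collect model-like files, so the
    worker logs show exactly which files the container can see."""
    found = {}
    for root in roots:
        if not os.path.isdir(root):
            found[root] = None  # mount point absent in this container
            continue
        hits = []
        for dirpath, _dirnames, filenames in os.walk(root):
            hits.extend(
                os.path.join(dirpath, name)
                for name in filenames
                if name.lower().endswith(exts)
            )
        found[root] = hits
    return found

if __name__ == "__main__":
    for root, files in list_models(CANDIDATES).items():
        if files is None:
            print(f"{root}: not mounted")
        else:
            print(f"{root}: {len(files)} model file(s) found")
```

In my case both candidate paths come back empty, which is why I suspect the volume is either not mounted at all or mounted somewhere else entirely.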

Therefore I want to ask:
  • Is this type of functionality/use case supported/feasible?
  • If it is, what am I missing or not doing correctly?
Thanks in advance!