How to get worker to save multiple images to S3?
Hey all - my ComfyUI workflow is saving multiple images from throughout the workflow... however, in the S3 upload the worker is only saving one image. Do you know how I can have it save the multiple images into the same directory in S3?

34 Replies
Unknown User•16mo ago
Message Not Public
Sign In & Join Server To View
oh I see
I'll dig into it - thanks!
zip it then upload
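A sketch of that approach, using Python's standard zipfile module; the directory layout is an assumption for illustration, not the workflow's actual paths:

```python
import os
import zipfile

def zip_outputs(output_dir, zip_path):
    """Bundle every file under output_dir into a single zip archive."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(output_dir):
            for name in files:
                full = os.path.join(root, name)
                # Store paths relative to output_dir inside the archive.
                zf.write(full, os.path.relpath(full, output_dir))
    return zip_path
```

The resulting archive can then be handed to a single S3 upload call.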
Oh that’s a great suggestion - I wasn’t sure if comfyui had a zip node?
I'm using the runpod-worker-comfy repo by blib-la - these are the lines of code that handle the upload:
FWIW, I mentioned the following in the blib-la support discord too:

That's the path where the output images are stored in the worker's instance of ComfyUI, I believe.
correct
nah you can upload multi
It's not magic. It depends on how you code your serverless handler to run. If you need to upload multiple files then call rp_upload.upload_image (included in the runpod Python module) multiple times. If you don't like how rp_upload.upload_image works you can write your own upload routine.
Here is an example of uploading 2 files using rp_upload.upload_image:
from runpod.serverless.utils import rp_upload
import runpod
def handler(job):
    input = job['input']
    JOB_ID = job['id']
    imageOne_url = rp_upload.upload_image(JOB_ID, "./imageOne.png")
    imageTwo_url = rp_upload.upload_image(JOB_ID, "./imageTwo.png")
    return {'imageOne_url': imageOne_url, 'imageTwo_url': imageTwo_url}

You will have to modify this for your specific needs.
I do not believe that rp_upload.upload_image handles multiple files with its default code. Here is code for a multi-file upload routine you could add and use to upload multiple files, with a wildcard:
Here is how you call such a function:
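A sketch of what such a routine and its call might look like, built on glob; the function name and the output path are assumptions, and the uploader is injectable so rp_upload.upload_image can be swapped for a custom routine:

```python
import glob

def upload_files(job_id, pattern, uploader=None):
    """Upload every file matching a wildcard pattern; returns the list of URLs."""
    if uploader is None:
        # Default to the helper bundled with the runpod module.
        from runpod.serverless.utils import rp_upload
        uploader = rp_upload.upload_image
    return [uploader(job_id, path) for path in sorted(glob.glob(pattern))]

# e.g. inside the handler (the output path is an assumption):
# urls = upload_files(job['id'], "/comfyui/output/*.png")
```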
Thank you @Encyrption - this is super helpful. I'm actually looking to download - rather than upload - multiple images from the output folder. Basically my workflow is producing multiple images, and I want to present them all to the user so they can choose the one they want.
I think what you are describing is uploading. Your workflow is likely producing image files (PNG, JPG, or similar) that are stored on the worker's disk. For the user to get access to them, you will have to upload those files somewhere that will host them on the Internet. My code above uploads them to an S3 bucket; with proper configuration of the bucket, each object's URL will be accessible to anyone on the Internet. Your handler then needs to return those URLs to the user in JSON. If you are presenting a web interface, you will need to include the images in <img> tags so the user can see them, or use JavaScript to let the user download them.
If you do actually need to download something into your worker you can add this function:
You can call it like this:
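A minimal version of such a download helper, using only the standard library; the function name, URL, and destination path are placeholders:

```python
import os
import shutil
import urllib.request

def download_file(url, dest_path):
    """Stream a remote file onto the worker's disk and return the local path."""
    os.makedirs(os.path.dirname(dest_path) or ".", exist_ok=True)
    with urllib.request.urlopen(url) as resp, open(dest_path, "wb") as out:
        shutil.copyfileobj(resp, out)
    return dest_path

# e.g. (URL and destination are placeholders):
# download_file("https://example.com/input.png", "/comfyui/input/input.png")
```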
Ahh thank you both!
Btw, from a performance point of view, is it better to go via S3 or bring the images directly to the local device? I.e., export them as a list of base64 strings?
When you run a serverless worker, the only thing that is returned is JSON (text). Below is an example of a run of one of my ToonCrafter workers:
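A hedged sketch of the response envelope such a run returns; every value below is invented for illustration, and only the "output" object is produced by the handler itself:

```python
# Hypothetical shape of the JSON a serverless run returns; all values invented.
example_response = {
    "delayTime": 842,        # ms spent queued (invented)
    "executionTime": 31245,  # ms the handler ran (invented)
    "id": "sync-1234",       # job id (invented)
    "status": "COMPLETED",
    "output": {
        "video_url": "https://my-bucket.s3.amazonaws.com/sync-1234/result.mp4",
    },
}
```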
You can see how I returned an S3 URL to a video file. None of the serverless disk persists, so any files not uploaded somewhere are lost.
One other option would be to convert your image to BASE64. Since BASE64 is just text, you can return it in the actual JSON response. Although there is a size limit on how much data you can include in a response. You could likely return a single image encoded as BASE64, but I wouldn't suggest that route for multiple images.
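Encoding an image for the JSON response can be sketched like this; the file path and the response key are placeholders:

```python
import base64

def image_to_base64(path):
    """Read an image file and return its contents as base64 text for JSON."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")

# e.g. in the handler's return value (key name is an assumption):
# return {"image_b64": image_to_base64("./output.png")}
```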
But again, nothing persists after an API call to a serverless worker. You have to move the results somewhere that will persist and give a link to it in the JSON that is returned.
Thank you thank you both!!
I've just finished building a WebSocket proxy that creates a WebSocket connection between the serverless worker and the user's browser. With that, you could send the results directly to the user's browser. I'm not giving that code out yet, though... I am considering whether I want to run it as a service.
Hard to say... it depends on a lot of factors: upload/download speed at a given RunPod region, and the upload/download speed of the user's connection. I have done tests with streaming logs from the worker to the browser, and I am currently working on code that will stream webcam video from the browser to the worker, and video in the reverse direction. Transferring media (images, videos) should be no problem in most scenarios... but if a user is on a slow link, it could be an issue.
@tzk I noticed this bug too with blib-la's repo. This is what I'm doing to grab all the ComfyUI output images:
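A sketch of one way to do it, assuming ComfyUI's default output directory and the rp_upload helper from the runpod module; the function name is an assumption, and the uploader is injectable so it can be swapped or tested:

```python
import os

IMAGE_EXTS = {".png", ".jpg", ".jpeg", ".webp"}

def upload_all_outputs(job_id, output_dir="/comfyui/output", uploader=None):
    """Walk ComfyUI's output directory and upload every image it finds."""
    if uploader is None:
        # Default to the helper bundled with the runpod module.
        from runpod.serverless.utils import rp_upload
        uploader = rp_upload.upload_image
    urls = []
    for root, _dirs, files in os.walk(output_dir):
        for name in sorted(files):
            if os.path.splitext(name)[1].lower() in IMAGE_EXTS:
                urls.append(uploader(job_id, os.path.join(root, name)))
    return urls
```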