RunPod · 2mo ago
houmie

How to download models for Stable Diffusion XL on serverless?

1) I created a new network storage of 26 GB for various models I'm interested in trying.
2) I created a Stable Diffusion XL endpoint on serverless, but couldn't attach the network storage.
3) After the deployment succeeded, I clicked on edit endpoint and attached that network storage to it. So far so good I believe. But how do I exactly download various SDXL models into my network storage, so that I could use them via Postman? Many Thanks
23 Replies
Madiator2011 · 2mo ago
The quick-deploy SDXL endpoint is based on diffusers, and it has the model baked into the image.
houmie · 2mo ago
So if I wanted to use DreamShaper XL could I do this with that? Or do I need to clone https://github.com/runpod-workers/worker-sdxl and add the DreamShaper XL to it, push it to DockerHub and then pull it as serverless template?
Madiator2011 · 2mo ago
Should work if the model is in diffusers format.
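As a side note, one way to tell a diffusers-format repo apart from a single-checkpoint release is that a diffusers pipeline is a directory of component subfolders with a `model_index.json` at its root. A quick local check, sketched in plain Python (the directory path you'd pass is whatever you downloaded the model to):

```python
from pathlib import Path

def looks_like_diffusers(repo_dir: str) -> bool:
    # A diffusers pipeline repo carries a model_index.json at its root;
    # a bare .safetensors/.ckpt checkpoint does not.
    return (Path(repo_dir) / "model_index.json").is_file()

# Hypothetical usage, path is an example only:
# looks_like_diffusers("/workspace/models/dreamshaper-xl")
```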
Madiator2011 · 2mo ago
Linked: worker-sdxl/src/rp_handler.py and worker-sdxl/builder/cache_models.py (runpod-workers/worker-sdxl on GitHub)
houmie · 2mo ago
Ah, so it's currently using the base model stabilityai/stable-diffusion-xl-base-1.0? So do I have to clone https://github.com/runpod-workers/worker-sdxl and manually change the two files from stabilityai/stable-diffusion-xl-base-1.0 to stablediffusionapi/dreamshaper-xl? Is there no environment variable to inject instead?
Madiator2011 · 2mo ago
Nope, no env var for that.
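Since the stock worker has no env hook, a fork could add one. A minimal sketch of what env-based model selection might look like in a forked handler (`MODEL_NAME` is an assumed variable name, not something the official worker-sdxl image reads):

```python
import os

# Assumed env var for a forked worker-sdxl; the official image hard-codes
# the model id in rp_handler.py and builder/cache_models.py instead.
MODEL_NAME = os.environ.get(
    "MODEL_NAME", "stabilityai/stable-diffusion-xl-base-1.0"
)
print(MODEL_NAME)  # falls back to the baked-in SDXL base model when unset
```

You would then set `MODEL_NAME` in the serverless template's environment variables instead of rebuilding the image for every model swap.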
Madiator2011 · 2mo ago
if you would like something more flexible you can use https://github.com/ashleykleynhans/runpod-worker-a1111
Linked: ashleykleynhans/runpod-worker-a1111 (RunPod Serverless Worker for the Automatic1111 Stable Diffusion API)
houmie · 2mo ago
Ahh nice. But this repo is based on classic SD, not SDXL, correct? In that case, for SDXL I will try to clone it and change the files myself. Then I need to add the model to my Docker image and push it to Docker Hub, correct? Then in RunPod I would create a template based on the Docker Hub image and build a new serverless endpoint? Does my plan make sense so far? 🙂 And will the model that the Docker image downloads be added to the attached network storage? I have a feeling that, because there is no environment variable passed in, the model is loaded into local storage instead of network storage. I hope I'm wrong, because that would take a very long time each time I post to the endpoint.
Madiator2011 · 2mo ago
a1111 supports SDXL; you just need to get the safetensors file. For runpod-workers/worker-sdxl you would need to edit the lines I mentioned, rebuild the image, and push it to Docker Hub. runpod-worker-a1111 supports network storage.
houmie · 2mo ago
Thanks, yes, I'm making progress with runpod-worker-a1111. Is there a way to check from the dashboard how much space is left on the network storage?
digigoblin · 2mo ago
Not unless you attach it to a pod
houmie · 2mo ago
OK, after I attach it to a pod, how could I do that? df -h? I don't think that works on network storage, because it shows everything.
digigoblin · 2mo ago
No, df -h will show you the space on the entire network storage, not just what is assigned to you. Check your pod's usage in the RunPod web console. It has a percentage indicator, which is also a bit crappy; it would be nice if it showed the actual space used when you hover over it or something.
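Another option from inside the pod is du, which sums only your own files rather than reporting the whole shared filesystem the way df does. A sketch assuming the network volume is mounted at /workspace (the usual RunPod mount point; adjust if yours differs):

```shell
VOLUME="${VOLUME:-/workspace}"   # RunPod network volumes usually mount at /workspace
[ -d "$VOLUME" ] || VOLUME="."   # fall back to the current dir so this runs anywhere
du -sh "$VOLUME"                 # total space your files take on the volume
```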
houmie · 2mo ago
Ah yeah. It says 95%. Yeah, it would be good if it gave us an actual number instead of guesswork.
digigoblin · 2mo ago
Well, you can do the math, no need for guesswork. But still, it's an unnecessary waste of time for something that could have better UX.
houmie · 2mo ago
Sorry, I worded it badly. Of course I could work out 5% of 50 GB. What I meant is that having the actual number is more accurate and convenient.
digigoblin · 2mo ago
Yeah definitely, I agree that it could do with improvement.
gnarley_farley.
Would I be able to use SD3 with this, or do I need to update some things?
nerdylive · 2w ago
Not sure if a1111 supports SD3 yet. Check the a1111 GitHub repo, and check which a1111 version that repo is on.
gnarley_farley.
Okay, thanks. Looks like there isn't an updated version yet.
gnarley_farley.
This looks like a solid option though. https://github.com/blib-la/runpod-worker-comfy
Linked: blib-la/runpod-worker-comfy (ComfyUI as a serverless API on RunPod)
gnarley_farley.
Do you have any idea how I can format my workflow to fit the recommended structure? Recommended: https://github.com/blib-la/runpod-worker-comfy/blob/main/test_resources/workflows/workflow_webp.json vs. my workflow
Linked: runpod-worker-comfy/test_resources/workflows/workflow_webp.json
gnarley_farley.
That's just the standard workflow recommended by Comfy.
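For reference, the runpod-worker-comfy test resources wrap a ComfyUI workflow (exported in API format) under input → workflow. A minimal sketch of building that request payload in Python; the node content here is hypothetical, and the exact envelope should be confirmed against the repo's README for your version:

```python
import json

# Hypothetical one-node fragment of a ComfyUI workflow exported in
# "API format" (enable dev mode options in ComfyUI, then "Save (API Format)").
workflow = {
    "3": {"class_type": "KSampler", "inputs": {"seed": 42, "steps": 20}},
}

# The repo's test_resources examples nest the exported workflow under
# "input" -> "workflow" in the request body sent to the endpoint.
payload = {"input": {"workflow": workflow}}
print(json.dumps(payload, indent=2))
```

You would POST this JSON body to the serverless endpoint (e.g. via Postman), the same way as for the other workers in this thread.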