Pod crashing due to low regular RAM?
Hey, I am running ComfyUI and my pod keeps crashing at one point in the workflow. The VRAM is only at 70% utilised, but the GPU says 100%.
Does this mean that if I found a different pod with more regular RAM, I could keep going with the workflow?


26 Replies
The pod has 30 GB of RAM.

Unknown User•16mo ago
Message Not Public
So I have a friend's 4090 in real life that I am using to render artwork from a ComfyUI workflow. I need to work faster, so I rented a 4090 on Runpod, but it doesn't work. It crashes... and I am trying to figure out why.
The screenshot I shared is from a plugin that visualises how the RAM is being used, and it seems like the VRAM is not the issue.
So I want to know how to fix it. If I had a different pod with more than 30 GB of RAM, maybe it would be OK.
It is just annoying to waste money testing out these pods; I thought someone here might know.
Unknown User•16mo ago
Message Not Public
That's the thing, it doesn't say why in the logs.
Unknown User•16mo ago
Message Not Public
I shared the screenshot...
It says "reconnecting" and ERR,
but it required a pod restart to get it working again.
The log didn't say anything; the other errors were just because I put a wrong-format image into the IPAdapter.

Unknown User•16mo ago
Message Not Public

yes
I got it working by using fewer ControlNets, but whenever I use 3 it crashes.
Unknown User•16mo ago
Message Not Public

Unknown User•16mo ago
Message Not Public
Oh, now there's an issue with just 2 ControlNets.
Unknown User•16mo ago
Message Not Public

Unknown User•16mo ago
Message Not Public
Ugh, hmm, how can I launch it from the command line?
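For reference: ComfyUI is normally started by running `python main.py` from its install directory, so launching it in a terminal (instead of only through the pod's web proxy) lets you see the full traceback, or notice the process being killed, when the UI just shows "reconnecting". A minimal sketch, assuming ComfyUI is installed at `/workspace/ComfyUI` (adjust the paths for your pod), that mirrors the output to a log file:

```python
# Minimal sketch: start ComfyUI from the command line and keep a copy of its
# output, so the crash reason (traceback or a silent kill) is preserved.
# Assumes ComfyUI is installed at /workspace/ComfyUI -- adjust for your pod.
import subprocess
import sys

COMFY_DIR = "/workspace/ComfyUI"      # assumed install location
LOG_FILE = "/workspace/comfyui.log"   # output is mirrored here

with open(LOG_FILE, "a") as log:
    proc = subprocess.Popen(
        [sys.executable, "main.py", "--listen", "0.0.0.0"],  # --listen lets the pod's proxy reach it
        cwd=COMFY_DIR,
        stdout=subprocess.PIPE,
        stderr=subprocess.STDOUT,
        text=True,
    )
    for line in proc.stdout:          # echo to the terminal and to the log file
        print(line, end="")
        log.write(line)
    proc.wait()

print(f"ComfyUI exited with code {proc.returncode}")
```

If the process disappears with no Python traceback at all, that usually points at the kernel's OOM killer reclaiming system RAM (the pod's `dmesg` output normally records this, if the container lets you read it), rather than a CUDA/VRAM error.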
I got it to render a couple of times, so I don't think it's actually a bad pod,
but it just keeps crashing. I am 90% sure it's a RAM issue.
Regular RAM, not VRAM.
But I don't really know... the workflow is exactly the same as what I am running on a local machine.
It works now with 1 IPAdapter and 2 ControlNets, but I need 2 IPAdapters and 3 ControlNets. That is the issue. I guess it can't handle it.
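Before paying for a bigger pod, one way to test the regular-RAM theory is to log system RAM and VRAM side by side in a second terminal while the workflow runs, and see which one is pinned at the moment it falls over. A minimal sketch, assuming `psutil` and `nvidia-ml-py` are installed (`pip install psutil nvidia-ml-py` if not):

```python
# Minimal sketch: print system RAM and device-wide VRAM once per second while
# the workflow runs, to see which one is exhausted when the pod falls over.
import time
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

while True:
    ram = psutil.virtual_memory()               # system ("regular") RAM
    vram = pynvml.nvmlDeviceGetMemoryInfo(gpu)  # device-wide VRAM
    print(
        f"RAM {ram.used / 2**30:5.1f}/{ram.total / 2**30:.1f} GiB ({ram.percent:.0f}%)"
        f"  |  VRAM {vram.used / 2**30:5.1f}/{vram.total / 2**30:.1f} GiB"
    )
    time.sleep(1)
```

If RAM climbs toward the pod's 30 GB limit right before the crash while VRAM stays under its cap, a pod with more system RAM (or loading fewer models at once) is the fix; if VRAM is the one that fills up, more system RAM won't help.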
Unknown User•16mo ago
Message Not Public
I actually don't know, I need to ask my friend; I am accessing it remotely.
I would guess 64 GB,
if it's working.
Thanks for your help anyway.
Unknown User•16mo ago
Message Not Public
This seems like a memory issue?

Unknown User•16mo ago
Message Not Public
I was searching and found this 😂:
The error message indicates an "Allocation on device" issue when executing SamplerCustom in ComfyUI. This type of error typically occurs when there's not enough GPU memory available to allocate for the operation.
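That "Allocation on device" error is PyTorch failing to get a block of VRAM, so even with the monitor reading ~70%, the sampler step can still spike past what is free at that instant. A quick check of what is actually available, using standard PyTorch calls (run on the pod, inside the same Python environment ComfyUI uses):

```python
# Minimal sketch: check how much VRAM is actually free on the device right now,
# using standard PyTorch calls.
import torch

free, total = torch.cuda.mem_get_info(0)   # device-wide numbers, in bytes
print(f"VRAM free: {free / 2**30:.2f} GiB of {total / 2**30:.2f} GiB")

# If run inside the ComfyUI process itself (e.g. from a custom node), these
# show what that process has allocated and reserved through PyTorch:
print(f"allocated: {torch.cuda.memory_allocated(0) / 2**30:.2f} GiB")
print(f"reserved:  {torch.cuda.memory_reserved(0) / 2**30:.2f} GiB")
```

If it does turn out to be VRAM after all, ComfyUI's low-VRAM startup flags (e.g. `--lowvram`) trade speed for smaller peak allocations; but if VRAM never actually fills, the crashes with 2 IPAdapters and 3 ControlNets are more consistent with the extra models exhausting the pod's 30 GB of system RAM.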
Unknown User•16mo ago
Message Not Public