Runpod · 2y ago
Emad

Cost me $25 for a small request.

18 Replies
Unknown User · 2y ago
Message Not Public
Emad (OP) · 2y ago
It charged me for 5.3 hours when the request wasn't that long.
Unknown User · 2y ago
Message Not Public
Emad (OP) · 2y ago
The program just ran into an error, which can be seen from the logs. How can it run for 5.3 hours?
Unknown User · 2y ago
Message Not Public
Emad (OP) · 2y ago
Should I resolve this with the webchat?
Unknown User · 2y ago
Message Not Public
Madiator2011 · 2y ago
From what I see, your worker was throwing an error and looping. If the error isn't handled properly, the worker will keep looping and never stop. You probably want to check your worker's code.
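For reference, the usual fix is to catch exceptions inside the handler and return an error payload instead of letting them escape. A minimal sketch, assuming a Python worker — `do_work` is a hypothetical stand-in for the real job logic, not anything from this thread:

```python
# Sketch of a serverless handler that never lets an exception escape.
# `do_work` is a hypothetical placeholder for the actual job logic;
# here it raises to simulate a crashing worker.
def do_work(payload):
    raise ValueError("bad input")

def handler(job):
    try:
        return {"output": do_work(job["input"])}
    except Exception as exc:
        # Returning an error payload marks the job as finished (failed),
        # instead of crashing the worker and letting it loop or retry.
        return {"error": f"{type(exc).__name__}: {exc}"}

# In a real Runpod worker this would be wired up roughly as:
# import runpod
# runpod.serverless.start({"handler": handler})
```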
Emad
EmadOP2y ago
And the server was on for 5 hours?
Madiator2011 · 2y ago
Yes, if you did not set it to end. I'm not sure how your worker is built.
Hermann · 2y ago
I have noticed this too. If your LLM throws a CUDA out-of-memory error, it will loop forever unless you cancel the job manually. That's very dangerous on an expensive GPU. I wish there were a better way to handle it.
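One way to guard against exactly this inside the handler itself — a sketch under stated assumptions, not Runpod's own recommendation. Real PyTorch raises `torch.cuda.OutOfMemoryError` (a `RuntimeError` subclass); the sketch simulates it with a plain `RuntimeError` so it runs anywhere, and `generate` is a hypothetical model call:

```python
# Sketch: convert a GPU out-of-memory failure into a normal error result
# so the job ends as failed instead of looping forever.
def generate(prompt):
    # Hypothetical model call; simulates the OOM PyTorch would raise.
    raise RuntimeError("CUDA out of memory")

def handler(job):
    try:
        return {"output": generate(job["input"]["prompt"])}
    except RuntimeError as exc:
        if "out of memory" in str(exc).lower():
            # In a real worker you might also call torch.cuda.empty_cache()
            # here before returning.
            return {"error": "GPU out of memory, job aborted"}
        raise
```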
Unknown User · 2y ago
Message Not Public
Hermann · 2y ago
Can you show us how? That would be great. Thanks.
Unknown User · 2y ago
Message Not Public
Hermann · 2y ago
Ahh yes, I found it. Mine isn't enabled.
Hermann · 2y ago
60 seconds is reasonable, right?
digigoblin · 2y ago
Depends on your specific endpoint.
Unknown User · 2y ago
Message Not Public
