no compatible serverless GPUs found while following tutorial steps
hi, i'm trying to run orca-mini on serverless by following this tutorial [https://docs.runpod.io/tutorials/serverless/cpu/run-ollama-inference]. whenever the download finishes, i get the error message below and then the ckpt download restarts.
i'm using 2 vCPUs with 5 GB of storage space
update - having moved to 4 vCPUs, i instead get this error message in the logs when i send a request
update 2 - this was caused by my not following the tutorial properly. i wasn't using the appropriate request
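for anyone hitting the same thing, here's a rough sketch of what a request to a serverless endpoint looks like, assuming the standard RunPod `/runsync` request shape where your data goes under an `"input"` key. the endpoint id, api key, and the exact field names inside `"input"` are placeholders/assumptions - the real field names depend on the handler in the worker image the tutorial has you deploy, so check the tutorial's example request for the exact schema:

```python
import json

# placeholders - substitute your own values
ENDPOINT_ID = "your-endpoint-id"
API_KEY = "your-api-key"

# RunPod serverless endpoints expose a synchronous run route at /runsync
url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync"

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

# the handler receives whatever you put under "input"; the "prompt"
# field here is an assumption - use the field names from the tutorial
payload = {"input": {"prompt": "why is the sky blue?"}}

print(json.dumps(payload, indent=2))
# then send it with e.g. requests.post(url, headers=headers, json=payload)
```

the key mistake is sending the prompt at the top level instead of nesting it under `"input"` - the worker then sees an empty input and fails.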
the latest weird stuff that i've seen was this response to a request
hey, thanks for your help. my best guess is that the pod hadn't initialised properly or something when i ran it the first time. it started working perfectly on the 2nd try