Hi, I am new to Runpod. I am trying to run models on Runpod using Ollama, but I always get this error from Ollama:
ollama run deepseek-r1:8b Error: 500 Internal Server Error: do load request: Post "http://127.0.0.1:44469/load": EOF
I do not know how to fix this. I have tried several things, e.g. installing Linux packages, but I still encounter the issue. Please help me find a solution.