Llama 3.1 + Serverless
I'm trying to follow this tutorial:
Llama 3.1 via Ollama
Tried to use pooyaharatian/runpod-ollama:0.0.8 and override the default start command with llama3.1,
but I'm getting this error:
{
"delayTime": 16752,
"error": "model "llama3.1" not found, try pulling it first",
"executionTime": 156,
"id": "f3687a15-700f-4acf-856a-d7df024ad304-u1",
"status": "FAILED"
}
In the logs I see:
2024-09-02 14:52:09.063 [info] The model you are attempting to pull requires a newer version of Ollama.
I tried updating to pooyaharatian/runpod-ollama:0.0.9, but then I get JSON decode errors instead.
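
For reference, this is roughly how I'm calling the endpoint. It's a minimal sketch assuming the standard RunPod /runsync HTTP API; the endpoint ID, API key, and the input fields are placeholders/assumptions on my part, not taken from the worker's docs:

import requests

# Placeholders, not real values
ENDPOINT_ID = "<endpoint-id>"
API_KEY = "<runpod-api-key>"

# Assumed input schema: a simple prompt; the worker may expect different fields
resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"input": {"prompt": "Hello from llama3.1"}},
    timeout=120,
)
print(resp.json())  # this is where the FAILED / "model not found" response comes back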


