REST API with Ollama

Hello everyone, I installed Ollama and I'm trying to make some requests to its API using my pod instance and port, but I'm getting no results or a 502. I'm following this tutorial: https://docs.runpod.io/tutorials/pods/run-ollama
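For reference, here is a minimal sketch of the kind of request the tutorial describes, assuming Ollama is serving on its default port (11434) and that port is exposed through RunPod's HTTP proxy. The pod ID and model name below are placeholders, not values from this thread:

```python
import requests

# Hypothetical pod ID; replace with your own. RunPod exposes HTTP ports through
# a proxy URL of the form https://{POD_ID}-{PORT}.proxy.runpod.net.
POD_ID = "abc123xyz"
PORT = 11434  # Ollama's default listen port

url = f"https://{POD_ID}-{PORT}.proxy.runpod.net/api/generate"

payload = {
    "model": "llama3",           # assumes this model has already been pulled on the pod
    "prompt": "Why is the sky blue?",
    "stream": False,             # return a single JSON object instead of a token stream
}

resp = requests.post(url, json=payload, timeout=120)
# A 502 from the proxy usually means nothing is answering on that port inside
# the pod (e.g. Ollama not running, or not listening on the exposed port).
resp.raise_for_status()
print(resp.json()["response"])
```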
2 Replies
Madiator2011 · 6mo ago
I would not use this guide for now; it's outdated.
Gustavo Monti (OP) · 6mo ago
@Papa Madiator any recommendations on how to do that?