I recently got a Framework Desktop with the 128 GB option, with the (admittedly strange) idea of using it as both an HTPC and a local AI server. I've got Bazzite (in desktop mode) up and running, but now I'm unsure how to get Ollama going. Google searches bring up answers suggesting the command ujust ollama, but that command gives the error "Justfile does not contain recipe 'ollama'".
I'm trying to figure out the best way to run Ollama so it can utilize the AMD Strix Halo on the Framework Desktop. brew has ollama available, and I could also just use podman to run instances of ollama and Open WebUI, but I'm unsure how to pass the GPU through with podman.
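For context, this is roughly the kind of podman invocation I've been looking at, pieced together from the Ollama ROCm container docs. I'm assuming the `ollama/ollama:rocm` image, and I haven't confirmed whether Strix Halo's gfx1151 is supported out of the box or needs an `HSA_OVERRIDE_GFX_VERSION` override, so treat this as a sketch rather than something I've verified:

```shell
# Run the ROCm build of Ollama, passing through the AMD GPU device nodes.
# /dev/kfd is the ROCm compute interface, /dev/dri covers the render nodes.
podman run -d \
  --name ollama \
  --device /dev/kfd \
  --device /dev/dri \
  --security-opt label=disable \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama:rocm

# Then pull and run a model inside the container:
podman exec -it ollama ollama run llama3
```

The `--security-opt label=disable` part is my guess at working around SELinux blocking device access on an immutable Fedora base like Bazzite; no idea if it's actually needed there.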
Anyone else running ollama on Bazzite and have suggestions on the best way to run it?