How to set up TGI on a pod?
Hi, I've started an RTX 3090 pod. The issue I'm facing is that I'm not able to set up TGI for Llama 3.5 on the pod. I've tried to debug it, and the errors suggest I don't have the permissions, or I'm not allowed to do it. Am I even allowed to do that on this pod? Since the pod is a Docker instance, how do I get the privileges to do so?
1 Reply
Am I even allowed to do that on this pod?
What's "that"?
What privileges?
Yes, the pod is a Docker container.
What errors did you run into?
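If the permission errors come from trying to run `docker` inside the pod, that's expected: the pod is itself a container, so there's usually no Docker daemon to talk to. TGI is typically started either by using the TGI image as the pod's own image, or by running it directly inside the pod rather than docker-in-docker. Once a server is up, a quick sanity check from Python might look like this, a minimal sketch assuming TGI is listening on localhost:8080 and exposes its standard /generate endpoint; the address, port, and prompt below are assumptions, not details from your setup:

```python
# Minimal sketch: sanity-check a running TGI server from Python.
# Assumption: the server is reachable at localhost:8080 (adjust to your
# pod's actual address/port mapping).
import requests

TGI_URL = "http://localhost:8080/generate"  # assumed endpoint

payload = {
    "inputs": "What is the capital of France?",
    "parameters": {"max_new_tokens": 32, "temperature": 0.7},
}

# POST the prompt and print the model's completion.
resp = requests.post(TGI_URL, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json()["generated_text"])
```

If that request fails to connect, the server never started, and the launcher's startup logs (or the exact permission error you saw) would be the next thing to share.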