Runpod · 15mo ago
james3000

meta-llama/Meta-Llama-3-8B-Instruct serverless

I am a bit confused. I'm trying to test this with Python, but the tutorial at https://docs.runpod.io/serverless/workers/vllm/get-started seems to point me toward using openai. Can we still use the openai Python library, or do we need a different one to connect to the endpoint? Can anyone help me, please?
Get started | RunPod Documentation
Deploy a Serverless Endpoint for large language models (LLMs) with RunPod, a simple and efficient way to run vLLM Workers with minimal configuration.
1 Reply
Unknown User · 15mo ago
Message Not Public
