Runpod · 17 months ago · 1 reply
james3000

meta-llama/Meta-Llama-3-8B-Instruct serverless

I'm a bit confused trying to test this with Python — the tutorial at https://docs.runpod.io/serverless/workers/vllm/get-started seems to point me to using openai.

Can we still use the openai Python library, or do we need a different one to connect to the endpoint? Can anyone help me, please?