Error while using vLLM on RTX A6000

2024-02-22T11:19:46.009303238Z /usr/bin/python3: Error while finding module specification for 'vllm.entrypoints.openai.api_server' (ModuleNotFoundError: No module named 'vllm')

I'm on an RTX A6000. I've been using this GPU for the last 4-5 days without any errors, but today I'm facing this issue. Could anyone help me figure out why it is happening?
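As a minimal diagnostic sketch (not from the original thread), the snippet below checks whether the `vllm` package is visible to the interpreter named in the log (`/usr/bin/python3`). This helps distinguish between the package having disappeared from the environment and the launch command using a different interpreter than the one `vllm` was installed into:

```python
# Hypothetical diagnostic: verify that the interpreter running this
# script can find the vllm package that the api_server command needs.
import importlib.util
import sys

print(f"Interpreter: {sys.executable}")

spec = importlib.util.find_spec("vllm")
if spec is None:
    # Same condition that produces the ModuleNotFoundError in the log.
    print("vllm is NOT importable from this environment; "
          "it may need to be reinstalled (e.g. `pip install vllm`).")
else:
    print(f"vllm found at: {spec.origin}")
```

Run it with the exact interpreter from the log (`/usr/bin/python3 check_vllm.py`); if `vllm` is reported missing there but importable elsewhere, the problem is an environment mismatch rather than a broken install.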
1 Reply
ashleyk · 4mo ago
Which template are you using, or did you install vllm yourself?