vLLM serverless not working with Hugging Face model
Hey, I've been trying to create a serverless instance of the model https://huggingface.co/fancyfeast/llama-joycaption-beta-one-hf-llava for image description, so I can call its API from an automation.
I have tried the vLLM template as well as deploying from a git repo, but whenever I test a request the worker fails with `worker exited with exit code 1`.
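For reference, this is roughly the kind of test request I'm sending. It's a minimal sketch, not my exact code: the base URL, API key, model name behavior, and image URL are placeholders, and I'm assuming the template exposes vLLM's standard OpenAI-compatible `/v1` API.

```python
# Minimal test request against the serverless endpoint.
# The base_url, api_key, and image URL below are placeholders,
# not my real values; the exact route depends on the template.
from openai import OpenAI

client = OpenAI(
    base_url="https://<my-endpoint>/v1",  # assumed OpenAI-compatible route
    api_key="<API_KEY>",
)

response = client.chat.completions.create(
    model="fancyfeast/llama-joycaption-beta-one-hf-llava",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Write a detailed description of this image."},
                {"type": "image_url", "image_url": {"url": "https://example.com/test.jpg"}},
            ],
        }
    ],
    max_tokens=512,
)
print(response.choices[0].message.content)
```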
I believe this is probably something very simple that I'm doing wrong. Thanks!