Runpod · 2y ago · 8 replies
MattArgentina

vLLM Serverless error

When using the vLLM Serverless template, I get the following error when trying to use the model cognitivecomputations/dolphin-2.9-llama3-8b:

HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name
Solution
It's fixed; it was due to the incident you just posted in #🚨|incidents @haris
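For reference, the repo-id rule quoted in the error can be sketched as a small checker. This is an illustrative re-implementation of the rule text from the error message, not the official huggingface_hub validator; notably, the model id in the question passes every check, which is consistent with the failure being on the platform side (the incident) rather than in the id itself.

```python
import re

def looks_like_valid_repo_id(repo_id: str) -> bool:
    """Check a repo id against the rules quoted in the HFValidationError.

    Illustrative sketch only, not the official huggingface_hub validator:
    alphanumerics plus '-', '_', '.'; '--' and '..' forbidden; '-' and '.'
    cannot start or end a name. An optional 'namespace/' prefix is allowed.
    """
    parts = repo_id.split("/")
    if len(parts) > 2 or any(not p for p in parts):
        return False  # at most one slash, no empty segments
    for part in parts:
        # only alphanumeric chars or '-', '_', '.'
        if not re.fullmatch(r"[A-Za-z0-9._-]+", part):
            return False
        # '--' and '..' are forbidden
        if "--" in part or ".." in part:
            return False
        # '-' and '.' cannot start or end the name
        if part[0] in "-." or part[-1] in "-.":
            return False
    return True

# The model id from the question satisfies all the quoted rules:
print(looks_like_valid_repo_id("cognitivecomputations/dolphin-2.9-llama3-8b"))  # True
```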