vLLM Serverless error
When using the vLLM Serverless template, I get the following error when trying to use the model cognitivecomputations/dolphin-2.9-llama3-8b:
HFValidationError: Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name
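For reference, this error comes from Hugging Face's repo id validator, and the repo id itself looks valid, so I suspect the value the template passes through (for example a model name field with stray quotes or whitespace) is what trips the check. A minimal sketch, assuming huggingface_hub is installed locally, that reproduces the same validation:

```python
# Sketch: reproduce the Hugging Face repo id check that raises HFValidationError.
# The clean id passes; padded or quoted variants (my guess at the cause) fail.
from huggingface_hub.utils import HFValidationError, validate_repo_id

candidates = [
    "cognitivecomputations/dolphin-2.9-llama3-8b",    # clean id -> should pass
    " cognitivecomputations/dolphin-2.9-llama3-8b ",  # surrounding whitespace -> fails
    '"cognitivecomputations/dolphin-2.9-llama3-8b"',  # wrapped in quotes -> fails
]

for repo_id in candidates:
    try:
        validate_repo_id(repo_id)
        print(f"OK:   {repo_id!r}")
    except HFValidationError as err:
        print(f"FAIL: {repo_id!r} -> {err}")
```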
