Best way to deploy a new LLM serverless without building large Docker images - Runpod
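
One common pattern for keeping the image small is to bake only the runtime libraries (PyTorch, transformers, the runpod SDK) into the Docker image and keep the model weights on a Runpod network volume attached to the serverless endpoint, loading them at worker startup. Below is a minimal sketch of that approach, assuming the runpod Python SDK and a network volume mounted at /runpod-volume; the MODEL_DIR variable and the weights path are placeholders, not Runpod defaults.

```python
# Sketch: serverless worker that loads weights from an attached network volume,
# so the Docker image itself contains no model files and stays small.
import os

import runpod
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical location of pre-downloaded weights on the network volume.
MODEL_DIR = os.environ.get("MODEL_DIR", "/runpod-volume/models/my-llm")

# Load once per worker (at cold start), then reuse across requests.
tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
model = AutoModelForCausalLM.from_pretrained(MODEL_DIR, device_map="auto")


def handler(job):
    """Handle one serverless job of the form {"input": {"prompt": "..."}}."""
    prompt = job["input"]["prompt"]
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=256)
    return {"text": tokenizer.decode(outputs[0], skip_special_tokens=True)}


# Start the Runpod serverless worker loop.
runpod.serverless.start({"handler": handler})
```

With this layout the weights are downloaded to the volume once (for example from a one-off pod) instead of being baked into every image, so image builds and pushes stay fast and swapping models doesn't require rebuilding the container.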