Boxitunny · Runpod · 2y ago · 12 replies
worker vllm 'build docker image with model inside' fails

From the page https://github.com/runpod-workers/worker-vllm?tab=readme-ov-file

Option 2: Build Docker Image with Model Inside
To build an image with the model baked in, you must specify the following docker arguments when building the image.

Prerequisites
RunPod Account
Docker
Arguments:

Required:
- MODEL_NAME

Optional:
- MODEL_BASE_PATH: defaults to /runpod-volume for network storage. Use /models for local container storage.
- QUANTIZATION
- WORKER_CUDA_VERSION: 11.8.0 or 12.1.0 (default: 11.8.0, since a small number of workers do not yet have CUDA 12.1 support; 12.1.0 is recommended for optimal performance).
For the remaining settings, you may apply them as environment variables when running the container. Supported environment variables are listed in the Environment Variables section.
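A sketch of what that looks like in practice: settings not baked in at build time are passed with `-e` when the container starts. `MAX_MODEL_LEN` here is only an assumed example of a variable from the Environment Variables section, not a confirmed requirement:

```shell
# Sketch, not a verified invocation: runtime settings go in as
# environment variables (-e) instead of build args.
# MAX_MODEL_LEN is an assumed example name from the
# Environment Variables section of the README.
docker run --gpus all \
  -e MAX_MODEL_LEN="4096" \
  username/image:tag
```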

Example: Building an image with OpenChat-3.5
sudo docker build -t username/image:tag --build-arg MODEL_NAME="openchat/openchat_3.5" --build-arg MODEL_BASE_PATH="/models" .

so I cloned the GitHub repo into a folder, opened a command prompt in the same folder as the Dockerfile, and ran

docker build -t toxibunny/RPmixtralAPI:0.1 --build-arg MODEL_NAME="TheBloke/Mixtral-8x7B-MoE-RP-Story-AWQ" --build-arg MODEL_BASE_PATH="/models"

but it came back with

ERROR: "docker buildx build" requires exactly 1 argument.
See 'docker buildx build --help'.

Usage: docker buildx build [OPTIONS] PATH | URL | -


what am I doing wrong?
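(For reference, the usage line in the error shows `docker buildx build [OPTIONS] PATH | URL | -`: the pasted command is missing the final PATH argument, the build context. Docker also requires the repository part of the image name to be lowercase. A sketch of a corrected command, assuming it is run from the folder containing the Dockerfile:)

```shell
# Sketch: add the missing build-context PATH (the trailing ".")
# and lowercase the repository name, since Docker rejects
# uppercase repository names ("invalid reference format").
docker build -t toxibunny/rpmixtralapi:0.1 \
  --build-arg MODEL_NAME="TheBloke/Mixtral-8x7B-MoE-RP-Story-AWQ" \
  --build-arg MODEL_BASE_PATH="/models" \
  .
```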