embeddings endpoints

Hi, I've tried following the sparse documentation, but so far I haven't been able to get a non-error response, or even a helpful error message, out of the embeddings endpoints. Has anyone had any success actually using these, and if so, could you share a setup and exact request format that is known to work?
9 Replies
Keev (5w ago)
Following - having the same issue with a ComfyUI workflow. Every time I send a request it just returns:
{
  "delayTime": 8257,
  "error": "Workflow execution failed: Failed to submit workflow: {\"error\": \"no prompt\", \"node_errors\": []}",
  "executionTime": 4099,
  "id": "48d45ad7-66aa-4032-aa24-7f7ee184aed6-e1",
  "status": "FAILED",
  "workerId": "djc2dj8t3qkhm9"
}
I've tried numerous different requests and JSON formats.
nerdylive (4w ago)
Your handler code is wrong; check again how to send requests to ComfyUI.
nerdylive (4w ago)
You can use this as a reference for your ComfyUI worker @Keev: https://github.com/blib-la/runpod-worker-comfy/blob/main/LICENSE
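The "no prompt" error in the failed response above comes straight from ComfyUI's /prompt API, which expects the workflow graph nested under a top-level "prompt" key. A minimal sketch of building that request body; the tiny workflow and the client_id value here are placeholders for illustration, not taken from any repo in this thread:

```python
import json

def build_comfy_request(workflow: dict, client_id: str = "runpod-worker") -> bytes:
    """Wrap a ComfyUI workflow graph for POSTing to the /prompt endpoint.

    Posting the workflow dict directly, without the top-level "prompt"
    key, is what produces ComfyUI's '{"error": "no prompt"}' response.
    """
    return json.dumps({"prompt": workflow, "client_id": client_id}).encode("utf-8")

# Placeholder workflow: node id -> {class_type, inputs}
body = build_comfy_request({"3": {"class_type": "KSampler", "inputs": {}}})
decoded = json.loads(body)
```

If the handler builds its POST body this way, ComfyUI should at least accept the submission (node-level validation errors would then show up in "node_errors" instead).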
nerdylive (4w ago)
{
  "input": {
    "model": "YOUR_DEPLOYED_MODEL_NAME",
    "input": "hey, did you know one apple a day that is washed pre-eating is better than unwashed apples?"
  }
}
You can find more examples here: https://github.com/runpod-workers/worker-infinity-embedding?tab=readme-ov-file#openai-compatibility
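That payload goes in the body of a POST to the endpoint's runsync route. A rough sketch of assembling it, assuming the worker-infinity-embedding input schema linked above; the model name is the same placeholder used in the example payload, and the sample texts are made up:

```python
import json

def build_embedding_request(model: str, texts) -> dict:
    # worker-infinity-embedding reads the model name and the text(s) to
    # embed from the serverless payload's top-level "input" object.
    # "texts" may be a single string or a list of strings.
    return {"input": {"model": model, "input": texts}}

payload = build_embedding_request(
    "YOUR_DEPLOYED_MODEL_NAME",
    ["an apple a day", "keeps the doctor away"],
)
body = json.dumps(payload)
```

You would then POST this to `https://api.runpod.ai/v2/<ENDPOINT_ID>/runsync` with an `Authorization: Bearer <API_KEY>` header (endpoint ID and key being your own values).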
nerdylive (4w ago)
Something like that for embeddings. @yhlong00000 is it true that embedding workers don't need network volumes now (from the quick deploy menu in Serverless)?
Keev (2w ago)
Got it working, thanks! I'm having one small issue: I'm not able to generate multiple images from a single prompt/request to the endpoint. We have added a variable for the "batch_size" value in our workflow, but it only seems to generate one image regardless of the batch_size we give it. This is our GitHub repo for the RunPod worker: https://github.com/sozanski1988/runpod-worker-comfyui/ And the workflow JSON is here: https://github.com/sozanski1988/runpod-worker-comfyui/blob/main/test_resources/workflows/default_workflow.json Is there a special way we need to alter the workflow to be able to generate URLs for multiple image outputs?
Keev (2w ago)
P.S. I've also added an example request and serverless endpoint logs for reference. @nerdylive
nerdylive (2w ago)
Your repo is private. The request (workflow) seems right; you need to change your handler code to handle all the output files. I'm guessing it just uploads one file right now. Find the part where it does that, and if that's the case, change it.
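If the handler only returns the first image, the fix is usually in the step that reads the finished prompt's outputs from ComfyUI's /history response. A hedged sketch of collecting every image instead of just the first; the sample data is made up to mimic a batch_size=2 run:

```python
def collect_output_images(outputs: dict) -> list:
    """Gather every image record from every output node.

    A handler that only indexes the first node or first image will drop
    the extra images produced by a batch_size > 1 run.
    """
    images = []
    for node_output in outputs.values():
        images.extend(node_output.get("images", []))
    return images

# Made-up sample shaped like one SaveImage node's outputs for a batch of 2.
sample_outputs = {
    "9": {"images": [{"filename": "ComfyUI_00001_.png", "type": "output"},
                     {"filename": "ComfyUI_00002_.png", "type": "output"}]}
}
found = collect_output_images(sample_outputs)
```

The handler can then upload each record in the returned list and build one URL per image.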
Keev (2w ago)
Thanks, you were right, it was the handler. Fixed it!
