I would ask in #autorag, it's a bit weird to ask in the workers-ai channel about not using workers-ai lol
Does nova-3 work with Spanish over the WebSocket / REST API? I'm testing this with the repository https://github.com/cloudflare/realtime-examples/tree/main/ai-tts-stt. When setting the language to 'es' or 'es-419' it gives me a 500 error ("Failed to establish Nova WebSocket: 500") while the WebSocket keeps running. It works for English only. Can anyone help me?
error: '4002: could not route request to AI model', with model '@cf/baai/bge-m3'
Error: 3030: 1 validation error for VllmBatchRequest. Does anyone know what that means and how to resolve it?
InferenceUpstreamError: error code: 1031, and now I'm getting InferenceUpstreamError: <!DOCTYPE html>..., where it returns the Cloudflare HTML error page with a 500. I've tried my production worker, dev with --remote, and dev locally; in all cases it returns some variation of the above and all fail. The code worked fine yesterday and hasn't changed. It fails at the call to "@cf/qwen/qwen2.5-coder-32b-instruct"; all other requests appear to work.
I had remote = true in my wrangler.toml, set on both queues and r2_buckets but oddly not on ai. Removing them fixed the problem.
remote = true solved the problem in dev but not in production. I redeployed the worker, but the request still hangs. Nothing in the logs...
Workers AI: 9003: unknown internal error, for example.
Workers AI: WORKERS AI: Operation timed out after 40000 ms.
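Re the 4002 / 3030 / InferenceUpstreamError messages above: when the binding wraps the error (the <!DOCTYPE html>... case is Cloudflare's own error page coming back as the upstream response), it can help to hit the model over the REST API directly and look at the raw body. Rough sketch, untested; account ID, token, and the exact payload shape are placeholders you'd adjust for your model:

```python
import requests

ACCOUNT_ID = "your-account-id"   # placeholder
API_TOKEN = "your-api-token"     # placeholder, needs a Workers AI token
MODEL = "@cf/qwen/qwen2.5-coder-32b-instruct"

url = f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/ai/run/{MODEL}"
payload = {"messages": [{"role": "user", "content": "say hello"}]}

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json=payload,
    timeout=60,
)

# Print the raw body first: if the upstream is failing you may get an HTML
# error page instead of JSON, which is what the binding surfaces as
# "InferenceUpstreamError: <!DOCTYPE html>...".
print(resp.status_code)
print(resp.text[:500])

if resp.headers.get("content-type", "").startswith("application/json"):
    body = resp.json()
    if not body.get("success", False):
        # Codes like 4002 ("could not route request to AI model") or
        # 3030 (vLLM request validation) show up in this errors list.
        print(body.get("errors"))
```

If the REST call succeeds but the same model fails through the binding, that points more at the Worker / remote = true setup than at the model itself.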
  File "venv/lib/python3.10/site-packages/llama_index/embeddings/cloudflare_workersai/base.py", line 118, in _aget_text_embeddings
    return resp["result"]["data"]
KeyError: 'data'
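On the KeyError: 'data': that line in the llama_index Workers AI embedding wrapper indexes resp["result"]["data"] without checking whether the call succeeded, so any of the errors above ends up as a KeyError instead of the real message. A quick way to see what is actually coming back (same REST endpoint; I think the payload for the bge models is {"text": [...]}, and the account ID / token below are placeholders):

```python
import requests

ACCOUNT_ID = "your-account-id"   # placeholder
API_TOKEN = "your-api-token"     # placeholder
MODEL = "@cf/baai/bge-m3"

url = f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/ai/run/{MODEL}"
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"text": ["hello world"]},
    timeout=60,
)

if "application/json" not in resp.headers.get("content-type", ""):
    # Non-JSON (e.g. an HTML error page): nothing for the client to index into.
    raise RuntimeError(f"Non-JSON response ({resp.status_code}): {resp.text[:200]}")

body = resp.json()

# The envelope is {"result": ..., "success": bool, "errors": [...]}.
# "result" only contains "data" (the embedding vectors) when success is true;
# on failure you get the errors list instead, hence the KeyError upstream.
if body.get("success") and "data" in (body.get("result") or {}):
    print(len(body["result"]["data"]), "embeddings returned")
else:
    print("Workers AI error:", body.get("errors"))
```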