Hi, I am using the batch API for Markdown conversion, and I am getting a 50% error rate when converting images to Markdown. I am sending 10 files per batch with 10 concurrent requests. Example errors I am getting:

- queueRequest=true doesn't seem to be compatible with this endpoint, and if I send individual requests, or one file per batch, it is going to take very long (approximately 2.25 million items need to be processed).
- The API returns 200 when some of the files fail and some succeed, so the AI Gateway won't perform a retry, since the request is considered a success. E.g., in request 43349ae4-23ad-4201-bbd6-b67e903bf250, 6/10 files succeeded and four failed, but the gateway still considers this a success.

queueRequest is a request parameter available when calling Workers AI models. The Markdown Conversion API uses a different set of endpoints and it does not support this feature (yet 200 response right now but the success of failure of a request can be seen in the response's success attribute. These requests do not go through AI Gateway: they are not Workers AI model invocations in the normal sense, as it's a totally different set of endpoints.queueRequest is not supported on the Markdown Conversion API endpoints. We have that feature on the roadmap, although with no ETAawait Promise.all([...]) that "runs all promises at the same time". A common use case is to run several steps concurrently this way (since they return promises that can be awaited together)await Promise.all([...]) maybe something there ...queueRequest=true20020043349ae4-23ad-4201-bbd6-b67e903bf250queueRequestqueueRequestawait Promise.all([...])await Promise.all([...])