I've been getting that too, I think something is off today
I tried @cf/bytedance/stable-diffusion-xl-lightning and @cf/black-forest-labs/flux-1-schnell, but both produce the error: expected destination type of 'string' or '[]byte' for responses with content-type 'image/png' that is not 'application/json'
AiError: 3040: Capacity temporarily exceeded, please try again.

D1 TYPE ERROR: Type 'object' not supported for value '[object Object]'
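For what it's worth, the 3040 error seems to be transient capacity pressure, so wrapping the call in a small retry-with-backoff helper has worked around it for me. A minimal sketch; the helper name and options are made up, and the `env.AI` usage in the comment assumes the standard Workers AI binding:

```javascript
// Hedged sketch: retry wrapper for transient Workers AI capacity errors.
// "3040: Capacity temporarily exceeded" usually clears on its own, so we
// retry with exponential backoff. Helper name and options are illustrative.
async function withRetries(fn, { attempts = 3, baseDelayMs = 250 } = {}) {
  let lastErr;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastErr = err;
      // Only retry the capacity error; rethrow anything else immediately.
      if (!String(err.message).includes("3040")) throw err;
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
  throw lastErr;
}

// Usage inside a Worker (env.AI is the Workers AI binding):
// const out = await withRetries(() =>
//   env.AI.run("@cf/meta/llama-3.1-8b-instruct", { prompt })
// );
```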

"tool_calls":[] object."tool_calls":[] goes away and instead some json is output is streamed in the response string."type": "function" and sometimes not.@cf/black-forest-labs/flux-1-schnell via the workers AI binding for text-to-image, which recently stopped working for me, giving me this response:@cf/qwen/qwen2.5-coder-32b-instruct. This really sucks when your paying for tokens, a couple of hundred token for nothing per request can start to add up.


Both @cf/bytedance/stable-diffusion-xl-lightning and @cf/black-forest-labs/flux-1-schnell are failing for me with AiError: 3040: Capacity temporarily exceeded, please try again. The full log entry:

{
  exception: {
    stack: "AiError: 3040: Capacity temporarily exceeded, please try again. at Ai._parseError (cloudflare-internal:ai-api:102:24) at async Ai.run (cloudflare-internal:ai-api:82:19) at async Object.fetch (index.js:21:22)",
    name: "Error",
    message: "3040: Capacity temporarily exceeded, please try again.",
    timestamp: 1747050909290,
  },
  message: "3040: Capacity temporarily exceeded, please try again.",
  $workers: {
    truncated: false,
    event: {
      request: {
        url: "https://classify-ai.workers.dev/",
        method: "POST",
        path: "/"
      }
    },
    outcome: "exception",
    scriptName: "classify-ai",
    eventType: "fetch",
    executionModel: "stateless",
    scriptVersion: {
      id: "1678a794-81e9-48fd-8d8f-43f9d95d4235"
    },
    requestId: "93e9b9b6a8b802ad"
  },
  $metadata: {
    id: "01JV25JKKAQGM3BZG6KQDG9AA7",
    requestId: "93e9b9b6a8b802ad",
    trigger: "POST /",
    service: "classify-ai",
    level: "error",
    error: "3040: Capacity temporarily exceeded, please try again.",
    message: "3040: Capacity temporarily exceeded, please try again.",
    account: "f1c3db27dc01771c8c27ec175b404598",
    type: "cf-worker",
    fingerprint: "4bc40a10bc8b1073f4883173eeb61c8a",
    origin: "fetch",
    messageTemplate: "<NUM>: Capacity temporarily exceeded, please try again.",
    errorTemplate: "<NUM>: Capacity temporarily exceeded, please try again.",
  }
}

Monthly usage limit for image reached for your plan. Please upgrade.

⎔ Starting local server...
[wrangler:inf] Ready on http://localhost:8788
✘ [ERROR] OAuth error response: 401 invalid_token - Missing or invalid access token
[wrangler:inf] POST /sse 401 Unauthorized (6ms)
[wrangler:inf] GET /.well-known/oauth-authorization-server 200 OK (1ms)
[wrangler:inf] GET /authorize 302 Found (68ms)
[wrangler:inf] GET /callback 302 Found (871ms)
[wrangler:inf] GET /.well-known/oauth-authorization-server 200 OK (2ms)
✘ [ERROR] OAuth error response: 401 invalid_client - Client not found
[wrangler:inf] POST /token 401 Unauthorized (3ms)

const API_BASE_URL = "https://api.cloudflare.com/client/v4/accounts/{myID}/ai/run/"
const API_AUTH_TOKEN = "{myTOKEN}" // process.env.API_AUTH_TOKEN;
const model = "@cf/meta/llama-2-7b-chat-int8"
const headers = {
  'Authorization': `Bearer ${API_AUTH_TOKEN}`,
  // was 'application/type'; the Access-Control-* entries were removed since
  // those are response headers, not request headers
  'Content-Type': 'application/json'
}
if (!API_BASE_URL || !API_AUTH_TOKEN) {
  throw new Error('API credential is wrong or not configured from Github Action')
}
const inputs = [
  { 'role': 'system', 'content': systemPrompt },
  { 'role': 'user', 'content': userPrompt },
]
const payload = {
  messages: inputs // the chat endpoint expects "messages", not "message"
}
try {
  console.log("Requesting to LLM...")
  const response = await fetch(`${API_BASE_URL}${model}`, {
    method: 'POST',
    headers: headers,
    body: JSON.stringify(payload)
    // mode: 'no-cors' removed: it makes the response opaque and strips the Authorization header
  })
  if (!response.ok) {
    throw new Error(`Error request from LLM: ${response.status}`)
  }
  console.log("Requesting completed. Waiting for output...")
  const output = await response.json()
  console.log(output)
}
catch (error) {
  console.log("API Error", error)
  throw error;
}

The response string is the expected JSON followed by a couple hundred padded blank lines:

'{{{{ ...the expected response... } } ] } } } \n' +
'\n' +
' \n' +
[ ...the same ' \n' + line repeated dozens more times... ]
' '

resp, err := client.AI.Run(ctx, model, ai.AIRunParams{
	AccountID: cloudflare.F(cfAccountID),
	Body: ai.AIRunParamsBodyTextToImage{
		Prompt: cloudflare.F(prompt),
	},
})
if err != nil {
	return err
}
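That destination-type error happens because the text-to-image endpoints return raw image/png bytes rather than JSON, so the body has to be read as binary. The JavaScript-side equivalent is to branch on the Content-Type header before decoding. A sketch, with a made-up helper name and the same {myID}/{myTOKEN} placeholders as the fetch code above:

```javascript
// Hedged sketch: decode a Workers AI REST response based on Content-Type.
// Text-to-image models return raw image/png bytes; everything else is JSON.
async function readAiResponse(response) {
  const contentType = response.headers.get("content-type") ?? "";
  if (contentType.includes("image/png")) {
    // Binary image: read it as bytes, not JSON.
    return new Uint8Array(await response.arrayBuffer());
  }
  return response.json();
}

// Usage against the REST endpoint (placeholders as in the earlier snippet):
// const res = await fetch(
//   `https://api.cloudflare.com/client/v4/accounts/{myID}/ai/run/@cf/bytedance/stable-diffusion-xl-lightning`,
//   {
//     method: "POST",
//     headers: { Authorization: `Bearer {myTOKEN}` },
//     body: JSON.stringify({ prompt: "a lighthouse at dusk" }),
//   }
// );
// const image = await readAiResponse(res); // Uint8Array of PNG bytes
```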