Yeah, but I can't find any info on the token limits.
@hf/thebloke won't generate a response if max_tokens is set to 597 or higher. (Fails with "3025: Unknown internal error".)

Did the runWithTools syntax change? With runWithTools I get:

Error in runWithTools: Error in runAndProcessToolCall: Bad input: must have required property 'prompt', must have required property 'name', must have required property 'description', must have required property 'parameters', must have required property 'type', must have required property 'properties', must match exactly one schema in oneOf, must match exactly one schema in oneOf

With @cf/qwen/qwen1.5-14b-chat-awq I only get Error in runWithTools: Error in runAndProcessToolCall: with nothing after the colon, making me think that maybe the "Bad input" part was a response from the API. My tool had parameters: {} — I got past the error by changing it to parameters: { type: "object", properties: {} }.
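A minimal sketch of the fix described above, assuming a hypothetical no-argument tool (the tool name and function here are made up; the point is that runWithTools expects `parameters` to be a full JSON Schema object, not an empty object):

```javascript
// Sketch: runWithTools rejects an empty `parameters: {}` because the tool
// schema must be a JSON Schema object. Tool name/function are hypothetical.
const brokenTool = {
  name: "getTime",
  description: "Returns the current time",
  parameters: {}, // fails validation: missing `type` and `properties`
  function: async () => new Date().toISOString(),
};

const fixedTool = {
  ...brokenTool,
  // A tool that takes no arguments still needs an explicit empty object schema:
  parameters: { type: "object", properties: {} },
};

console.log(fixedTool.parameters.type); // "object"
```

The `fixedTool` shape is what you'd pass in the `tools` array of a runWithTools call.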
Is the InferenceUpstreamError: undefined: undefined error with the raw parameter fixed? I'm getting the same error using @cf/microsoft/resnet-50 with a PNG and this code: https://gist.github.com/net-tech/0eb5e0caa05c6be931ab373cbb35f360
Does anyone have info on Intel/distilbert-base-uncased-finetuned-sst-2-english-int8-static-inc and how it's been trained?

export async function onRequest(context) {
const input = { prompt: "What is the origin of the phrase Hello, World" }
const answer = await context.env.AI.run('@cf/meta/llama-3.1-8b-instruct', input);
return Response.json(answer);
}

I'm seeing this in the logs: A ReadableStream branch was created but never consumed. Such branches can be created, for instance, by calling the tee() method on a ReadableStream, or by calling the clone() method on a Request or Response object. If a branch is created but never consumed, it can force the runtime to buffer the entire body of the stream in memory, which may cause the Worker to exceed its memory limit and be terminated. To avoid this, ensure that all branches created are consumed.
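That warning means one branch of a tee()'d (or clone()'d) stream was never read. A minimal sketch of handling both branches with plain Web Streams (runnable in Node 18+; the `drain` helper is just for illustration):

```javascript
// Drain a ReadableStream to completion, counting the bytes read.
async function drain(stream) {
  const reader = stream.getReader();
  let bytes = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    bytes += value.length;
  }
  return bytes;
}

async function main() {
  const source = new Blob(["hello, world"]).stream();
  const [a, b] = source.tee();

  // Read one branch, and explicitly cancel the other so the runtime
  // doesn't have to buffer its unread body in memory.
  const read = await drain(a);
  await b.cancel();

  console.log(read); // 12
}

main();
```

In a Worker, the same idea applies to response.clone(): either consume the clone's body or cancel it, rather than leaving it dangling.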