Workers intentionally don't support dynamically loading WASM for security reasons, it must be uploaded at deploy time.
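Concretely, the upload happens through wrangler's bundler: a rule in `wrangler.toml` tells it to ship `.wasm` files alongside the script at deploy time. A minimal sketch, assuming the default module-rules mechanism (the glob is an example):

```toml
# wrangler.toml: bundle WASM into the deploy; there is no runtime compile path.
[[rules]]
type = "CompiledWasm"
globs = ["**/*.wasm"]
```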
A change to the Workers Runtime must never break an application that is live in production.
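That guarantee is delivered through compatibility dates: runtime behavior changes are opt-in, so a deployed Worker keeps the semantics of the date it pinned even as the runtime updates underneath it. A minimal sketch (the date is an example):

```toml
# wrangler.toml: this Worker sees runtime behavior as of this date;
# advancing the date is how you opt in to newer semantics.
compatibility_date = "2024-01-01"
```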
Can I call `env.AI.run()` in `ctx.passThroughOnException` while still saving the result? `passThroughOnException` works; I just want my worker to respond properly when the models time out.

`passThroughOnException` is more intended for cases where you have an origin behind the worker.

On `max_tokens`: `@cf/qwen/qwen1.5-7b-chat-awq` has a context length way above 8k. Managed to chop up and squeeze in ~20k tokens (~100k characters) and ask a question about a detail buried in the beginning of the text.

`@cf/llava-hf/llava-1.5-7b-hf` and `@cf/unum/uform-gen2-qwen-500m` accept an image as input.

```typescript
const stream = await retrivalChain.stream({
  input: 'what is hello world',
});
return new Response(stream, {
  headers: {
    'content-type': 'text/event-stream',
    'Access-Control-Allow-Origin': '*',
  },
});
```

This fails to type-check with:

```
Argument of type 'IterableReadableStream<{ context: Document<Record<string, any>>[]; answer: string; } & { [key: string]: unknown; }>' is not assignable to parameter of type 'BodyInit | null | undefined'.
  Type 'IterableReadableStream<{ context: Document<Record<string, any>>[]; answer: string; } & { [key: string]: unknown; }>' is not assignable to type 'ReadableStream<Uint8Array>'.
    The types returned by 'getReader()' are incompatible between these types.
      Type 'ReadableStreamDefaultReader<{ context: Document<Record<string, any>>[]; answer: string; } & { [key: string]: unknown; }>' is not assignable to type 'ReadableStreamDefaultReader<Uint8Array>'.
        Type '{ context: Document<Record<string, any>>[]; answer: string; } & { [key: string]: unknown; }' is missing the following properties from type 'Uint8Array': BYTES_PER_ELEMENT, buffer, byteLength, byteOffset, and 27 more. ts(2345)
```
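One way around that type error, sketched under the assumption that each chunk carries an incremental `answer` string (as the error's inferred type suggests): pipe the object stream through a `TransformStream` that keeps only the text and encodes it to bytes before handing it to `Response`.

```typescript
// Sketch: LangChain's stream() yields objects, but Response wants bytes.
// Assumed chunk shape, based on the inferred type in the error message:
type ChainChunk = { context?: unknown; answer?: string };

// Keep only the incremental answer text and encode it to Uint8Array.
function toByteStream(source: ReadableStream<ChainChunk>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return source.pipeThrough(
    new TransformStream<ChainChunk, Uint8Array>({
      transform(chunk, controller) {
        if (typeof chunk.answer === 'string') {
          controller.enqueue(encoder.encode(chunk.answer));
        }
      },
    })
  );
}

// Stand-in for retrivalChain.stream() so the sketch is self-contained:
const source = new ReadableStream<ChainChunk>({
  start(controller) {
    controller.enqueue({ answer: 'hello ' });
    controller.enqueue({ answer: 'world' });
    controller.close();
  },
});

// In the Worker, the failing line becomes:
//   return new Response(toByteStream(stream), { headers: { ... } });
```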
The inferred type of `stream` is:

```typescript
const stream: IterableReadableStream<{
  context: Document<Record<string, any>>[];
  answer: string;
} & {
  [key: string]: unknown;
}>
```

A basic error-handling pattern for model calls:

```typescript
try {
  await env.AI.run(...)
} catch (e) {
  return new Response('handle errors here')
}
```
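Building on that, the earlier timeout question can be handled the same way: race the model call against your own timer so the Worker always responds, even when the model hangs. A sketch; the 15-second budget, error message, and 504 status are assumptions, not part of the Workers AI API.

```typescript
// Sketch: bound a slow model call with our own timer (budget is an assumption).
async function runWithTimeout<T>(task: Promise<T>, ms = 15_000): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error('model timed out')), ms);
  });
  try {
    // Whichever settles first wins; the loser is ignored.
    return await Promise.race([task, timeout]);
  } finally {
    clearTimeout(timer);
  }
}

// Usage inside fetch() (hypothetical model/input names):
//   try {
//     const result = await runWithTimeout(env.AI.run(model, input));
//     return Response.json(result);
//   } catch (e) {
//     return new Response('model error or timeout', { status: 504 });
//   }
```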