Since upgrading `@cloudflare/ai` from 1.0.53 to 1.1.0, using the `stream: true` option when invoking `ai.run(…)` no longer streams the response as expected. Instead, almost no messages arrive until the very end, when the entire response suddenly streams in at once (still broken into chunks, but in extremely rapid succession). Here's how I'm using it:
```js
import { Ai } from '@cloudflare/ai';

const ai = new Ai(env.AI);
const model = '@cf/openchat/openchat-3.5-0106';

// Request a streamed response from Workers AI
const stream = await ai.run(model, {
    messages: getMessagesFromPrompt({ model, prompt }),
    stream: true,
});

// Forward the stream to the client as server-sent events
return new Response(stream, {
    headers: { 'content-type': 'text/event-stream' },
});
```