Hi! I'm trying out the LLM worker and I'm wondering if it's possible to return the response as a stream, similar to https://developers.cloudflare.com/workers/examples/openai-sdk-streaming/ ?
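
For context, here's roughly the shape I'm hoping for, adapted from that example. This is just a sketch: the model name, the `OPENAI_API_KEY` binding, and the prompt are placeholders, since I'm not sure yet what the LLM worker actually exposes.

```ts
import OpenAI from "openai";

interface Env {
  OPENAI_API_KEY: string; // assumed secret binding, placeholder name
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const openai = new OpenAI({ apiKey: env.OPENAI_API_KEY });

    // Request a streamed completion; the SDK returns an async iterable of chunks.
    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini", // placeholder model
      messages: [{ role: "user", content: "Tell me a short story" }],
      stream: true,
    });

    // Pipe the token deltas into a ReadableStream so the client can read
    // the response incrementally instead of waiting for the full completion.
    const { readable, writable } = new TransformStream();
    const writer = writable.getWriter();
    const encoder = new TextEncoder();

    ctx.waitUntil(
      (async () => {
        for await (const chunk of completion) {
          await writer.write(encoder.encode(chunk.choices[0]?.delta?.content ?? ""));
        }
        await writer.close();
      })(),
    );

    // Return the readable side immediately; tokens arrive as they're generated.
    return new Response(readable, {
      headers: { "content-type": "text/plain; charset=utf-8" },
    });
  },
};
```

Is something along these lines supported, or does the LLM worker only return the full response once generation finishes?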