Cloudflare Developers • 3y ago
3 replies
siggmus

How to make a Worker stream (an OpenAI) response back to the front-end?

I am trying to make a worker return chunks of data to the front-end instead of waiting for the full OpenAI result before responding. Currently, it does not return any data before the full response is received from OpenAI.

Any idea of how to solve this?

Worker code (copied from the Cloudflare docs example at https://developers.cloudflare.com/workers/examples/openai-sdk-streaming/, with minor adjustments to the header configuration to avoid CORS errors during development):

import OpenAI from "openai";

export default {
    async fetch(request, env, ctx) {
        const openai = new OpenAI({
            apiKey: "##-#######"
        })

        const stream = await openai.chat.completions.create({
            model: "gpt-3.5-turbo",
            messages: [{ role: "user", content: "Tell me a story using 900 chars." }],
            stream: true
        })

        let { readable, writable } = new TransformStream()

        let writer = writable.getWriter()
        const textEncoder = new TextEncoder()

        // This loop consumes the whole OpenAI stream before execution reaches
        // the return statement below, so nothing is sent until it finishes.
        for await (const part of stream) {
            console.log(part.choices[0]?.delta?.content || "")
            writer.write(textEncoder.encode(part.choices[0]?.delta?.content || ""))
        }

        writer.close()

        return new Response(readable, {
            status: 200,
            headers: {
                'Access-Control-Allow-Origin': '*',
                'Access-Control-Allow-Headers': 'Origin, X-Requested-With, Content-Type, Accept',
                'Content-Type': 'text/plain; charset=utf-8'
            }
        })
    }
}
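
The reason nothing streams is that the for await loop above drains the entire OpenAI response (and closes the writer) before the return new Response(readable) line ever runs. Below is a minimal sketch of one common fix: hand the pump loop to ctx.waitUntil and return the Response immediately, so chunks are flushed to the client as they arrive. The env.OPENAI_API_KEY binding name is an assumption for illustration; the redacted hard-coded key from the question would work the same way.

import OpenAI from "openai";

export default {
    async fetch(request, env, ctx) {
        // Assumed secret binding; substitute however the key is actually provided.
        const openai = new OpenAI({ apiKey: env.OPENAI_API_KEY })

        const stream = await openai.chat.completions.create({
            model: "gpt-3.5-turbo",
            messages: [{ role: "user", content: "Tell me a story using 900 chars." }],
            stream: true
        })

        const { readable, writable } = new TransformStream()
        const writer = writable.getWriter()
        const textEncoder = new TextEncoder()

        // Pump the OpenAI stream into the writable side without awaiting it here.
        // ctx.waitUntil keeps the work alive after the Response has been returned.
        ctx.waitUntil((async () => {
            for await (const part of stream) {
                await writer.write(textEncoder.encode(part.choices[0]?.delta?.content || ""))
            }
            await writer.close()
        })())

        // Returned right away; the client starts reading while the pump is still running.
        return new Response(readable, {
            status: 200,
            headers: {
                'Access-Control-Allow-Origin': '*',
                'Access-Control-Allow-Headers': 'Origin, X-Requested-With, Content-Type, Accept',
                'Content-Type': 'text/plain; charset=utf-8'
            }
        })
    }
}

With this shape the readable side is handed to the client immediately and each delta is written out as it comes off the OpenAI stream.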



Front-end code:

const onSubmit = async (data) => {
    const response = await fetch('CLOUDFLARE_END_POINT');

    const reader = response.body.getReader();

    while (true) {
        const { done, value } = await reader.read();

        if (done) {
            break;
        }
        console.log(new TextDecoder().decode(value));
    }
}
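
One detail worth noting on the reading side, sketched below: constructing a new TextDecoder per chunk can garble multi-byte UTF-8 characters that happen to be split across chunk boundaries. Reusing a single decoder with { stream: true } avoids that. The CLOUDFLARE_END_POINT placeholder is kept from the question.

const onSubmit = async (data) => {
    const response = await fetch('CLOUDFLARE_END_POINT');
    const reader = response.body.getReader();
    const decoder = new TextDecoder(); // one decoder reused across chunks

    while (true) {
        const { done, value } = await reader.read();
        if (done) {
            break;
        }
        // { stream: true } buffers an incomplete multi-byte sequence until the next chunk
        console.log(decoder.decode(value, { stream: true }));
    }
}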