Server Side Streaming Support? (SSE)

Hi! I was wondering if the HTTP routes or the actions of gadget support server side streaming? So for example when streaming an LLM response over HTTP SSE? Kind regards, Dane
4 Replies
Chocci_Milk
Chocci_Milk4w ago
Hello! What's your use case? Would our current streaming capabilities not be enough? https://docs.gadget.dev/guides/http-routes/common-use-cases#ai-response-streaming
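For context, server-sent events are just a plain-text wire format (`Content-Type: text/event-stream`): each event is one or more `data:` lines, optionally preceded by an `event:` name, terminated by a blank line. A minimal encoder sketch in TypeScript (the `encodeSSE` helper is hypothetical, not a Gadget API; in a Gadget HTTP route you'd write frames like these to the reply stream, per the guide linked above):

```typescript
// Encode one SSE event per the WHATWG spec: optional "event:" line,
// one "data:" line per line of payload, terminated by a blank line.
function encodeSSE(data: string, event?: string): string {
  const dataLines = data
    .split("\n")
    .map((line) => `data: ${line}`)
    .join("\n");
  return (event ? `event: ${event}\n` : "") + dataLines + "\n\n";
}

// A named event could carry an agent substep like "thinking…":
// encodeSSE("thinking…", "status") → "event: status\ndata: thinking…\n\n"
```

Because the `event:` field lets you name each frame, substep updates and token chunks can share one stream and be dispatched separately on the client with `EventSource`'s `addEventListener`.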
Käse
KäseOP4w ago
Ah, I didn't know this was a thing! Will it also work with the OpenAI Agents SDK (TypeScript), or just regular calls to OpenAI? The Agents SDK allows for streaming substeps too, such as "thinking…" or "searching the web…". It would be great if that's supported.
Chocci_Milk
Chocci_Milk4w ago
I'm not sure; I haven't tried it out. I don't see why it wouldn't work, but you'd need to test and see. I'm going to mark this thread as closed. Let us know if you have any more questions.