How to stream responses when using LangChain in the new Route Handlers

I'm trying to build a ChatGPT-style app using LangChain, but I can't figure out how to stream the responses to the client.

This is how I have initialized OpenAI with LangChain:
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanChatMessage } from "langchain/schema";

const chat = new ChatOpenAI({
  streaming: true,
  callbacks: [
    {
      handleLLMNewToken(token: string) {
        process.stdout.write(token); // how to stream this response??
      },
    },
  ],
});

await chat.call([
  new HumanChatMessage("Write me a song about sparkling water."),
]);


I can get the tokens from the handleLLMNewToken callback: it is called with a string for every token (roughly a word) as it is generated.
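For reference, here's a minimal sketch of the bridging pattern I think I need — pushing each callback token into a web ReadableStream that a Route Handler could return as the Response body. The fakeModelCall helper below is just a stand-in for chat.call so the token flow is visible; the route path in the comment is an assumption on my part:

```typescript
// Sketch: bridging a per-token callback into a web ReadableStream,
// the same shape a Route Handler (e.g. app/api/chat/route.ts) would return.
// `fakeModelCall` is a stand-in for chat.call(): it invokes the callback
// once per token, just like handleLLMNewToken does.
async function fakeModelCall(onToken: (token: string) => void): Promise<void> {
  for (const token of ["Sparkling ", "water ", "song"]) {
    onToken(token);
  }
}

function tokenStream(): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder();
  return new ReadableStream({
    async start(controller) {
      // Each token the callback receives is enqueued into the stream,
      // so the client can read it as soon as it arrives.
      await fakeModelCall((token) => controller.enqueue(encoder.encode(token)));
      controller.close(); // signal end of stream once the model call finishes
    },
  });
}
```

In a Route Handler, I imagine the last step would be `return new Response(tokenStream())` — but I'm not sure this is the right way to wire it up with the real chat.call.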

Can someone please help me with how to stream this to the client using the new Route Handlers?