Genkit • 12mo ago • 11 replies
faint-white

Deploying chat with a flow

How do I use chat() in a flow on the backend so that I can talk to it from the frontend with runFlow() or streamFlow()? I'm trying to adapt code that uses generate() to use chat() instead.

Backend:

  async (request, streamingCallback) => {
    // Retrieve conversation history
    const history = await run(
      'retrieve-history',
      request.conversationId,
      async () => {
        return (await historyStore?.load(request.conversationId)) || [];
      }
    );

    // Run user prompt (with history)
    const mainResp = await ai.generate({
      prompt: request.prompt,
      messages: history,
      model: llms[request.llmIndex],
      streamingCallback,
    });

    await run(
      'save-history',
      {
        conversationId: request.conversationId,
        history: mainResp.messages,
      },
      async () => {
        await historyStore?.save(request.conversationId, mainResp.messages);
      }
    );
    return mainResp.text;
  }


Frontend:

import { streamFlow, runFlow } from 'genkit/client';

const response = await streamFlow({
  url,
  input: {
    prompt: input,
    conversationId: this.id,
    llmIndex: this.llmIndex,
  },
});
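
For what it's worth, the object returned by streamFlow() exposes both the chunk stream and the final flow result; a minimal consumption sketch on the frontend (the stream/output property names are assumed from the genkit/client API, so double-check against your installed version):

```typescript
import { streamFlow } from 'genkit/client';

const result = await streamFlow({
  url,
  input: {
    prompt: input,
    conversationId: this.id,
    llmIndex: this.llmIndex,
  },
});

// Render partial output as chunks arrive; each chunk's shape
// matches the flow's streamSchema.
for await (const chunk of result.stream) {
  console.log(chunk);
}

// Then await the flow's complete return value (the outputSchema type).
const finalText = await result.output;
```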


I have tried something like the code below, but the returned text does not change even when request.prompt changes. Is there a good example somewhere?

const routingAgentFlow = ai.defineFlow(
  {
    name: 'chatbotFlow',
    inputSchema: AgentInput,
    outputSchema: z.string(),
    streamSchema: GenerateResponseChunkSchema,
  },

  async (request) => {

    // Retrieve conversation history..
    
    const {response, stream}  = await chat.sendStream(request.prompt as string);
    const text = (await response).text;

    // Save conversation history..

    return text;
  }
);
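
One likely cause of the "returned text did not change" symptom is that the chat object is created once at module scope, so every request replays the same conversation state instead of picking up the new prompt's conversation. A sketch that loads or creates the session per request inside the flow (hedged: ai.loadSession / ai.createSession / session.chat come from Genkit's beta session API and may differ in your version; sessionStore is a hypothetical SessionStore instance, and AgentInput, llms, and ai are assumed from the earlier snippets):

```typescript
const chatbotFlow = ai.defineFlow(
  {
    name: 'chatbotFlow',
    inputSchema: AgentInput,
    outputSchema: z.string(),
    streamSchema: GenerateResponseChunkSchema,
  },
  async (request, streamingCallback) => {
    // Load the session for THIS conversation on every request, instead of
    // reusing one chat created at module load time. createSession is the
    // assumed fallback when no history exists yet.
    const session = await ai
      .loadSession(request.conversationId, { store: sessionStore })
      .catch(() =>
        ai.createSession({ sessionId: request.conversationId, store: sessionStore })
      );
    const chat = session.chat({ model: llms[request.llmIndex] });

    // Forward model chunks to streamFlow() clients, then return the final text.
    const { stream, response } = await chat.sendStream(request.prompt as string);
    for await (const chunk of stream) {
      streamingCallback?.(chunk);
    }
    return (await response).text;
  }
);
```

With a session store wired in this way, the chat API persists history itself, so the manual retrieve-history / save-history run() steps from the generate()-based flow should no longer be needed.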