Genkit • 16mo ago • 20 replies
rubber-blue


Hi Genkit team, I have a question for you. What's the best way to make LLM prompts and responses easily readable in the Genkit UI? Right now, everything is presented as raw JSON, which makes it very hard to read.

Right now, I have two workarounds:
1) open the LLM call in the model runner, or
2) return a formatted string value whenever I use runFlow, and access the actual return value of the function through TypeScript closures.

The issue with the model runner is that it only lets me view my prompt, not the response. The issue with the closure approach is that it's ugly and relies on side effects, which makes it easy to misuse and hard to test.
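
For reference, this is roughly what the closure workaround looks like. It's a minimal sketch based on the Genkit version I'm on; the flow name, prompt text, and model choice are just placeholders:

```ts
import { configureGenkit } from '@genkit-ai/core';
import { defineFlow, runFlow } from '@genkit-ai/flow';
import { generate } from '@genkit-ai/ai';
import { googleAI, gemini15Flash } from '@genkit-ai/googleai';
import { z } from 'zod';

configureGenkit({ plugins: [googleAI()] });

// The value I actually want back from the flow, captured via a closure.
let capturedSummary: string | undefined;

const summarizeFlow = defineFlow(
  {
    name: 'summarizeFlow',
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (text) => {
    const response = await generate({
      model: gemini15Flash,
      prompt: `Summarize this in one sentence:\n\n${text}`,
    });

    // Side effect: stash the real result in the enclosing scope...
    capturedSummary = response.text();

    // ...and return a formatted string so the trace in the Genkit UI
    // shows something readable instead of raw JSON.
    return `=== SUMMARY ===\n${capturedSummary}`;
  }
);

async function main() {
  const pretty = await runFlow(summarizeFlow, 'some long document text...');
  console.log(pretty);          // readable output (also what the UI shows)
  console.log(capturedSummary); // the value the rest of my code actually uses
}

main();
```

It works, but the flow's declared output no longer matches what my code really consumes, which is exactly the kind of side effect I'd like to avoid.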
[Screenshot attached: Screenshot_2024-09-25_at_8.04.50_AM.png]