Tracing prompts with Langfuse
I am using Langfuse for prompt management, and I can consume the prompts just fine. However, I cannot get the tracing to link generations back to the prompts (and versions) created in Langfuse. Here is an example of how I set it up:
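(The original snippet is not reproduced in the thread; a minimal sketch of such a setup, with all names assumed, might look like this. The client and agent calls are shown as comments since they need live services; the templating step is a standalone stand-in for the prompt's `compile()`.)

```typescript
// Sketch of consuming a Langfuse-managed prompt as agent instructions.
// Assumed flow, not confirmed API:
// const prompt = await langfuse.getPrompt("header-auditing");
// const instructions = compilePrompt(prompt.prompt, { domain: "invoices" });
// const agent = new Agent({ name: "auditor", instructions, model });

// Substitute {{variable}} placeholders in a prompt template,
// mimicking what Langfuse's compile() does for text prompts.
function compilePrompt(template: string, vars: Record<string, string>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (_match, key: string) => vars[key] ?? "");
}
```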
However, this is still not working. I can just see the raw prompt in Langfuse. Do we have any other tracing integration with Langfuse?
4 Replies
Created GitHub issue: https://github.com/mastra-ai/mastra/issues/10172
If you're experiencing an error, please provide a minimal reproducible example to help us resolve it quickly.
Thank you @Karl for helping us improve Mastra!
Can you try this?
- Fetch the prompt via Langfuse Prompt Management, update the active span with promptName/promptVersion/promptId, and forward tracingContext into the agent call.
- Mastra's Langfuse exporter ships that span metadata to Langfuse, so the trace links to the managed prompt version instead of showing raw text.
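A minimal sketch of that suggestion follows. The metadata key names and client calls are assumptions taken from this reply, not confirmed Mastra/Langfuse API; the SDK calls are left as comments since they require live clients.

```typescript
// Suggested flow: 1) fetch the managed prompt, 2) put its identity into
// the active span's metadata, 3) forward tracingContext into the agent call.
//
// const prompt = await langfuse.getPrompt("header-auditing");
// tracingContext.currentSpan?.update({
//   metadata: withPromptLink({}, prompt),
// });
// await agent.generate(messages, { tracingContext });

// Pure helper: merge prompt-identity keys into existing span metadata.
// Key names (promptName/promptVersion/promptId) follow the reply above
// and are assumptions about what the exporter reads.
function withPromptLink(
  existing: Record<string, unknown>,
  p: { name: string; version: number; promptId?: string }
): Record<string, unknown> {
  return {
    ...existing,
    promptName: p.name,
    promptVersion: p.version,
    ...(p.promptId ? { promptId: p.promptId } : {}),
  };
}
```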
Thanks @Grayson. I think there is a mismatch in the function API: for example, there is no "langfuseClient.prompts", and the prompt object has no "promptId", only name and version. Anyway, I attempted:
But this is still not working.
The prompt object "instrctionsHeaderAuditing" has been predefined globally in another file.
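(The attempted snippet is not shown in the thread. Given the constraints mentioned, no `langfuseClient.prompts` namespace and only `name`/`version` on the prompt object, the corrected fetch presumably resembles this sketch; all identifiers here are assumptions.)

```typescript
// In the Langfuse JS SDK the fetch is a method on the client itself,
// and the returned prompt exposes name and version (no promptId):
// const prompt = await langfuse.getPrompt(promptName);
//
// With no promptId available, a name@version label is the only handle
// for linking. Pure stand-in for illustration:
function promptLabel(p: { name: string; version: number }): string {
  return `${p.name}@v${p.version}`;
}
```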
I think the ability to link prompts in langfuse hasn't been added yet: https://github.com/mastra-ai/mastra/pull/8762#issuecomment-3402806448
GitHub
Add langfuse prompt tracing by zack-ashen · Pull Request #8762 · ...
Description
This adds the ability to link prompts from llm generations to stored langfuse prompts. Specifically following this reference. When you add a Langfuse object to the metadata of a generat...