Langfuse integration
Since upgrading to Mastra 0.19.1, the Langfuse integration has stopped showing data properly (no input, cost, or tracing).
Created GitHub issue: https://github.com/mastra-ai/mastra/issues/8874
[DISCORD:1427945003027665006] Langfuse integration · Issue #8874 ...
This issue was created from Discord post: https://discord.com/channels/1309558646228779139/1427945003027665006 Since upgrading to mastra 0.19.1 the langfuse integration has stopped showing data pro...
Hi @jack! Are you using the new AI tracing? If not, you should migrate to this new implementation: https://mastra.ai/en/docs/observability/ai-tracing/overview
AI Tracing | Observability | Mastra Docs
Set up AI tracing for Mastra applications
Sorry, you can close this. I am using AI tracing, but my Mastra Langfuse package was very outdated. Moving to the latest packages across the board fixed it.
Awesome, thanks for letting me know!
No problem! In an unrelated follow-up: I'm finding Mastra is quite slow for smaller tasks (that's just an inherent cost of an agent, right? haha). I was thinking of routing smaller tasks directly to the system via NLP or similar, falling back to the agent otherwise. One problem with that approach is that the "fast routed" messages and their synthetic responses aren't actually threaded through Mastra, so they won't show up in the full chat history / message context if the user continues talking, etc.
Wondering if you've seen this problem and have a solution? Or, knowing Mastra better than I do, whether there's a good workaround here?
Hey @jack! You could directly inject these messages into a thread created in Mastra. You can access the agent's memory and call saveMessage on it with a threadId and a resourceId.
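To make the suggestion above concrete, here is a minimal sketch of the "fast route with memory injection" pattern. Note this is an illustration only: `InMemoryThreadStore`, `fastRoute`, `handleTurn`, and the message shape are hypothetical stand-ins, not the real Mastra API. In an actual Mastra app you would call saveMessage on the agent's memory (as described above) instead of the toy store here.

```typescript
// Message shape keyed the way the suggestion above describes:
// every message carries a threadId and a resourceId.
type Message = {
  threadId: string;
  resourceId: string;
  role: "user" | "assistant";
  content: string;
};

// Toy in-memory store standing in for the agent's memory.
class InMemoryThreadStore {
  private messages: Message[] = [];

  saveMessage(msg: Message): void {
    this.messages.push(msg);
  }

  getThread(threadId: string): Message[] {
    return this.messages.filter((m) => m.threadId === threadId);
  }
}

// A trivial "fast router": handles small fixed intents without the agent.
// Returns null to signal a fallback to the full agent.
function fastRoute(input: string): string | null {
  if (/^(hi|hello)\b/i.test(input)) return "Hello! How can I help?";
  return null;
}

// Handle one user turn: try the fast path, otherwise fall back to the
// (stubbed) agent. Either way, BOTH sides of the exchange are saved to
// the same thread, so later agent turns see the complete history.
function handleTurn(
  store: InMemoryThreadStore,
  threadId: string,
  resourceId: string,
  userInput: string,
): string {
  store.saveMessage({ threadId, resourceId, role: "user", content: userInput });

  const fast = fastRoute(userInput);
  const reply = fast ?? `(agent) processed: ${userInput}`;

  store.saveMessage({ threadId, resourceId, role: "assistant", content: reply });
  return reply;
}

const store = new InMemoryThreadStore();
handleTurn(store, "thread-1", "user-42", "hello");          // fast path
handleTurn(store, "thread-1", "user-42", "summarize this"); // agent path
console.log(store.getThread("thread-1").length);            // full history kept
```

The key point is that the fast path writes into the same thread the agent reads from, so "fast routed" turns stay visible in later context.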
I'd love to know what makes Mastra feel slow, we're always looking at improving the framework!