Has anyone had success setting up Sessions & Users in Langfuse using Mastra
I'm trying to create observability with Sessions (i.e. Mastra threads) and Users (i.e. Mastra Resources). Has anyone had success with this so far? I was able to configure it to get basic LLM traces, but ideally I'd like to view them as threads so it's easier to track my users' conversations (internal company use case).
I've found a bunch of different packages/frameworks for doing this (i.e. langfuse-vercel vs. mastra/langfuse vs. plain OpenTelemetry) and it's difficult to reconcile their documentation.
If anyone else is also trying to solve this, let's collab.
8 Replies
I was able to sort it out. If anyone needs help here, let me know 👍
Hi @montel9705 ! Would you be able to share your solution? I don't think we have this documented yet, so it might be helpful for other people. Thanks 🙏
📝 Created GitHub issue: https://github.com/mastra-ai/mastra/issues/8170
@Romain
I just added a comment to the GitHub issue for others to review/use/improve. I put together a small observability.ts file that wires up Langfuse so I can capture traces, logs, and spans. Would love your feedback on whether I'm approaching this the right way.
One thing I'm still working through is aligning userId attribution. Right now I'm pulling resourceId from Mastra's Memory class, but in Langfuse everything shows up under the agent rather than the end user. Ideally, I'd like each session/trace to propagate the userId so we can track conversations at the user/thread level (all of our users are on Mastra Cloud, internal POC use case).
Hey @montel9705, have you managed to get scores into Langfuse that are produced by Mastra scorers? 🤓
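For context on the userId point above: in Langfuse, the Users and Sessions views are driven by the trace-level userId and sessionId fields, so the goal is to write Mastra's resourceId and threadId into those two fields on every trace. A minimal sketch with the plain langfuse JS SDK (the resourceId/threadId arguments are placeholders for whatever your request handler already has, not Mastra API calls):

```ts
import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: process.env.LANGFUSE_BASEURL, // omit to default to Langfuse Cloud
});

// Mastra resource -> Langfuse user, Mastra thread -> Langfuse session.
export function startConversationTrace(resourceId: string, threadId: string) {
  return langfuse.trace({
    name: "agent-conversation",
    userId: resourceId,   // populates the Users view
    sessionId: threadId,  // groups traces in the Sessions view
    metadata: { source: "mastra" },
  });
}

// Remember to flush buffered events before the process exits:
// await langfuse.flushAsync();
```

Whichever route you take (langfuse-vercel, mastra/langfuse, or a custom exporter), these are the two trace fields it ultimately has to populate for the Sessions/Users views to show your threads and end users.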
We built a custom exporter to handle Langfuse correctly, and now we are able to trace usage per step, thread, project, and user.
That's amazing @Stefano Denti ! Do you think you could share your code/how you built this? We got most of this handled, but the UserId isn't working and there are a lot of excess logs. It's getting the job done, but we could definitely improve it
I haven't tried to build this out yet
I'm attaching the code for my custom exporter, which basically extends AITracingExporter.
At this point, we use a custom context based on AsyncLocalStorage to have thread-safe data to insert into the exporter. Mastra's runtimeContext, on the other hand, has a lifecycle that's not compatible (as far as we've understood).
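To make the AsyncLocalStorage approach concrete, here is a generic sketch (not the actual exporter code from above; the TraceContext shape and helper names are invented for the example). The idea is to open a context at the request boundary so the exporter can later read userId/threadId even though it never receives them as arguments:

```ts
import { AsyncLocalStorage } from "node:async_hooks";

// Hypothetical per-request context; adjust the fields to what your exporter needs.
interface TraceContext {
  userId: string;    // e.g. Mastra resourceId
  sessionId: string; // e.g. Mastra threadId
}

const traceContext = new AsyncLocalStorage<TraceContext>();

// Wrap each incoming request / agent invocation so everything running in the
// same async chain (including the exporter) sees the same context.
export function withTraceContext<T>(ctx: TraceContext, fn: () => Promise<T>): Promise<T> {
  return traceContext.run(ctx, fn);
}

// Called from inside the exporter: the context active when the traced work
// started, or undefined if the call wasn't wrapped.
export function currentTraceContext(): TraceContext | undefined {
  return traceContext.getStore();
}
```

At the request boundary you would then wrap the agent call with something like `withTraceContext({ userId: resourceId, sessionId: threadId }, () => ...)`, and inside the exporter call `currentTraceContext()` to fill in the Langfuse userId/sessionId before emitting the trace.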
I hope this helps — feel free to reach out if you need anything, I’ll be happy to help if I can.
I started from this source to create our custom version — here’s the link:
https://github.com/mastra-ai/mastra/blob/2f4be87e1e94547bc7bb6f5340b467ec24c7e884/observability/langfuse/src/ai-tracing.ts