Langfuse integration via mastra/langfuse - tags and cached token count
Current versions:
"@mastra/core": "0.24.1",
"@mastra/langfuse": "0.2.3",
"@mastra/memory": "0.15.11",
"@mastra/pg": "0.17.8",
I'm struggling to find info on this in the docs. I have two issues:
1) I'd like to add a tag to my Langfuse span. I can't do it at the runtimeContext level because the tag depends on the LLM response, so I need to set it from within the workflow step. I can add metadata there,
BUT how can I add this as a tag that appears within the Langfuse Tracing view under the 'tag' column?
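One possible angle, as a sketch rather than a confirmed answer: Langfuse's OpenTelemetry ingestion promotes certain span attributes to trace-level fields, and `langfuse.trace.tags` is the attribute it maps to the Tags column. Whether `@mastra/langfuse` forwards arbitrary span attributes set mid-step is an assumption to verify; the helper below only builds the attribute map, and the commented usage (via `@opentelemetry/api`'s `trace.getActiveSpan()`) is hypothetical.

```typescript
// Assumption: Langfuse's OTel mapping reads trace tags from the span
// attribute "langfuse.trace.tags". A JSON-encoded string array is the
// safe OTel-compatible attribute value.
type OtelAttributes = Record<string, string>;

function langfuseTagAttributes(tags: string[]): OtelAttributes {
  return { "langfuse.trace.tags": JSON.stringify(tags) };
}

// Hypothetical usage inside a workflow step, after the LLM response:
//   import { trace } from "@opentelemetry/api";
//   const span = trace.getActiveSpan();
//   const attrs = langfuseTagAttributes([deriveTagFromLlmOutput(result)]);
//   for (const [k, v] of Object.entries(attrs)) span?.setAttribute(k, v);

const attrs = langfuseTagAttributes(["needs-review"]);
console.log(attrs["langfuse.trace.tags"]); // → ["needs-review"]
```

If `@mastra/langfuse` does not pass span attributes through, the fallback would be querying the trace ID and calling Langfuse's REST API to patch tags after the fact.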
2) I'm using OpenRouter (OpenAI SDK by default?). I can see input and output tokens in my workflow traces, but the computed costs are inaccurate because cached token counts don't seem to be sent with the trace. How can I include them?
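For context on where the cached count lives: OpenRouter's OpenAI-compatible responses report cached prompt tokens under `usage.prompt_tokens_details.cached_tokens`, and Langfuse accepts a flexible map of usage types so cached reads can be costed separately. Below is a minimal sketch of reshaping that usage object yourself, assuming Mastra exposes a way to attach custom usage to the span; the key name `input_cached_tokens` is illustrative, not a confirmed Langfuse key.

```typescript
// Assumption: the provider returns OpenAI-style usage with an optional
// prompt_tokens_details.cached_tokens field (as OpenRouter does).
interface OpenAIUsage {
  prompt_tokens: number;
  completion_tokens: number;
  prompt_tokens_details?: { cached_tokens?: number };
}

// Split the prompt into uncached vs cached portions so each can be
// priced at its own rate. "input_cached_tokens" is an assumed key name;
// check the Langfuse usage-details docs for the exact one.
function toUsageDetails(usage: OpenAIUsage): Record<string, number> {
  const cached = usage.prompt_tokens_details?.cached_tokens ?? 0;
  return {
    input: usage.prompt_tokens - cached, // uncached prompt tokens
    input_cached_tokens: cached,         // assumed key, see lead-in
    output: usage.completion_tokens,
  };
}

const details = toUsageDetails({
  prompt_tokens: 1200,
  completion_tokens: 300,
  prompt_tokens_details: { cached_tokens: 1000 },
});
console.log(details); // → { input: 200, input_cached_tokens: 1000, output: 300 }
```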
Created GitHub issue: https://github.com/mastra-ai/mastra/issues/10174
If you're experiencing an error, please provide a minimal reproducible example to help us resolve it quickly.
Thank you @Bobapi for helping us improve Mastra!