Mastra•4w ago
Bobapi

Langfuse integration via @mastra/langfuse - tags and cached token count

Current versions: "@mastra/core": "0.24.1", "@mastra/langfuse": "0.2.3", "@mastra/memory": "0.15.11", "@mastra/pg": "0.17.8".

I'm struggling to find info in the docs, and I have two issues:

1) I'd like to add a tag to my Langfuse span. I can't do it at the runtimeContext level, because the tag depends on the LLM response; I need to set it within the workflow step. I can add metadata, e.g.:
tracingContext.currentSpan?.update({
  metadata: {
    langfuseTags: [
      ...(reportSpecError ? ['report-validation-failed'] : []),
    ],
  },
});
BUT how can I add this as a tag that appears in the Langfuse Tracing view under the 'Tags' column? (One possible approach is sketched below.)

2) I'm using OpenRouter (which uses the OpenAI-compatible API by default, I believe). I can see the input and output tokens used in my workflows, but the costs are inaccurate because cached tokens don't seem to be sent with the trace. How can I include them?
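On the tags question: below is a minimal sketch of one possible workaround, not a confirmed @mastra/langfuse feature. It assumes the current Mastra span exposes the underlying trace ID (the traceId property is an assumption), that the Langfuse exporter reuses that same ID, and that Langfuse upserts traces by ID, so re-sending the trace with tags set attaches trace-level tags that appear in the Tags column:

import { Langfuse } from "langfuse";

// Reads LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY / LANGFUSE_BASEURL from the environment.
const langfuse = new Langfuse();

// Inside the workflow step, once the LLM response is known.
// Assumption: the current span exposes the ID of the trace it belongs to.
const traceId = tracingContext.currentSpan?.traceId;
if (traceId && reportSpecError) {
  // Langfuse upserts traces by ID, so re-creating the trace with the same
  // ID merges in the trace-level tags.
  langfuse.trace({
    id: traceId,
    tags: ["report-validation-failed"],
  });
  await langfuse.flushAsync(); // make sure the update is delivered
}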
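On the cached-token question: a sketch of how one might fetch cached-token counts from OpenRouter and report them to Langfuse by hand. It assumes OpenRouter's OpenAI-compatible usage accounting (the usage: { include: true } request flag and the prompt_tokens_details.cached_tokens response field) and a recent Langfuse JS SDK that accepts usageDetails on generations; the usage-detail key names are assumptions and should be checked against the model's pricing configuration in Langfuse:

// Ask OpenRouter for detailed usage accounting on the completion.
const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "openai/gpt-4o", // example model
    messages: [{ role: "user", content: "Validate the report spec." }],
    usage: { include: true }, // request detailed usage accounting
  }),
});
const data = await res.json();

const usage = data.usage ?? {};
// Cached prompt tokens, when the provider reports them (assumption: the
// field is populated for your model/provider combination).
const cachedTokens = usage.prompt_tokens_details?.cached_tokens ?? 0;

// Report a generation with explicit usage details so Langfuse can price
// cached input separately from fresh input. Key names are assumptions.
langfuse.generation({
  traceId, // reusing the trace ID from the sketch above
  name: "openrouter-completion",
  model: data.model,
  usageDetails: {
    input: (usage.prompt_tokens ?? 0) - cachedTokens,
    input_cached: cachedTokens,
    output: usage.completion_tokens ?? 0,
  },
});
await langfuse.flushAsync();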
Mastra Triager•4w ago
šŸ“ Created GitHub issue: https://github.com/mastra-ai/mastra/issues/10174 šŸ” If you're experiencing an error, please provide a minimal reproducible example to help us resolve it quickly. šŸ™ Thank you @Bobapi for helping us improve Mastra!
