AI tracing issue with Langfuse: prompt tracing not working

Running into an issue where tracing is not properly correlating with prompts, so I can't follow the linked generations in Langfuse. Here is the relevant Langfuse documentation for using the Vercel AI SDK, which does work. I have recreated the issue here: https://github.com/tommyOtsai/test-ai-tracing. Unsure if there is something wrong with my setup or if there is an issue with the Langfuse exporter.
(Linked Langfuse docs: Get started with Langfuse Prompt Management)
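For reference, the Langfuse exporter is registered on the Mastra instance roughly like the sketch below. This is based on the Mastra AI Tracing docs rather than the linked repo; the config key names, the realtime flag, and the agent import path are assumptions and may differ by version:

import { Mastra } from '@mastra/core/mastra';
import { LangfuseExporter } from '@mastra/langfuse';
// Hypothetical path; adjust to wherever the agent is defined.
import { weatherAgent } from './agents/weather-agent';

export const mastra = new Mastra({
  agents: { weatherAgent },
  observability: {
    configs: {
      langfuse: {
        serviceName: 'ai',
        exporters: [
          new LangfuseExporter({
            publicKey: process.env.LANGFUSE_PUBLIC_KEY,
            secretKey: process.env.LANGFUSE_SECRET_KEY,
            baseUrl: process.env.LANGFUSE_BASE_URL,
            // realtime flushes each trace immediately, which is useful in local dev.
            realtime: true,
          }),
        ],
      },
    },
  },
});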
7 Replies
Mastra Triager
Created a GitHub issue from this Discord post: "[DISCORD:1419427091061145713] AI tracing issue with Langfuse- promp..." (https://discord.com/channels/1309558646228779139/1419427091061145713)
Eric (4w ago)
Hi @tommy .. there are a few things incorrect with your setup, but even with the correct settings this isn't currently working.
I'll add this to the backlog of fixes for AI Tracing. You'll want a setup more like this:
import { openai } from '@ai-sdk/openai';
import { Agent, AgentGenerateOptions } from '@mastra/core/agent';
import { weatherTool } from '../tools/weather-tool';
import { LangfuseClient } from '@langfuse/client';

const langfuseClient = new LangfuseClient({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: process.env.LANGFUSE_BASE_URL,
});

// Fetch the managed prompt from Langfuse and read default generation options from its config.
const prompt = await langfuseClient.prompt.get('Weather Agent Prompt');
const config = prompt.config as { generationOptions: AgentGenerateOptions };

// Attach the Langfuse prompt metadata so traced generations can be linked back to the prompt.
const generateOptions = {
  ...config.generationOptions,
  tracingOptions: {
    metadata: {
      langfusePrompt: prompt.toJSON(),
    },
  },
};

export const weatherAgent = new Agent({
  name: 'Weather Agent',
  instructions: prompt.prompt,
  model: openai('gpt-5'),
  tools: { weatherTool },
  defaultGenerateOptions: generateOptions,
  defaultStreamOptions: generateOptions,
  defaultVNextStreamOptions: {
    tracingOptions: {
      metadata: {
        langfusePrompt: prompt.toJSON(),
      },
    },
  },
});
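As a usage note, the same options could also be passed per call instead of relying on the agent defaults. A hypothetical sketch reusing the variables above; result.text assumes the standard Mastra generate() return shape:

// Hypothetical per-call usage: pass the same Langfuse prompt metadata explicitly.
const result = await weatherAgent.generate('What is the weather in Berlin?', generateOptions);
console.log(result.text);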
But this just adds the metadata to the root span, which won't put it on the model span that Langfuse needs for prompt tracking.
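For comparison, the Vercel AI SDK integration referenced at the top of the thread (which does work) attaches the prompt via experimental_telemetry metadata, so it lands on the generation span that Langfuse reads for prompt linking. A minimal sketch, assuming the langfuse-vercel exporter, an OpenTelemetry NodeSDK setup, and a placeholder model; only the prompt name comes from the thread:

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';
import { NodeSDK } from '@opentelemetry/sdk-node';
import { LangfuseExporter } from 'langfuse-vercel';
import { LangfuseClient } from '@langfuse/client';

// Export AI SDK telemetry spans to Langfuse (reads LANGFUSE_* env vars).
const sdk = new NodeSDK({ traceExporter: new LangfuseExporter() });
sdk.start();

const langfuseClient = new LangfuseClient();
const prompt = await langfuseClient.prompt.get('Weather Agent Prompt');

const { text } = await generateText({
  model: openai('gpt-4o-mini'),
  prompt: prompt.compile(),
  experimental_telemetry: {
    isEnabled: true,
    metadata: {
      // This metadata is what Langfuse uses to link the generation to the managed prompt.
      langfusePrompt: prompt.toJSON(),
    },
  },
});

await sdk.shutdown();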
tommy (OP, 4w ago)
Yeah, sorry. I had adjusted the setup in my actual repository and forgot to update this example.
tommy (OP, 4w ago)
Thanks for taking a look, @Eric. I saw this post: https://github.com/mastra-ai/mastra/issues/8149. Is "(bug) Langfuse splitting traces by RunID" related to prompt tracing?
(Linked issue: [Timeline] AI Tracing (Observability), mastra-ai/mastra#8149: a list of the current known bugs and feature requests for AI Tracing and the order they will be worked through)
Eric (4w ago)
No, this is unrelated.
tommy (OP, 4w ago)
Is prompt tracing not currently on the roadmap?
Eric (4w ago)
It's on the roadmap, but quite far down the list. Here is everything I'm currently aware of: https://github.com/mastra-ai/mastra/issues/8149
