Mastra · 3mo ago
Aisha

Anthropic prompt caching is not working with Mastra

My code:
this.agent = new Agent({
  name: 'Agent',
  instructions: [
    {
      role: 'system',
      content: prompt(),
      providerOptions: {
        anthropic: { cacheControl: { type: 'ephemeral' } },
      },
    },
  ],
  model: wrappedModel,
})


I used it exactly the way the Mastra docs describe, but the cached token count is still 0:

inputTokens: 90,567
outputTokens: 1,485
totalTokens: 92,052
cachedInputTokens: 0

I also tested caching with the AI SDK directly, and it works there, but not through Mastra. Am I doing something wrong? Has it worked for anyone else?
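For reference, a minimal sketch of the direct AI SDK approach that does produce cache hits (assuming the `ai` and `@ai-sdk/anthropic` packages; the model id and system prompt are placeholders, not the actual values from my project):

import { generateText } from 'ai';
import { anthropic } from '@ai-sdk/anthropic';

// Placeholder: the real prompt must exceed Anthropic's minimum
// cacheable length (1024 tokens on Sonnet-class models).
const longSystemPrompt = '...';

const result = await generateText({
  model: anthropic('claude-3-5-sonnet-20241022'), // placeholder model id
  messages: [
    {
      role: 'system',
      content: longSystemPrompt,
      providerOptions: {
        anthropic: { cacheControl: { type: 'ephemeral' } },
      },
    },
    { role: 'user', content: 'Hello' },
  ],
});

// Cache activity is reported in the provider metadata,
// separately from the usage totals:
console.log(result.providerMetadata?.anthropic);
// fields like cacheCreationInputTokens / cacheReadInputTokens

Note the first request only writes the cache (`cacheCreationInputTokens`); `cachedInputTokens` / `cacheReadInputTokens` stay 0 until a second request reuses the same prefix.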