Anthropic prompt caching is not working with Mastra
my code:
I used it exactly the way the Mastra docs describe, but I'm still seeing cached usage as 0:
inputTokens: 90,567
outputTokens: 1,485
totalTokens: 92,052
cachedInputTokens: 0
I also tested the AI SDK with caching, and caching works there, but not in Mastra. Am I doing something wrong? Has it worked for anyone else?
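For comparison, this is a minimal sketch of how the AI SDK's Anthropic provider enables prompt caching: a `cacheControl` provider option on the message marks it as cacheable (this is the AI SDK convention; whether Mastra forwards it to the provider unchanged is exactly what's in question here, and the prompt text below is a placeholder):

```typescript
// AI SDK style: the provider-specific `cacheControl` option marks a
// message as cacheable on Anthropic's side. Whether Mastra forwards
// this option through to the provider is the open question in this post.
const messages = [
  {
    role: "system" as const,
    // Caching only pays off for large, stable prefixes (Anthropic also
    // enforces a minimum cacheable token count per model).
    content: "…large, stable system prompt that should be cached…",
    providerOptions: {
      anthropic: { cacheControl: { type: "ephemeral" as const } },
    },
  },
  { role: "user" as const, content: "Question about the cached context" },
];

// These messages would then be passed to generateText({ model, messages })
// or to a Mastra agent call. If caching takes effect, the second call
// onward should report a non-zero cachedInputTokens in usage.
console.log(messages[0].providerOptions?.anthropic.cacheControl.type);
```

If the option shape above is correct but Mastra still reports `cachedInputTokens: 0`, that would suggest the option is being dropped somewhere between the agent layer and the provider call.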