Anthropic System Prompt Caching Regression
It seems that Anthropic prompt caching has regressed relative to the underlying AI SDK implementation.
I read the source code for the Agent class in Mastra, and it seems that in multiple places the providerOptions are not passed through for system role messages.

The workaround would have been to pass the system role messages as part of the message list when .streamVNext is called, as below.
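Roughly like this (a sketch rather than my exact code; it assumes .streamVNext accepts AI SDK-style messages with per-message providerOptions, and uses the cacheControl key from @ai-sdk/anthropic):

```ts
import { Agent } from '@mastra/core/agent';
import { anthropic } from '@ai-sdk/anthropic';

const LONG_SYSTEM_PROMPT = '/* long, cacheable system prompt (>1024 tokens) */';

const agent = new Agent({
  name: 'cached-agent',
  instructions: 'placeholder', // required, even though the real prompt is passed below
  model: anthropic('claude-sonnet-4-20250514'),
});

// Pass the system prompt as a message with Anthropic cache control attached,
// instead of relying on the Agent's `instructions` field.
const stream = await agent.streamVNext([
  {
    role: 'system',
    content: LONG_SYSTEM_PROMPT,
    providerOptions: {
      anthropic: { cacheControl: { type: 'ephemeral' } },
    },
  },
  { role: 'user', content: 'Hello!' },
]);
```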
But it seems that Mastra has some proprietary logic wrapped around the AI SDK that omits the added providerOptions.

To be sure this was not an AI SDK issue, I tried the following code, and it succeeded at caching my system prompt:
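Approximately this (model id and prompt are placeholders; the relevant part is the per-message providerOptions with the Anthropic cacheControl option):

```ts
import { anthropic } from '@ai-sdk/anthropic';
import { streamText } from 'ai';

const LONG_SYSTEM_PROMPT = '/* long, cacheable system prompt (>1024 tokens) */';

const result = streamText({
  model: anthropic('claude-sonnet-4-20250514'),
  messages: [
    {
      role: 'system',
      content: LONG_SYSTEM_PROMPT,
      // Cache the system prompt via the Anthropic provider option.
      providerOptions: {
        anthropic: { cacheControl: { type: 'ephemeral' } },
      },
    },
    { role: 'user', content: 'Hello!' },
  ],
});

for await (const textPart of result.textStream) {
  process.stdout.write(textPart);
}

// Provider metadata reports cache usage (e.g. anthropic.cacheCreationInputTokens).
console.log(await result.providerMetadata);
```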
I am using Mastra version 0.16.0 with AI SDK v5 (5.0.33).

I may be doing something wrong, or may have misread the source code; there is a lot in the Agent class pertaining to memory, title generation, and other Mastra features, which may have led me to the wrong conclusion.
As a sidenote, it would also be nice to be able to provide these providerOptions somewhere when initializing the Agent class. Since instructions is a required param, I can't leave it blank when using the workaround above.
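Something along these lines (purely hypothetical; this option does not exist today, it just illustrates what I mean):

```ts
// Hypothetical API sketch, not something Mastra supports today.
// (Same imports and constants as the sketches above.)
const agent = new Agent({
  name: 'cached-agent',
  instructions: LONG_SYSTEM_PROMPT,
  model: anthropic('claude-sonnet-4-20250514'),
  // e.g. provider options that Mastra would attach to the system message
  // it builds from `instructions`:
  instructionsProviderOptions: {
    anthropic: { cacheControl: { type: 'ephemeral' } },
  },
});
```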