MastraAI2mo ago
Raid55

Anthropic System Prompt Caching Regression

It seems that Anthropic prompt caching has regressed from the AI SDK implementation. I read the source code for the Agent class in Mastra, and it seems that in multiple places the providerOptions are not passed through for system-role messages. The workaround would have been to pass the system-role messages as part of the message list when .streamVNext is called, as below:
const stream = await agentDouble0Seven.streamVNext(
[
{
role: "system",
providerOptions: {
anthropic: { cacheControl: { type: "ephemeral" } }, // what enables the "checkpoint" caching that anthropic uses
},
content: `...Long System Prompt over 1.2k tokens (minimum required for caching)...`,
},
{
role: "user",
content: "Do the cool AI thing you do",
},
],
{
format: "aisdk",
},
);
But it seems that Mastra has some proprietary logic wrapped over the AI SDK that omits the added providerOptions. To be sure this was not an AI SDK issue, I tried the following code, and it succeeded at caching my system prompt:
const result = await streamText({
model: anthropic("claude-sonnet-4-20250514"),
messages: [
{
role: "system",
providerOptions: {
anthropic: { cacheControl: { type: "ephemeral" } },
},
content: `...Long System Prompt over 1.2k tokens (minimum required for caching)...`,
},
{
role: "user",
content: "Do the cool AI thing you do",
},
],
});
I am using Mastra version 0.16.0 with AI SDK v5 (5.0.33). I may be doing something wrong, or may have just misread the source code; there is a lot pertaining to memory, title generation, and other Mastra features in the Agent class, which may have led me to the wrong conclusion. As a sidenote, it would also be nice to be able to provide this providerOption somewhere when initializing the Agent class. Since instructions is a required param, I can't leave it blank and use the method above.
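One way to tell whether the cacheControl option actually reached Anthropic is to inspect the provider metadata the AI SDK returns after a call. This is a hedged sketch: the field names below (cacheCreationInputTokens, cacheReadInputTokens under providerMetadata.anthropic) are what the Anthropic provider is documented to report, but verify them against your AI SDK version; the helper function itself is hypothetical.

```typescript
// Sketch: classify Anthropic cache activity from provider metadata.
// Field names are assumptions based on the AI SDK Anthropic provider;
// check them against your installed version before relying on this.
type AnthropicCacheUsage = {
  cacheCreationInputTokens?: number;
  cacheReadInputTokens?: number;
};

function describeCacheUsage(meta: AnthropicCacheUsage): string {
  const created = meta.cacheCreationInputTokens ?? 0;
  const read = meta.cacheReadInputTokens ?? 0;
  if (created > 0) return `cache written (${created} tokens)`;
  if (read > 0) return `cache hit (${read} tokens)`;
  return "no cache activity - providerOptions may have been dropped";
}

// After a call like:
//   const result = await streamText({ model, messages });
//   const meta = (await result.providerMetadata)?.anthropic;
// pass that metadata object to describeCacheUsage.
console.log(describeCacheUsage({ cacheCreationInputTokens: 1500 }));
// → "cache written (1500 tokens)"
```

If the first call reports a cache write and a repeated identical call reports a cache read, caching is working; if both report no cache activity, the cacheControl option was likely stripped somewhere between your code and the API request.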
5 Replies
Raid55
Raid55OP2mo ago
Yes, the AI-SDK functionality works; it's the Mastra agents that do not. The snippet you shared is the AI SDK (you are calling the LLM with generateText, which is an AI SDK function). I'd like to point out that since I posted this, I think the issue has to do with the MessageList class, since I am using that to create a list of messages and my tool caching seems not to be working either.
_roamin_
_roamin_2mo ago
There is a pending PR that would allow providing system messages as instructions, which should help with this issue: https://github.com/mastra-ai/mastra/pull/6845. You should still be able to provide the providerOptions on streamVNext directly though, like this:
const response = await agent.streamVNext("...",
{
providerOptions: {
anthropic: { cacheControl: { type: "ephemeral" } },
},
}
);
GitHub
Instructions core message by steven-range · Pull Request #6845 · ...
Description Updates the Agent's instructions parameter to accept CoreSystemMessage and CoreSystemMessage[] types alongside strings, enabling providerOptions on instructions without requirin...
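Per the PR description, instructions would accept a CoreSystemMessage (or an array of them) instead of only a string. This is a hypothetical sketch of what that shape could look like once the PR lands; the object layout is an assumption based on the PR text and the AI SDK's system-message format, not merged Mastra API.

```typescript
// Hypothetical: instructions as a CoreSystemMessage carrying providerOptions,
// per the (unmerged) PR #6845. Field names are assumptions from the PR description.
const cachedInstructions = {
  role: "system" as const,
  content: "...Long System Prompt over the caching minimum...",
  providerOptions: {
    anthropic: { cacheControl: { type: "ephemeral" as const } },
  },
};

// If the PR is merged, this object could be passed where a string is required today:
//   const agent = new Agent({ name: "agent007", instructions: cachedInstructions, model });
console.log(cachedInstructions.providerOptions.anthropic.cacheControl.type);
// → "ephemeral"
```

This would address the OP's sidenote: instructions is currently a required string param, so there is no place to attach providerOptions at Agent initialization.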
Mastra Triager
Mastra Triager2mo ago
GitHub
[DISCORD:1417254706740461659] Anthropic System Prompt Caching Regre...
This issue was created from Discord post: https://discord.com/channels/1309558646228779139/1417254706740461659 It seems that Anthropic Prompt caching is being regressed from the AI-SDK implementati...