We're on @mastra/[email protected], @mastra/[email protected], and @mastra/[email protected] with claude-haiku-4-5-20251001. We have cacheControl: { type: "ephemeral" } in our Anthropic provider options, and caching is working: the Anthropic Console dashboard confirms it (94% cache hit rate), and direct API calls show cache_creation_input_tokens / cache_read_input_tokens in the response.

The problem is that none of this reaches the mastra_ai_spans table. The spans only show inputTokens and outputTokens, with no inputDetails.cacheRead or inputDetails.cacheWrite.

From reading the source, extractUsageMetrics() in @mastra/observability reads providerMetadata.anthropic.cacheReadInputTokens (the inputDetails.cacheRead → cache_read_input_tokens mapping) and likewise cacheCreationInputTokens, which the provider attaches to providerMetadata on the message_stop event. Our suspicion: the step-finish chunk in streaming mode might not be carrying providerMetadata through to the observability handler at #endStepSpan(), so metadata?.providerMetadata is undefined when extractUsageMetrics is called. We're using agent.stream() with maxSteps: 5.
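To illustrate the failure mode we suspect, here is a minimal standalone sketch of the mapping we believe extractUsageMetrics() performs. This is not Mastra's actual code: the function name extractCacheDetails and the UsageMetrics shape are our own, and only the providerMetadata.anthropic field names come from what we observed. If providerMetadata is undefined at the call site, the cache details are silently dropped, which would explain exactly the spans we see.

```typescript
interface UsageMetrics {
  inputTokens: number;
  outputTokens: number;
  inputDetails?: { cacheRead?: number; cacheWrite?: number };
}

// Hypothetical re-implementation of the mapping for illustration only.
function extractCacheDetails(
  usage: { inputTokens: number; outputTokens: number },
  providerMetadata?: { anthropic?: Record<string, number> },
): UsageMetrics {
  const metrics: UsageMetrics = { ...usage };
  const anthropic = providerMetadata?.anthropic;
  // If providerMetadata never makes it into the step-finish chunk (our
  // suspected bug), this branch is skipped and the span is written with
  // plain token counts only.
  if (anthropic) {
    metrics.inputDetails = {
      cacheRead: anthropic.cacheReadInputTokens,
      cacheWrite: anthropic.cacheCreationInputTokens,
    };
  }
  return metrics;
}

// Metadata present: cache details survive into the span.
console.log(
  extractCacheDetails(
    { inputTokens: 120, outputTokens: 40 },
    { anthropic: { cacheReadInputTokens: 3000, cacheCreationInputTokens: 500 } },
  ),
);
// Metadata undefined (what we see in streaming): inputDetails is never set.
console.log(extractCacheDetails({ inputTokens: 120, outputTokens: 40 }));
```

Logging metadata?.providerMetadata on the step-finish chunks of an agent.stream() run should distinguish between the metadata being absent there versus being lost later inside #endStepSpan().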