context_length_exceeded because one of our tools returns a large JSON array (600k+ chars). The interesting thing is that we do have TokenLimiter(128000) and ToolCallFilter() as inputProcessors, but those didn't help:
processInput runs once at the start -- it trims the conversation history, not the current step's tool results.
processInputStep runs at each step, but TokenLimiter drops older messages to make room -- it won't truncate the content of a single oversized tool-result message (since it's the most recent message, it gets kept).
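One workaround is a custom step processor that truncates the content of any oversized tool-result message before the token limiter sees it. This is just a minimal sketch: the `Message` shape, the `truncateToolResult` helper, and the character budget are all assumptions for illustration, not Mastra's actual API.

```typescript
// Hypothetical message shape -- an assumption, not the framework's real type.
interface Message {
  role: string;
  content: string;
}

// Assumed per-result budget; tune for your model's context window.
const MAX_TOOL_RESULT_CHARS = 20_000;

// Keep the head (most tool JSON front-loads the useful fields) plus a small
// tail, and insert a marker noting how much was dropped.
function truncateToolResult(
  msg: Message,
  maxChars: number = MAX_TOOL_RESULT_CHARS,
): Message {
  if (msg.role !== "tool" || msg.content.length <= maxChars) return msg;
  const head = msg.content.slice(0, Math.floor(maxChars * 0.8));
  const tail = msg.content.slice(-Math.floor(maxChars * 0.1));
  const dropped = msg.content.length - maxChars;
  return {
    ...msg,
    content: `${head}\n…[truncated ${dropped} chars]…\n${tail}`,
  };
}
```

Running this over every message at each step (before the token limiter) bounds any single tool result, so the limiter's "keep the most recent message" behavior can no longer blow the context on its own.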