Your chat is too long.
I am working on a project as a beginner, and after a few days of working on it I got this message:
Your chat is too long. You can try shortening your last message or starting a new chat.
If the issue persists, contact support and provide the following: ( THERE IS A TRACE ID HERE - I didn't know if I was supposed to share that)
Is there any solution to that?
2 Replies
that’s a limitation of LLMs: they have a limit on how many tokens you can pass to them. whenever you send a message, you also send the entire history (your prompts and the model’s responses)
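to see why the limit sneaks up on you, here's a rough sketch of how the sent-token count grows each turn. the ~4-characters-per-token heuristic and the tiny `limit` are made up for illustration, not real model numbers:

```python
# Every new message re-sends the FULL history, so token usage grows
# with each turn even if your individual messages stay short.

def approx_tokens(text: str) -> int:
    # crude heuristic: roughly 4 characters per token (not a real tokenizer)
    return max(1, len(text) // 4)

history = []
limit = 50  # hypothetical context window; real models allow far more

turns = [
    ("user", "How do I start my project?"),
    ("assistant", "First, set up your environment..." * 5),  # long reply
    ("user", "Thanks! What about the database?"),
]

for role, text in turns:
    history.append(text)
    # what actually gets sent: the entire history, not just the new message
    total = sum(approx_tokens(t) for t in history)
    print(f"after {role} turn: ~{total} tokens sent")
    if total > limit:
        print("-> over the limit: 'Your chat is too long'")
```

note it's the assistant's long reply, not your short follow-up, that eats most of the budget — which is why the "NO YAPPING" trick below actually helps.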
two suggestions:
tell the model not to produce so many tokens. this makes the history shorter. i do this by saying “NO YAPPING”, but after like three messages it starts yapping again
branch off into new threads. you can pick a message, branch it, and now the model only has the context up to the branch point. this is also helpful for not carrying unnecessary context that the model might think is important but isn’t
You can also try "compacting" the conversation by telling the LLM to summarize the chat.
Here's a generic prompt for that:
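Something along these lines works (the exact wording is just an example — adjust it to your project):

```text
Summarize this entire conversation into a compact brief I can paste
into a new chat: the project goal, key decisions made so far, the
current state of the code, and what we were working on last.
Be concise, but don't drop anything I'd need to continue.
```

Then paste the summary as the first message of a fresh chat.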
That should be used as a last resort though; the actual solution is to keep each conversation to one topic.