I use Llama 3.1, which has a 128k-token context window; the upgrade might be worth it in your use case if you are going over the limits.
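A quick way to tell whether you're likely to exceed the window is to estimate the token count of your prompt before sending it. This sketch uses a rough ~4-characters-per-token heuristic for English text (an assumption — exact counts require the model's actual tokenizer), with `CONTEXT_WINDOW` and `reserve_for_output` as illustrative values:

```python
# Rough heuristic: ~4 characters per token for typical English text.
# Actual counts vary; use the model's tokenizer for exact numbers.
CONTEXT_WINDOW = 128_000  # Llama 3.1's context window, in tokens

def estimate_tokens(text: str) -> int:
    """Cheap token estimate without loading a tokenizer."""
    return len(text) // 4

def fits_in_context(text: str, reserve_for_output: int = 1_000) -> bool:
    """Leave headroom for the model's reply when checking the limit."""
    return estimate_tokens(text) + reserve_for_output <= CONTEXT_WINDOW

prompt = "some long document " * 10_000  # ~190k characters
print(estimate_tokens(prompt))  # roughly 47k tokens, well under 128k
print(fits_in_context(prompt))
```

If the estimate comes in near the limit, it's worth switching to the real tokenizer for an exact count before deciding to truncate or upgrade.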