So I'm working on an AI image gen app and I proposed a full-text search feature that would let users search all their generated images by prompt. The problem is our image collection has over 1 billion documents (MongoDB, btw). The team pushed back, saying they tried this in the past and it wasn't performant because the collection is too big, with queries sometimes taking up to 8 seconds.

I suspect that's because they were running the full-text search across the entire 1-billion-plus collection. But if we first filter down by userID and then run the full-text search on that much smaller set, it should be performant, right? I only know SQL databases, so I'm not sure, but wouldn't it work similarly in MongoDB? And if you know a more performant way to do this, please let me know.
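To make it concrete, here's roughly what I'm picturing, just as a sketch. I'm new to MongoDB, so the names here (`images` collection, `userId` and `prompt` fields) are placeholders and not our actual schema:

```typescript
import { MongoClient } from "mongodb";

// Sketch only: collection/field names are placeholders, not our real schema.
async function searchUserImages(userId: string, query: string) {
  const client = new MongoClient(process.env.MONGO_URI ?? "mongodb://localhost:27017");
  await client.connect();
  try {
    const images = client.db("app").collection("images");

    // Compound text index with an equality prefix on userId.
    // Queries using this index must include an equality match on userId,
    // so the text search only scans that one user's documents.
    // (In reality this would be a one-time migration, not done per query.)
    await images.createIndex({ userId: 1, prompt: "text" });

    // Full-text search scoped to a single user's images.
    return await images
      .find({ userId, $text: { $search: query } })
      .limit(50)
      .toArray();
  } finally {
    await client.close();
  }
}
```

From what I've read, MongoDB lets you put a regular field in front of the text field in a compound text index like this, which sounds like exactly the "filter by user first, then search" behavior I'm describing, but please correct me if I've got that wrong.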