I am looking at the results from context search (web and iOS) and I am a bit confused.
If I search for basic but very different things like "dog", "concert" or "beer" I do get quite good results in the beginning of the search results but then I also get all my other photos listed afterwards.
On my test installation I only have 14 photos:
1 beer can
6 from a concert
3 with a dog
1 of a vacuum cleaner
1 of a chair
If I search for "beer", the beer can shows first, but then my other 13 photos are shown after it. The same happens with "concert" or "dog": the good matches come first, but the rest of the photos don't match at all and are still shown.
So it kind of works, but it's like there is no threshold for how weak a match can be before it gets excluded from the results.
I first tried the default model, then changed to "immich-app/ViT-B-16-SigLIP2__webli" and re-ran the smart search job, but all images are still shown when context searching.
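My understanding (an assumption on my part, not something I found in the Immich docs) is that contextual search ranks every photo by embedding similarity to the query and returns the whole ranked list, instead of cutting off below some minimum score. A tiny sketch of the difference, with made-up similarity scores for my 14-photo library reduced to 5:

```python
# Hypothetical similarity scores between the query "beer" and each
# photo's embedding -- invented numbers, not real Immich output.
scores = {
    "beer_can.jpg": 0.31,
    "concert_1.jpg": 0.12,
    "dog_1.jpg": 0.10,
    "vacuum.jpg": 0.08,
    "chair.jpg": 0.07,
}

# Pure nearest-neighbour ranking: every photo is returned, best first.
# This matches what I'm seeing in the search results.
ranked = sorted(scores, key=scores.get, reverse=True)
print(ranked)

# With a minimum-similarity cutoff, weak matches would be dropped.
# The 0.2 value here is just an illustrative guess.
THRESHOLD = 0.2
filtered = [name for name in ranked if scores[name] >= THRESHOLD]
print(filtered)
```

With the ranking-only approach the beer can comes first but the chair still shows up at the bottom; with a cutoff, only the beer can would be returned. Is there a setting for something like this threshold, or is returning everything the intended behaviour?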