
Search Sorting & Relevance

Order smart search results by date or relevance, and dial in a similarity threshold so filtered searches only return photos that actually look right.

[Screenshot: Smart search in a Travel Memories space returning nature photos, with the sort dropdown visible in the top bar]

Relevance isn't always what you want

CLIP smart search ranks results by visual similarity — useful when you're looking for "the best match", less useful when you're trying to find when something happened. Searching "birthday cake" should let you ask "show me the most recent ones" as easily as "show me the closest matches".

A new sort dropdown sits next to the search bar with three modes: Relevance (the default — highest similarity first), Newest, and Oldest. Switching modes transforms the result set in place without re-running the query.
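The in-place mode switch can be sketched as a pure re-sort over the already-fetched hits. The result shape and function name below are illustrative, not the app's actual API; the point is that no new query is issued when the dropdown changes.

```python
from datetime import datetime

# Hypothetical result shape: each hit carries its CLIP similarity
# score and the photo's capture date.
results = [
    {"id": "a", "score": 0.91, "taken_at": datetime(2020, 5, 3)},
    {"id": "b", "score": 0.88, "taken_at": datetime(2026, 1, 12)},
    {"id": "c", "score": 0.79, "taken_at": datetime(2024, 7, 30)},
]

def apply_sort(hits, mode):
    """Re-order an already-fetched result set; no new search query runs."""
    if mode == "relevance":
        return sorted(hits, key=lambda h: h["score"], reverse=True)
    if mode == "newest":
        return sorted(hits, key=lambda h: h["taken_at"], reverse=True)
    if mode == "oldest":
        return sorted(hits, key=lambda h: h["taken_at"])
    raise ValueError(f"unknown sort mode: {mode}")

newest_first = apply_sort(results, "newest")  # b (2026), c (2024), a (2020)
```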

Two-phase recall, then sort

Date-sorted results use a two-phase CTE: the first phase recalls the top 500 matches by vector similarity, the second re-orders that set chronologically. You still see only photos that match your query — just arranged by date instead of by how confident CLIP is in each match. It's one round trip, and pagination is stable across sort changes.
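The two-phase shape can be sketched in plain Python. This is a rough stand-in for the SQL CTE, with an assumed `taken_at` field and a caller-supplied distance function, not the real schema: phase one recalls a fixed-size window by similarity, phase two orders that window by date.

```python
import heapq
from datetime import date

RECALL_LIMIT = 500  # phase one: top-N matches by vector similarity

def search_sorted_by_date(photos, query_distance, newest_first=True):
    """Mirror of the two-phase CTE: recall by similarity, then
    re-order the recalled set chronologically.

    `query_distance(photo)` returns the cosine distance to the query
    embedding (smaller = more similar). Names are illustrative.
    """
    # Phase 1: recall the closest RECALL_LIMIT matches.
    recalled = heapq.nsmallest(RECALL_LIMIT, photos, key=query_distance)
    # Phase 2: sort that fixed set by capture date. Because the recall
    # window is the same regardless of sort mode, pagination offsets
    # stay stable when the user flips between Newest and Oldest.
    return sorted(recalled, key=lambda p: p["taken_at"], reverse=newest_first)

photos = [
    {"id": 1, "taken_at": date(2020, 5, 1), "dist": 0.3},
    {"id": 2, "taken_at": date(2026, 1, 1), "dist": 0.4},
    {"id": 3, "taken_at": date(2024, 1, 1), "dist": 1.5},
]
ordered = search_sorted_by_date(photos, lambda p: p["dist"])
# All three fit inside the recall window, so every match comes back,
# arranged newest-first: ids 2, 3, 1.
```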

[Screenshot: Smart search results sorted by newest, grouped into January 2026 and May 2020 date sections with infinite scroll]

Date groups and infinite scroll

When you pick Newest or Oldest, results are grouped by month — January 2026, May 2020, and so on — so the timeline is immediately readable. Infinite scroll kicks in as you approach the bottom of the list, so large result sets don't force you into pagination clicks.
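The month sections fall out of a simple bucketing pass over the date-sorted list. A minimal sketch, assuming hits carry a `taken_at` date and arrive already sorted in the chosen direction:

```python
from itertools import groupby
from datetime import date

def group_by_month(hits):
    """Bucket date-sorted results into sections like 'January 2026'.
    `hits` must already be sorted by `taken_at` (either direction),
    since groupby only merges adjacent items."""
    keyfn = lambda h: (h["taken_at"].year, h["taken_at"].month)
    return [
        (date(y, m, 1).strftime("%B %Y"), list(group))
        for (y, m), group in groupby(hits, key=keyfn)
    ]

hits = [
    {"id": "a", "taken_at": date(2026, 1, 12)},
    {"id": "b", "taken_at": date(2026, 1, 3)},
    {"id": "c", "taken_at": date(2020, 5, 9)},
]
sections = group_by_month(hits)
# Two sections: "January 2026" with two photos, "May 2020" with one.
```

Infinite scroll then just requests the next page of the same sorted set as you near the bottom; new hits append to the last section or open a new one.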

This works on the main timeline (/photos) and inside any Shared Space — the same search experience, same sort controls, same date grouping. Smart search on /photos can even reach into spaces you've pinned to your timeline, so a single query covers your entire library.

A relevance threshold for filtered searches

When you combine a text search with metadata filters — say, searching "forest" while filtering to a specific country — results sometimes include photos that match the filter but have almost nothing to do with the query. That happens because CLIP returns every photo in the filtered set, ranked by similarity. Even the worst matches still show up.

The new Max Search Distance admin setting adds a hard cutoff. Results above the configured cosine distance are excluded before pagination, so a search either returns relevant photos or returns nothing — never irrelevant ones padding out the grid.
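The cutoff itself is a pre-pagination filter. A minimal sketch, with a hypothetical `distance` field on each hit; the zero-means-disabled behavior matches the setting's documented default:

```python
def apply_distance_cutoff(hits, max_distance):
    """Drop hits whose cosine distance exceeds the threshold *before*
    pagination, so pages never get padded with irrelevant matches.
    A threshold of 0 means the cutoff is disabled (the default)."""
    if max_distance == 0:
        return hits
    return [h for h in hits if h["distance"] <= max_distance]

hits = [
    {"id": "forest trail", "distance": 0.62},
    {"id": "parking lot", "distance": 1.10},
]
apply_distance_cutoff(hits, 0.75)  # keeps only "forest trail"
apply_distance_cutoff(hits, 0)    # disabled: both hits pass through
```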

[Screenshot: Administration Smart Search panel showing the Max search distance field set to 0.75, with tuning description]

Tuning the threshold

The setting lives in Administration → Machine Learning → Smart Search. Values are cosine distances between 0 (identical) and 2 (opposite). 0.75 is a good starting point — strict enough to filter noise, permissive enough for abstract queries. Lower values (around 0.5) keep only strong visual matches; higher values (around 1.0) let weaker matches through for broad searches.
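For intuition on the 0-to-2 scale, cosine distance is just 1 minus cosine similarity. The toy vectors below show the three landmarks on that scale:

```python
import math

def cosine_distance(a, b):
    """1 - cosine similarity: 0 for identical directions,
    1 for orthogonal vectors, 2 for exact opposites."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return 1 - dot / norm

cosine_distance([1, 0], [1, 0])   # 0.0  (identical)
cosine_distance([1, 0], [0, 1])   # 1.0  (unrelated)
cosine_distance([1, 0], [-1, 0])  # 2.0  (opposite)
```

In practice CLIP embeddings never reach those extremes: real text-to-image distances cluster in a narrow band, which is why small threshold changes shift result counts so sharply.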

A couple of things worth knowing: CLIP embeddings tend to cluster tightly, so small threshold changes can have outsized effects on result counts — that's normal, not a bug. Different CLIP models produce different distance distributions, so if you switch models you'll likely want to retune. Text-to-image searches also tend to have looser distances than image-to-image searches; pick a threshold that works for text and it'll be permissive enough for everything else.

The default is 0 (disabled) so existing behavior is preserved until you opt in.

Read the full documentation on GitHub

See Search Sorting & Relevance in action or set up your own instance.