Filtering searches, row size limitations
I'm currently using filters to build an exact text search, but I noticed that there are row size limitations and that large rows (>10 KB) are being silently excluded.
I'm attempting to filter documents, which can be pretty big (hundreds of KB), and I'm wondering if Xata is a good use case for this or not.
I think we only care about exact keyword search sorted by date and not BM25 scoring.
With Search, filters only work on string-type columns, which are limited to 2048 characters, see the column limits: https://xata.io/docs/rest-api/limits#column-limits. So the Search store is not a good fit for filtering lengthy documents like these.
You could, however, try the Postgres-based "query" endpoints (or the wire protocol), which have less restrictive limits.
With the REST API (TS or Python SDK), you can use contains/icontains filters on the query endpoint for the "search" part (https://xata.io/docs/sdk/filtering#case-insensitive-matching), and exact-match filtering as described at https://xata.io/docs/sdk/filtering#exact-matching. Note that exact matching is not truly unlimited: the applicable limit is 600 KB, which is the maximum text column length.
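To make the query-endpoint approach concrete, here is a minimal sketch of the filter payloads involved. The table and column names (`docs`, `body`) are made up for illustration, and the exact sort key for the creation timestamp may differ depending on your schema, so treat this as a shape reference rather than a drop-in snippet.

```python
# Hypothetical filter payloads for the Xata query endpoint (not Search).
# "body" is an illustrative long-text column; adjust names to your schema.

# Case-insensitive substring match ("search" part):
contains_payload = {
    "filter": {"body": {"$iContains": "invoice 42"}},
}

# Exact keyword match, sorted by a creation-date column (no BM25 scoring).
# Works up to the 600 KB text column limit:
exact_payload = {
    "filter": {"body": "the full document text to match exactly"},
    "sort": {"created_at": "desc"},
}
```

Both payloads would be sent to the table's `/query` endpoint (or passed as the `filter`/`sort` arguments of the SDK's query methods), which runs against the Postgres store rather than the Search store, so the 10 KB row exclusion does not apply.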
With SQL queries over the wire protocol, text columns created via DDL have no length limitations at all.
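For the wire-protocol route, the SQL itself might look like the sketch below. The table and column names are again illustrative, and the query uses plain `ILIKE` for case-insensitive matching plus a date sort, matching the "exact keyword search sorted by date" requirement; you would run these statements through any Postgres client connected to your database's wire-protocol endpoint.

```python
# Illustrative SQL for the wire-protocol approach; names are assumptions.
# A text column created via DDL has no length cap, unlike SDK-created columns.
ddl = """
CREATE TABLE docs (
  id         text PRIMARY KEY,
  body       text,                          -- unbounded text column
  created_at timestamptz DEFAULT now()
);
"""

# Case-insensitive substring match, newest first. The %s placeholder is the
# parameter style used by Postgres drivers such as psycopg2.
query = """
SELECT id, created_at
FROM docs
WHERE body ILIKE '%%' || %s || '%%'
ORDER BY created_at DESC;
"""
```

Because the filtering happens in Postgres, documents of hundreds of KB are searchable this way, at the cost of a sequential scan unless you add an index (e.g. a trigram index) suited to substring matching.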