Can we enforce the maximum number of rows returned? #2715
-
Is there a way to enforce a LIMIT / maximum number of rows returned to the user? For example, never return more than 100 rows, regardless of what the user specifies.

There seems to be no way to implement this PostgreSQL-side. Triggers don't affect SELECT, and as far as I can tell, an RLS policy can't limit the number of rows of a single SELECT either. A view with a LIMIT clause applies the limit before the filters are applied.

I have one particularly large materialized view ("mvw") containing hundreds of thousands of first and last names, together with some metadata. The user should get about 20 rows of randomized names by specifying the type of names (e.g. 20 French male first names).

My first try was to disallow access to the mvw and create an accessible view with a random order and a LIMIT clause. However, this won't work reliably, because the filters are applied only after the limiting has already taken place. For example, this view:

Then I thought I could get around this by instead using a function that does the query. But it turns out that the user needs access to the mvw even when it has EXECUTE access on the function. Surprisingly, this was not required when using view-on-view.

OK, so this might be a bit of an edge case, but this "force-limiting" might still be a very useful addition to PostgREST. It could act as another layer of protection against vacuuming whole databases, especially for SPAs, where the code runs largely client-side.
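The view definition referenced above didn't survive the export; here is a hedged sketch of the two approaches described. The schema, table, column, and parameter names are assumptions for illustration, not the original code:

```sql
-- Sketch 1: view with randomized order and a LIMIT.
-- PostgREST filters (e.g. ?language=eq.french) are applied on top of
-- this view, i.e. AFTER the 20 random rows have already been picked,
-- so most requests get far fewer (often zero) matching rows back.
CREATE VIEW api.random_names AS
SELECT name, name_type, language
FROM private.mvw
ORDER BY random()
LIMIT 20;

-- Sketch 2: a function that filters first, then limits. As noted
-- above, the caller still needs SELECT on private.mvw in addition to
-- EXECUTE on the function (unless the function is SECURITY DEFINER).
CREATE FUNCTION api.random_names(p_type text, p_lang text)
RETURNS TABLE (name text)
LANGUAGE sql STABLE AS $$
  SELECT m.name
  FROM private.mvw m
  WHERE m.name_type = p_type
    AND m.language  = p_lang
  ORDER BY random()
  LIMIT 20;
$$;
```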
Replies: 1 comment 2 replies
-
Hi, I think the `db-max-rows` config parameter is what you're looking for?
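For reference, a minimal sketch of that setting in a PostgREST config file (it can also be supplied via the `PGRST_DB_MAX_ROWS` environment variable); the value 100 is just an example:

```ini
# postgrest.conf -- cap every response at 100 rows,
# regardless of any client-supplied limit or offset
db-max-rows = 100
```

Note that `db-max-rows` applies globally to all endpoints, so it caps row counts as a safety net but doesn't by itself fix the filter-after-limit problem with the randomized view described in the question.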