Should we have an optional, disabled-by-default limit on the recursion/iteration depth of recursive CTEs to guard against stupid queries that loop ad infinitum?
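To make the failure mode concrete, here's the classic runaway case: a recursive term that unconditionally produces a row, so evaluation never terminates on its own. (The GUC name shown in the comment is purely hypothetical, for illustration.)

```sql
-- A trivially unbounded recursive CTE: the recursive term always
-- emits another row, so this query never finishes.
WITH RECURSIVE t(n) AS (
    SELECT 1
    UNION ALL
    SELECT n + 1 FROM t
)
SELECT * FROM t;

-- With a depth-limiting GUC (name hypothetical), a DBA could cap it:
-- SET max_recursive_cte_depth = 10000;  -- 0 = disabled (the default)
```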
I've looked at other database systems that support WITH RECURSIVE queries, and this idea crops up there. For example, Firebird, the only other RDBMS that I cared to look at (for reasons you can perhaps guess), has a hard limit of 1024 - though you could argue that that's a limitation of their implementation, and I'd agree. Proprietary databases like SQL Server may well have similar, perhaps even optional or adjustable, limits; I haven't checked.

I'd suggest that an appropriate interface would be an int GUC with a GucContext of PGC_SUSET, so that DBAs can impose system-wide limits.

A possible use of such a GUC is to zero in on the actual recursion depth of the deepest rCTE in a given query by performing a "git bisect" style binary search, setting the GUC dynamically at each step. It's probably not worth building a proper interface for that, but I can imagine it being a useful trick in certain narrow situations.

We could also add a similar GUC that can be set separately by unprivileged users, independently limiting the recursion depth per session. This could serve as a sort of assertion of the maximum recursion depth of a given query.

-- 
Peter Geoghegan
http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Training and Services
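P.S. The bisection trick above is easy to script. A minimal sketch in Python, where `query_completes(limit)` is a stand-in for setting the (hypothetical) GUC to `limit`, re-running the query, and reporting whether it finished without hitting the cap:

```python
def find_max_depth(query_completes, hi=1 << 20):
    """Binary-search for the deepest recursion a query actually reaches.

    query_completes(limit) stands in for:
        SET max_recursive_cte_depth = <limit>;  -- hypothetical GUC name
        <run the query>
    returning True iff the query finished without hitting the limit.
    """
    if not query_completes(hi):
        raise ValueError("initial upper bound too small")
    lo = 0  # invariant: a limit of lo fails, a limit of hi succeeds
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if query_completes(mid):
            hi = mid
        else:
            lo = mid
    return hi  # smallest limit that succeeds == actual recursion depth

# Usage, with a fake query that recurses to depth 137:
print(find_max_depth(lambda limit: limit >= 137))  # → 137
```

Each probe re-runs the whole query, so this only pays off when you actually need the exact depth rather than an eyeball estimate from EXPLAIN.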