Sorry for the duplicate; I accidentally hit send before I was done. Finished now.

On Aug 3, 2017 1:08 AM, "guettli" <> wrote:

First I asked a similar question on the postgresql-general list. The
discussion[1] has settled there.

Now I would love to hear what you think.

I am thinking about rewriting an existing application which uses
PostgreSQL via Django.

Up to now the permission checks are done at the application level.

Up to now, queries like "show all items which the current user is
allowed to modify" result in complicated SQL, and this leads to slow
queries.

Up to now there is one DB user, and the application filters rows to
prevent application users from seeing items they are not allowed to see.

I guess most web applications work like this.

I would like to reduce the "ifing and elsing" in my Python code (fewer
conditions, fewer bugs, more SQL, more performance).
One important goal for me: I would like to avoid redundancy. As soon
as I want to query for "show all items which the current user is
allowed to modify", I need the permission check in a SQL WHERE
condition.

If I implement this, then code like the following becomes redundant:


def has_perm(obj, user):
    if user.is_superuser:
        return True
    # ... further per-object checks follow ...
    return False
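
To make that redundancy concrete, here is a minimal self-contained
sketch (plain Python with hypothetical User/Item classes, no Django):
the same ownership rule has to live once in has_perm and again in the
filter that answers "all items the user may modify".

```python
from dataclasses import dataclass

@dataclass
class User:
    id: int
    is_superuser: bool = False

@dataclass
class Item:
    owner_id: int

def has_perm(obj, user):
    # Per-object check: superusers may modify anything,
    # everyone else only their own items.
    if user.is_superuser:
        return True
    return obj.owner_id == user.id

def modifiable_items(items, user):
    # The *same* rule again, as a list filter -- in a real application
    # this would be a SQL WHERE clause or a Django QuerySet filter,
    # which is exactly the duplication being complained about.
    if user.is_superuser:
        return list(items)
    return [obj for obj in items if obj.owner_id == user.id]
```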


Yes, I feel farewell pain. I love Python, but I guess I will use perm
checking via SQL WHERE for new projects in the future.

What do you think?

   Thomas Güttler


I thumbed through the threads on the PG list, and the best summary of the
answers you received is:

A. Yes, you can probably do that, but it would likely be extremely
complicated, and Django won't necessarily make it easy.

B. The "problem" you are trying to solve is more of a "preference" with a
questionable motive of speed increases.

There wasn't a great amount of support for what you are trying to do, and I
doubt you'll find much on this list.

The DB should not be responsible for business logic (in this case,
authorization rules). DBs are only good at storing, searching, and
returning data. Data doesn't care who you are, nor does the DBMS past
the table level in most cases.

There are several other points to consider.

- By moving this logic into the DB, you are almost certainly
vendor-locking yourself. If your project is not distributed to others,
that may not be an issue. If it is, abstract away the function calls as
much as possible to build an internal Python API. If you need to change
the underlying database functionality later, your application code
should remain relatively untouched, and you can send the bundle of
money in labor you saved to me. ;-)
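
As a sketch of that abstraction (the function name and schema are
hypothetical, and sqlite3 stands in for the real backend): all
vendor-specific SQL lives in one function, so callers never change when
the backend does.

```python
import sqlite3

def modifiable_item_ids(conn, user_id):
    # Hypothetical internal API: the only place that knows how the
    # permission rule is expressed in SQL. Swap this body for a
    # PostgreSQL row-level-security query later without touching
    # any calling code.
    rows = conn.execute(
        "SELECT id FROM items WHERE owner_id = ?", (user_id,)
    )
    return [row[0] for row in rows]
```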

- You'll almost certainly take a performance hit when connecting to the
database. If each connection to the DB is using a different user, then you
likely cannot take advantage of things like DB connection pooling. Every
request would require that a connection be built, utilized, and then torn
down. Those operations have a cost in both time and resources. Under load,
that cost will likely be noticeable.
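
A toy illustration of why per-user DB roles hurt pooling (a sketch, not
how any real pooler is implemented): connections are only reusable
among requests with the same credentials, so N distinct roles means N
separate idle sets, each paying its own setup cost.

```python
class NaivePool:
    # Toy connection pool keyed by role: a connection opened as one
    # role cannot be handed to a request running as another role.
    def __init__(self, connect):
        self._connect = connect
        self._idle = {}  # role -> list of idle connections

    def acquire(self, role):
        idle = self._idle.setdefault(role, [])
        if idle:
            return idle.pop()       # cheap: reuse within the same role
        return self._connect(role)  # expensive: full connection setup

    def release(self, role, conn):
        self._idle[role].append(conn)
```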

- It likely won't scale well. Assuming you have a fair number of unique
users with concurrent connections (depending on the resources available
and tuning, this could be as few as a few dozen), your DB now has to
manage at least one connection per user. DBs may or may not fare well
under those conditions. I haven't tried that model, though, so I may be
mistaken. Managing dozens, hundreds, or thousands of threads/processes
in the DB all attacking the same data set does not sound desirable,
especially with locking and/or transactions in the mix.

- It will likely be very rigid. DBs are not necessarily good at what
you are trying to do, and may lack features. Any set of rules you
establish will probably be more basic than you are anticipating/hoping,
or will be near impossible to modify once in production. Again, this is
a guess.

- Performance gains may be marginal, especially weighed against the
amount of complexity introduced. Your claim that the DB queries are
"slow" is probably somewhat accurate due to the generic foreign key
(GFK) usage by the contrib.auth package. However, GFKs provide a great
deal of flexibility, which usually offsets the marginal performance hit
they introduce. I'd rather run two reasonably fast queries than rely on
a custom-rolled set of triggers, DB views, stored procedures, and black
magic offered by the DB.

- Maintenance and sync will be a nightmare. Users who are disabled,
modified, or deleted pose a risk of sync issues, as you now have to
keep that data in sync in two locations (e.g., the user's password).

Your argument that "if/elif" statements are undesirable for auth is
likely not based on any decent production system, though granted they
can get very hairy in complex scenarios. I can chalk that up to
personal preference, but I think it unwise to discard an awesome tool
in your chest in lieu of a hard dependency on the DB.

I would suggest that rather than looking to optimize at the DB level, start
with rolling your own authentication/authorization system instead of
pushing the work out. Your queries can be tweaked and tuned to keep them
simple and fast without having to track things in the DB.
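
For example (hypothetical schema, with sqlite3 standing in for the real
DB), a permission query can often be split into two simple, indexable
steps instead of one deeply nested statement:

```python
import sqlite3

def items_user_can_modify(conn, user_id):
    # Step 1: one simple query for the user's group memberships.
    group_ids = [r[0] for r in conn.execute(
        "SELECT group_id FROM memberships WHERE user_id = ?", (user_id,))]
    # Step 2: one simple query for items the user owns or that are
    # shared with one of those groups. Both queries stay index-friendly.
    placeholders = ",".join("?" * len(group_ids)) or "NULL"
    return [r[0] for r in conn.execute(
        f"SELECT id FROM items WHERE owner_id = ? "
        f"OR group_id IN ({placeholders})",
        [user_id, *group_ids])]
```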


You received this message because you are subscribed to the Google Groups 
"Django users" group.