Let's not forget how the various *count* calls start to kill your database 
once you get past 1 million rows (on Postgres, at least).

So far the only options I have found with Postgres are:
- Estimate the count for unfiltered queries: SELECT reltuples::BIGINT FROM 
pg_class WHERE relname = '%s';
- For filtered queries, replace the count with a subquery that first limits 
the results to a reasonable number (such as 1k). This is silly and won't 
let you page through all the results, but at least the count call won't 
kill your database. It is only useful when the filtered query also returns 
over a million rows.
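A minimal sketch of those two workarounds as plain SQL builders (the function names are mine, and the strings would be run through a DB-API cursor with proper parameterization in real code):

```python
def estimated_count_sql(table_name: str) -> tuple[str, tuple]:
    """Fast, approximate count from the planner's statistics.

    Only meaningful for an unfiltered query, and only as accurate as the
    table's last ANALYZE.
    """
    return (
        "SELECT reltuples::BIGINT FROM pg_class WHERE relname = %s",
        (table_name,),
    )


def capped_count_sql(filtered_query: str, cap: int = 1000) -> str:
    """Wrap a filtered query so COUNT(*) scans at most `cap` rows.

    The reported count saturates at `cap`, so the UI can only say
    "1000+" instead of an exact total, but the query stays cheap.
    """
    return f"SELECT COUNT(*) FROM ({filtered_query} LIMIT {cap}) AS capped"
```

With a Django Paginator you would override its `count` property to issue one of these instead of the default `COUNT(*)`.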

On Wednesday, December 5, 2018 at 7:15:22 (UTC-5), Saleem Jaffer 
> Hi all,
> The default paginator that comes with Django is inefficient when dealing 
> with large tables. This is because the final query for fetching pages uses 
> "OFFSET" which is basically a linear scan till the last index of the 
> current page. Does it make sense to have a better paginator which does not 
> use "OFFSET". 
> If this sounds like a good idea, I have some ideas on how to do it and 
> with some help from you guys I can implement it.
> Saleem

