Hi all
I just added a cache decorator to a method that looks like this:

    @beaker_cache(expire=600, type='memory',
                  key=["limit", "category_id", "company_id"])
    def recent_products(self, limit, category_id=None, company_id=None):
        query = Product.query.select_from(
            Product.table.join(Company.table)
        ).filter(and_(
            Company.status == Company.STATUS_ACTIVE,
            Product.status == Product.STATUS_ON_DISPLAY,
            Company.content_quality > 0.4,
        ))
        if category_id:
            query = query.filter_by(category_id=category_id)
        if company_id:
            query = query.filter_by(company_id=company_id)
        query = query.order_by(Product.date_created.desc())
        return query.limit(limit)

After some time, I get:

    File "/usr/lib/python2.5/site-packages/SQLAlchemy-0.4.5-py2.5.egg/sqlalchemy/pool.py", line 587, in do_get
        raise exceptions.TimeoutError("QueuePool limit of size %d overflow %d reached, connection timed out, timeout %d" % (self.size(), self.overflow(), self._timeout))
    TimeoutError: QueuePool limit of size 5 overflow 20 reached, connection timed out, timeout 30

What is the root of the problem here? Is it that I am caching the Query
object rather than the resulting objects, so that once the query gets
detached from its session it behaves in a way that never closes its
connection? Maybe an exception should be raised for this kind of
incorrect use?
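To make the hypothesis concrete, here is a plain-Python analogy (no SQLAlchemy involved; the `Pool`, `PoolExhausted`, and `lazy_query` names below are made up purely for illustration): a lazy object that only releases its connection once it is fully consumed will exhaust a small pool if half-consumed objects are kept alive in a cache.

```python
class PoolExhausted(Exception):
    pass

class Pool(object):
    # Toy connection pool: at most `size` connections checked out at once.
    def __init__(self, size):
        self.size = size
        self.in_use = 0
    def connect(self):
        if self.in_use >= self.size:
            raise PoolExhausted("pool limit of size %d reached" % self.size)
        self.in_use += 1
    def release(self):
        self.in_use -= 1

def lazy_query(pool, rows):
    # Like a lazy Query: nothing happens until iteration starts, and the
    # connection is only released when the iterator is fully exhausted.
    pool.connect()
    for row in rows:
        yield row
    pool.release()

pool = Pool(size=5)

# Caching the lazy object: each partially consumed generator keeps its
# connection checked out, so the pool eventually runs dry.
cached = []
for i in range(5):
    g = lazy_query(pool, [1, 2, 3])
    next(g)           # starts executing, but never finishes iterating
    cached.append(g)  # "cached" somewhere, connection still held

try:
    next(lazy_query(pool, [1, 2, 3]))
except PoolExhausted as exc:
    print(exc)  # the pool is exhausted, much like the traceback above

# Fully consuming the results instead releases each connection right
# away, so the pool never fills up.
pool2 = Pool(size=5)
for i in range(100):
    results = list(lazy_query(pool2, [1, 2, 3]))
assert pool2.in_use == 0
```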

Also, if anyone knows best practices for caching with Elixir, I would be
glad to hear about them.
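For what it is worth, the pattern I have in mind by "caching the objects, not the query" looks roughly like the toy decorator below (plain Python; `cache_results` is made up for illustration and is not Beaker's actual API, and `recent_products` here is just a stand-in with a simplified signature):

```python
import functools

def cache_results(func):
    # Toy stand-in for @beaker_cache (illustration only): it memoizes by
    # positional args, but stores the *materialized* result list rather
    # than whatever lazy object func returns.
    cache = {}
    @functools.wraps(func)
    def wrapper(*args):
        if args not in cache:
            # Force execution here, while the session/connection is live.
            cache[args] = list(func(*args))
        return cache[args]
    return wrapper

calls = []

@cache_results
def recent_products(limit):
    # Stand-in for the lazy query; `calls` records actual executions.
    calls.append(limit)
    return iter(range(limit))

first = recent_products(3)
second = recent_products(3)
assert first == [0, 1, 2]   # a plain list, safe to cache
assert second is first      # served from the cache
assert calls == [3]         # the underlying "query" ran exactly once
```

Applied to the method above, I assume this would mean returning `query.limit(limit).all()` so that a plain list of `Product` instances is cached and the connection goes back to the pool immediately, but I have not verified that this plays well with `@beaker_cache`.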

Thanks in advance for any help on this topic.

m
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"SQLElixir" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/sqlelixir?hl=en
-~----------~----~----~----~------~----~------~--~---
