Thanks Laurence, but that would mean that eggs which are not hosted on the allowed hosts would not be retrievable. That behaviour would not work for us, because our eggproxy cache is used by many projects that need many dependencies, which may be hosted on a variety of hosts.
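One way to avoid the repeated index scans without restricting hosts would be a time-based throttle on refreshes, as suggested in my original message. A rough, untested sketch (the helper name `should_refresh` and the interval are illustrative, not actual collective.eggproxy API):

```python
import time

# Hypothetical throttle: skip re-scanning an index page if it was
# scanned less than REFRESH_INTERVAL seconds ago.
REFRESH_INTERVAL = 300  # seconds; tune to taste

_last_scan = {}  # index URL -> timestamp of last scan


def should_refresh(url, now=None):
    """Return True if `url` has not been scanned in the last
    REFRESH_INTERVAL seconds, recording the scan time if so."""
    now = time.time() if now is None else now
    last = _last_scan.get(url)
    if last is not None and now - last < REFRESH_INTERVAL:
        return False
    _last_scan[url] = now
    return True
```

The eggproxy could then consult `should_refresh` before triggering a find_packages scan, so a burst of requests for the same index (as in the configobj case) only causes one scan per interval.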
Cheers,
Bruno

On 23 May 2011 17:31, Laurence Rowe <[email protected]> wrote:
>
> Bruno Binet wrote:
>>
>> Hi,
>>
>> I've set up collective.eggproxy to act as a cache for eggs, configured
>> with always_refresh=1 to force updating package indexes from pypi so
>> they're always up to date. My problem is that when I try to install
>> eggs from this collective.eggproxy server, I get stuck with a timeout
>> error, especially for the egg "configobj".
>> I extracted the relevant code from collective.eggproxy, and
>> the problem can easily be reproduced with this simple test case:
>>
>> from pkg_resources import Requirement
>> import collective.eggproxy.utils as utils
>> utils.ALWAYS_REFRESH = True
>> requirement = Requirement.parse('configobj')
>> index = utils.PackageIndex('http://pypi.python.org/simple')
>> index.find_packages(requirement)
>>
>> Running this example will show the following:
>>
>> (eggproxy)bbinet@bbinet: ~/dev $ python packageindex.py
>> Page at http://pypi.python.org/simple/configobj/ links to .py file(s)
>> without version info; an index scan is required.
>> [...]
>> Page at http://pypi.python.org/simple/configobj/ links to .py file(s)
>> without version info; an index scan is required.
>> (eggproxy)bbinet@bbinet: ~/dev $
>>
>> As you can see, eggproxy will scan the configobj index multiple times,
>> which causes the timeout.
>> After digging in the code, I've found that the problem comes from
>> download urls of this form:
>> http://www.voidspace.org.uk/cgi-bin/voidspace/downman.py?file=configobj-4.4.0.zip
>> The urls ending with ".py" are interpreted as special urls by
>> setuptools/distribute and require scanning the index again, but I
>> can't say why.
>>
>> Any idea how we can fix this in collective.eggproxy?
>> I've thought of introducing a short period during which we won't scan
>> the same index again, instead of always refreshing.
>>
>
> In my buildouts I tend to add a line like:
>
> allow-hosts =
>     *.python.org
>     *.plone.org
>
> plus a few exceptions to prevent problems with random third party servers
> going down. Perhaps you can do the same with collective.eggproxy.
>
> Laurence
>
> --
> View this message in context:
> http://plone.293351.n2.nabble.com/collective-eggproxy-timeout-issue-tp6394847p6394939.html
> Sent from the Product Developers mailing list archive at Nabble.com.
>
> _______________________________________________
> Product-Developers mailing list
> [email protected]
> https://lists.plone.org/mailman/listinfo/plone-product-developers

--
Bruno Binet
Camptocamp France SAS
Savoie Technolac, BP 352
73377 Le Bourget du Lac, Cedex
Mail : [email protected]
http://www.camptocamp.com

_______________________________________________
Product-Developers mailing list
[email protected]
https://lists.plone.org/mailman/listinfo/plone-product-developers
