> Take hachoir-core as an example. The home URL was changed in
> 1.2.1 to:
>
> "http://bitbucket.org/haypo/hachoir/wiki/hachoir-core"
>
> The 1.2 version home URL was:
>
> "http://bitbucket.org/haypo/hachoir/wiki/hachoir-core"
>
> But the PyPI simple API will keep track of both:
>
> http://pypi.python.org/simple/hachoir-core
>
> This leads to the problem described (because the script visits all
> URLs before it decides which tarball to pick).
>
> So what should the maintainer do?
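To make the failure mode concrete, here is a minimal sketch of the link-scraping step such a script performs against the simple index. The HTML excerpt and the hachoir.org home-page URL are hypothetical illustrations; only the simple-index and bitbucket URLs appear in the message above.

```python
from html.parser import HTMLParser

# Hypothetical excerpt of the kind of page served at
# http://pypi.python.org/simple/hachoir-core: the simple index lists
# download *and* home-page links for every release, old and new.
# The hachoir.org URL below is an assumed stand-in for the stale
# 1.2 home page.
SIMPLE_INDEX_HTML = """
<html><body>
<a href="http://pypi.python.org/packages/source/h/hachoir-core/hachoir-core-1.2.tar.gz">hachoir-core-1.2.tar.gz</a>
<a href="http://hachoir.org/wiki/hachoir-core">1.2 home_page</a>
<a href="http://bitbucket.org/haypo/hachoir/wiki/hachoir-core">1.2.1 home_page</a>
</body></html>
"""

class LinkCollector(HTMLParser):
    """Collect every href on the page, roughly the way an installer's
    link scraper does before choosing a tarball."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

collector = LinkCollector()
collector.feed(SIMPLE_INDEX_HTML)

# Every collected link is a candidate the installer may visit --
# including the dead home page, where a hanging connection (rather than
# a fast "connection refused") stalls the whole install.
for url in collector.links:
    print(url)
```

Since the index keeps links for all releases, a single dead host among them is enough to stall the scrape, which is exactly the situation described above.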
One option would be to delete releases that point to versions which no
longer work. Another would be to make sure that, even though the URLs no
longer point to actual data, they still fail quickly with a meaningful
error message (such as "connection refused"). In this very specific
case, that would mean deleting the DNS entry for hachoir.org, or
pointing it to, say, 127.0.0.1.

> Recreate a new version of an old release so the old URL is removed
> from PyPI? Just register the metadata, knowing that the one contained
> in the tarball is not the same?

The latter would also fix this problem in a convenient way, ISTM.

> Maybe the solution would be to add in that page only the latest home
> URL link.

That would solve this specific problem. I'm not so sure it would solve
the general problem: what if some package *wishes* to provide different
homepage URLs for different releases, and has them all working
simultaneously?

Regards,
Martin

_______________________________________________
Catalog-SIG mailing list
Catalog-SIG@python.org
http://mail.python.org/mailman/listinfo/catalog-sig