[issue12526] packaging.pypi.Crawler and resulting objects have a circular API

2014-03-12 Thread Éric Araujo

Changes by Éric Araujo:


--
resolution:  -> out of date
stage:  -> committed/rejected
status: open -> closed

___
Python tracker 

___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue12526] packaging.pypi.Crawler and resulting objects have a circular API

2011-07-15 Thread Éric Araujo

Éric Araujo added the comment:

Thanks for the report.  I noticed similar strangeness in the API when working 
on the documentation.  Alexis is quite busy these weeks, but he will no doubt 
comment on this later.

I’m not sure the force argument is a good idea; I think we should ask ourselves 
what is the behavior that would be most intuitive for users, and implement 
that.  If there are performance or caching issues, we’ll see.

--


[issue12526] packaging.pypi.Crawler and resulting objects have a circular API

2011-07-10 Thread Michael Mulich

New submission from Michael Mulich:

The issue, as best I can describe it, is in how a release list 
(packaging.pypi.dist.ReleaseList) looks up releases.

Here is a simple example using a random package on PyPI.

>>> crawler = Crawler()
>>> projects = crawler.search_projects('snimpy')
>>> projects
[]
>>> project = projects[0]
>>> [x for x in project]
[]

The results show that project 'snimpy' has no releases, but this is incorrect 
because distribution 'snimpy' has five releases.

Even after calling sort_releases and fetch_releases on the project, both of 
which refer back to the crawler instance (see the project's _index attribute), 
the project still fails to get the releases.

>>> project.fetch_releases()
[]
>>> project.sort_releases()
>>> [x for x in project]
[]

In order to get the releases, one is forced to use the crawler's
API rather than the resulting project's API.

>>> crawler.get_releases(project.name, force_update=True)

>>> [x for x in project]
[, , , , ]

So as far as I can gather, we lack the ability to forcibly update the project 
(or ReleaseList). I don't have a solution at this time, but we may want to look 
into adding a force_update argument to the get_releases method on the Crawler.
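To make the idea concrete, here is a minimal toy sketch of what that could look like. These are not the real packaging.pypi classes; the names, constructor signatures, and caching behavior are assumptions for illustration only. The point is that the project's fetch_releases could accept and forward a force_update flag to the crawler it already holds via its back-reference, so callers never have to reach around the project to the crawler's API.

```python
# Hypothetical sketch, NOT the real packaging.pypi implementation.
# Illustrates forwarding a force_update flag from the project object
# to the crawler it references, refreshing a stale cached release list.

class Project:
    def __init__(self, name, index):
        self.name = name
        self._index = index    # back-reference to the crawler (cf. _index)
        self._releases = []    # cached releases; may be stale or empty

    def fetch_releases(self, force_update=False):
        # Delegate to the crawler instead of forcing callers to use it directly.
        self._releases = self._index.get_releases(
            self.name, force_update=force_update)
        return self._releases

    def __iter__(self):
        return iter(self._releases)


class Crawler:
    def __init__(self, server_data):
        self._server = server_data  # stands in for real PyPI queries
        self._cache = {}

    def get_releases(self, name, force_update=False):
        # Only hit the "server" when uncached or explicitly forced.
        if force_update or name not in self._cache:
            self._cache[name] = list(self._server.get(name, []))
        return self._cache[name]


crawler = Crawler({"snimpy": ["0.1", "0.2"]})
project = Project("snimpy", crawler)
print(list(project))                          # stale cache: []
print(project.fetch_releases(force_update=True))  # ['0.1', '0.2']
```

With this shape, the workaround of calling crawler.get_releases(project.name, force_update=True) directly becomes unnecessary, since the project can trigger the refresh itself.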

--
assignee: tarek
components: Distutils2
messages: 140083
nosy: alexis, eric.araujo, michael.mulich, tarek
priority: normal
severity: normal
status: open
title: packaging.pypi.Crawler and resulting objects have a circular API
type: behavior
versions: Python 3.3
