And my answer is yes, you can get a bunch of them. Whether or not you  
can get ALL of them, though, will depend on a couple of things.  
Namely, Google's servers will only allow you to get up to 100 entries  
from a feed, current or otherwise. So if the feed you want to  
download has had more than 100 entries over its lifetime, or if some of  
them disappeared before FeedFetcher crawled it, you're out of luck.
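To make that concrete, here's a small sketch of the kind of request I mean, using wget against the AJAX Feed API's load endpoint with the &scoring=h and num parameters from the docs. The feed URL is a hypothetical placeholder (substitute your own, URL-encoded if it contains special characters), and the script just prints the wget command so you can inspect it before running it:

```shell
#!/bin/sh
# Build a request against the Google AJAX Feed API "load" endpoint.
# scoring=h asks for historical (cached) entries; num is capped at 100
# by Google's servers, per the limit described above.
FEED="http://example.com/feed.xml"   # hypothetical feed URL -- substitute your own
URL="http://ajax.googleapis.com/ajax/services/feed/load?v=1.0&num=100&scoring=h&q=${FEED}"

# Print the command rather than running it, so you can check the URL first.
# The response comes back as JSON, which you can then parse from perl.
echo wget -O entries.json "${URL}"
```

Run the printed command yourself once it looks right; entries.json will hold the JSON response with up to 100 cached entries.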

Jeremy R. Geerdes
Effective website design & development
Des Moines, IA

For more information or a project quote:
http://jgeerdes.home.mchsi.com
http://jgeerdes.blogspot.com
http://jgeerdes.wordpress.com
[email protected]

Unless otherwise noted, any price quotes contained within this  
communication are given in US dollars.

If you're in the Des Moines, IA, area, check out Debra Heights  
Wesleyan Church!

And check out my blog, Adventures in Web Development, at  
http://jgeerdes.blogspot.com !


On Mar 11, 2009, at 11:20 AM, fij wrote:

>
> Thanks, Jeremy.
>
> Sorry, my question was not precise enough.
>
> I subscribed to a feed with Google Reader. With some scripting (perl)
> and wget can I download the aggregated contents (now + past) of this
> feed to my Linux computer from Google ?
>
> Thank you,
> Illes
>
> On Mar 7, 10:59 pm, Jeremy Geerdes <[email protected]> wrote:
>> The answer to your broader question, how can Google Reader do this,  
>> is
>> simply that Google caches the feeds and their entries.
>>
>> You can get a bunch of these entries using the AJAX Feeds API by
>> including &scoring=h in the url that you request with wget.  Whether
>> or not you can get all of them will depend on the sheer number of
>> entries that are involved.  Presently, there is a limit of 100  
>> entries
>> returned via the API.  Links to documentation are below:
>>
>> http://code.google.com/apis/ajaxfeeds/documentation#fonje
>> http://code.google.com/apis/ajaxfeeds/documentation/reference.html#_i...
>>
>> Jeremy R. Geerdes
>> Effective website design & development
>> Des Moines, IA
>>
>> For more information or a project quote:
>> http://jgeerdes.home.mchsi.com
>> http://jgeerdes.blogspot.com
>> http://jgeerdes.wordpress.com
>> [email protected]
>>
>> Unless otherwise noted, any price quotes contained within this
>> communication are given in US dollars.
>>
>> If you're in the Des Moines, IA, area, check out Debra Heights
>> Wesleyan Church!
>>
>> And check out my blog, Adventures in Web Development, at  
>> http://jgeerdes.blogspot.com !
>>
>> On Mar 7, 2009, at 12:41 PM, fij wrote:
>>
>>
>>
>>> Hi,
>>
>>> Could you please help me find the right group for this question?
>>
>>> I'd like to download with wget an entire feed, not just the most
>>> recent part of it. As I scroll down in a feed with Google Reader, it
>>> can add older parts of the feed again and again, and I can go as far
>>> as the first item of the feed that appeared years ago. Do you happen
>>> to know how Google Reader does this?
>>
>>> I've just looked at some feeds downloaded with wget. Is there a
>>> "continue" or "next page" URL at the bottom of the feed?
>>
>>> Thanks.


--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Google AJAX APIs" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to 
[email protected]
For more options, visit this group at 
http://groups.google.com/group/Google-AJAX-Search-API?hl=en
-~----------~----~----~----~------~----~------~--~---
