> So I've just had a setback for my feedreader, it seems I can't fetch
> feeds from feedburner with curl anymore, trying
> curl http://feeds.feedburner.com/codinghorror
> results in nothing, while for instance the following works:
> curl http://penguinpetes.com/b2evo/xmlsrv/rss2.php?blog=1
The feed has been moved:
to...@bela:~$ curl -i http://feeds.feedburner.com/codinghorror
HTTP/1.0 302 Moved Temporarily
Date: Sat, 04 Apr 2009 12:55:25 GMT
P3P: CP="ALL DSP COR NID CUR OUR NOR"
Keep-Alive: timeout=30, max=100
Content-Type: text/plain; charset=UTF-8
So you can try the new URL: http://feeds2.feedburner.com/codinghorror
> However doing wget -O - http://feeds.feedburner.com/codinghorror is
> not a problem.
wget follows redirects by default; curl does not unless you pass the -L
(--location) flag, e.g. curl -L http://feeds.feedburner.com/codinghorror
> The above wget options will output to stdout instead of writing to a
> file, my basic problem here is that I don't know enough about basic
> Linux functions, I'm trying the following with no success:
> (pipe (out (list 'wget '-O '-
> 'http://feeds.feedburner.com/codinghorror)) (till))
: (in (list "wget" "-q" "-O" "-" "http://feeds.feedburner.com/codinghorror")
   (echo) )  # 'echo' copies the pipe's contents to the current output channel
The trouble here is that the feed is not UTF-8 but ISO-8859-1:
<?xml version="1.0" encoding="iso-8859-1"?>
Not sure how Alex handles things like this in a standard PicoLisp way,
but I guess you'll need to convert the Latin-1 data to UTF-8 externally
(use sh -c "wget ... | lat1-to-utf8") before reading it into PicoLisp.
Here lat1-to-utf8 would be an external program doing the conversion
(something like it may already be in the PicoLisp distribution; I saw
some tools that looked like that).
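For the conversion step, the standard iconv tool should do, and is
presumably what a lat1-to-utf8 helper would wrap. A small sketch (the
feeds2 URL is the redirect target from above; the pipeline itself is
untested against the live feed):

```shell
# Recode a single Latin-1 byte (0xE9, 'é' in ISO-8859-1) to UTF-8,
# just to show iconv's direction flags:
printf '\351' | iconv -f ISO-8859-1 -t UTF-8

# The same conversion applied to the feed before PicoLisp reads it:
#   wget -q -O - http://feeds2.feedburner.com/codinghorror \
#     | iconv -f ISO-8859-1 -t UTF-8
```

The printf line emits the raw Latin-1 byte and iconv turns it into the
two-byte UTF-8 sequence, so the whole feed can be recoded the same way.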
> Any ideas?
Another idea would be to use the 'client' function to download the
feed. However, you will still need to do the encoding conversion I
described above.