I was too quick in my earlier post: I still got minor encoding problems,
problems I never had when using curl. So I did some more research.
Provided curl is installed for PHP, the below works just like the curl
command but also handles redirects:
$ch = curl_init($_SERVER['argv'][1]);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_MAXREDIRS, 1);
echo curl_exec($ch);
It seems PHP's curl has some more power than the "normal" curl, as far
as I know anyway. Any example of using the normal curl to handle
redirects would be welcome, so I don't have to go via PHP.
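(For reference, the standalone curl binary does have redirect handling built in; a minimal sketch, assuming the feedburner URL from the quoted thread below:)

```shell
# Plain curl can follow redirects too: -L (--location) enables following,
# --max-redirs caps the number of hops, -s silences the progress meter.
curl -s -L --max-redirs 1 http://feeds.feedburner.com/codinghorror
```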
On Sat, Apr 4, 2009 at 6:13 PM, Henrik Sarvell <hsarv...@gmail.com> wrote:
> Apparently the change of everyone's URL is due to Google making changes; I
> think it's related to AdSense in feeds or something.
> Anyway, the following works at the moment:
> In getfeed.php:
> #!/usr/bin/php -q
> echo utf8_encode(file_get_contents($_SERVER['argv'][1]));
> In PicoLisp:
> (in (list "php" "projects/rss-reader/getfeed.php" (; Feed xmlUrl)) (till))
> Thanks for the UTF-8 hint. Strange that that worked earlier with curl;
> curl must've done some automatic conversion, that's the only
> explanation I can think of.
> On Sat, Apr 4, 2009 at 3:33 PM, Tomas Hlavaty <kviet...@seznam.cz> wrote:
>> Hi Henrik,
>>> So I've just had a setback for my feedreader, it seems I can't fetch
>>> feeds from feedburner with curl anymore, trying
>>> curl http://feeds.feedburner.com/codinghorror
>>> results in nothing, while for instance the following works:
>>> curl http://penguinpetes.com/b2evo/xmlsrv/rss2.php?blog=1
>> the feed has been moved:
>> to...@bela:~$ curl -i http://feeds.feedburner.com/codinghorror
>> HTTP/1.0 302 Moved Temporarily
>> Date: Sat, 04 Apr 2009 12:55:25 GMT
>> Server: Apache
>> X-FB-Host: app33
>> Location: http://feeds2.feedburner.com/codinghorror
>> Content-Length: 0
>> P3P: CP="ALL DSP COR NID CUR OUR NOR"
>> Keep-Alive: timeout=30, max=100
>> Connection: keep-alive
>> Content-Type: text/plain; charset=UTF-8
>> so you can try the new url: http://feeds2.feedburner.com/codinghorror
>>> However doing wget -O - http://feeds.feedburner.com/codinghorror is
>>> not a problem.
>> wget seems to be following the redirect.
>>> The above wget options output to stdout instead of writing to a
>>> file. My basic problem here is that I don't know enough about basic
>>> Linux functions; I'm trying the following with no success:
>>> (pipe (out (list 'wget '-O '-
>>> 'http://feeds.feedburner.com/codinghorror)) (till))
>> I tried:
>> : (in (list "wget" "-q" "-O" "-" "http://feeds.feedburner.com/codinghorror") (till))
>> Bad UTF-8
>> The trouble here is that the feed is not UTF-8 but ISO-8859-1:
>> <?xml version="1.0" encoding="iso-8859-1"?>
>> Not sure how Alex handles things like this in a standard picolisp way,
>> but I guess you'll need to convert the latin1 output to UTF-8 externally
>> (use sh -c "wget ... | lat1-to-utf8") before reading it into picolisp.
>> lat1-to-utf8 would be an external program to do this (it actually
>> might be in the picolisp distribution already, I saw some tools
>> looking like that).
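(As a sketch of that external conversion step, the common iconv utility can stand in for a dedicated lat1-to-utf8 tool:)

```shell
# Re-encode ISO-8859-1 (Latin-1) bytes on stdin as UTF-8 on stdout.
# printf '\351' emits the single Latin-1 byte for e-acute as a small demo.
printf '\351' | iconv -f ISO-8859-1 -t UTF-8
```

The same filter could be spliced into the pipeline, e.g. sh -c "wget -q -O - URL | iconv -f ISO-8859-1 -t UTF-8", before picolisp reads the stream.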
>>> Any ideas?
>> Another idea would be to use the 'client' function to download the
>> feed. However, you will still need to do the encoding conversion I
>> mentioned above.
>> UNSUBSCRIBE: mailto:picol...@software-lab.de?subject=unsubscribe