Hi everyone,

I'm totally mystified by this one:

I have a shell script that fetches a couple of RSS feeds periodically. It worked fine. I now want to add up to three retries if a fetch fails. The logic of the script is now good, but when I test it by unplugging the network cable to confirm that it retries, I can see that fetch always succeeds in downloading, even with the cable unplugged!
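
Here is roughly what the retry part looks like; the variable names and the wait time below are placeholders, not the exact ones from my script (the real ones are in French), but the structure is the same:

    # Placeholder names: URL and OUTFILE come from variables per feed.
    URL="http://www.cyberpresse.ca/rss/225.xml"
    OUTFILE="actualites.xml"

    i=0
    while [ $i -lt 3 ]; do
        # stop retrying as soon as fetch exits successfully
        if fetch -o "$OUTFILE" "$URL"; then
            break
        fi
        echo "Let's wait...$OUTFILE"
        echo $i
        sleep 30
        i=$((i + 1))
    done
    echo "Let's continue..."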

Here is the output received by email from cron:

fetch: http://www.cyberpresse.ca/rss/225.xml: No address record
Let's wait...actualites.xml
0
fetch: http://www.cyberpresse.ca/rss/225.xml: No address record
Let's wait...actualites.xml
1
fetch: http://www.cyberpresse.ca/rss/225.xml: No address record
Let's wait...actualites.xml
2
Let's continue...
/usr/local/www/canadien.xml                             10 kB  479 kBps
/usr/local/www/insolite.xml                           9329  B  448 kBps
...

But the cable is still unplugged! Every feed comes from the same source (cyberpresse.ca).

At the beginning, my script used ftp instead of fetch, but I got the same result. I'm using a simple 'fetch -o actualites.xml http://www.cyberpresse.ca/rss/225.xml' command that takes the output file and the source URL from variables. I could paste the code, but part of it is in French, so I'll just start with this.
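
In other words, each call looks roughly like this (the variable names here are made up for illustration):

    # Hypothetical names; the real script sets these per feed.
    SOURCE="http://www.cyberpresse.ca/rss/225.xml"
    FICHIER="actualites.xml"
    fetch -o "$FICHIER" "$SOURCE"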

I'm wondering if there is any sort of caching, or some check that skips the download when the output file already exists? I know it's silly, but I'm totally confused...

Thanks for your help.

Martin