From: "Chris Santerre" <[EMAIL PROTECTED]>
> >-----Original Message-----
> >From: Don Newcomer [mailto:[EMAIL PROTECTED]
> >
> >Aside from hoping that I can soon get an updated copy of all the files
> >that are out there, I wanted to mention a possible problem. From an
> >earlier e-mail today it seems that the one-download-per-24-hours
> >restriction is going to stay in place. I'm sure that a lot of sites have
> >set up daily cron jobs as I have to get these files reliably. However,
> >since the new RDJ has randomizing code in place, it's possible that even
> >though my cron job always starts at 3:00 AM, on day one I could be
> >delayed by 3000 seconds and start at 3:50, and on day two be delayed 120
> >seconds and start at 3:02. Since this randomization by the script forces
> >me to load again within 24 hours, I'd be denied.
> >
> >I understand the need to limit downloads, but you may need to reconsider
> >the randomization thing. I've gone for almost a week now without being
> >able to use RDJ to get the new files, after having a cron job that
> >worked flawlessly for months. Sorry if I appear impatient but, well,
> >I am! Thanks.
> >
>
> 3 per day. Why you would want to update a file 3 times a day is beyond me.
Chris, he downloads at a randomized time between 3:00 and 4:00 every day,
once a day. The 24-hour limit trips him up; a 22-hour limit would be safe
for him. It shouldn't add much load to your system to allow downloads
after 22 hours rather than 24, should it?
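A minimal sketch of the arithmetic (the 3:50 and 3:02 times are the ones
from Don's example above; the dates are placeholders):

```python
from datetime import datetime, timedelta

# Day one: cron fires at 3:00 AM but RDJ's randomized delay (3000 s)
# pushes the download to 3:50. Day two the delay is only 120 s,
# so the download runs at 3:02.
day_one = datetime(2004, 1, 1, 3, 50)
day_two = datetime(2004, 1, 2, 3, 2)

elapsed = day_two - day_one  # 23 hours 12 minutes between downloads

print(elapsed >= timedelta(hours=24))  # False -> denied under a 24h rule
print(elapsed >= timedelta(hours=22))  # True  -> allowed under a 22h rule
```

With a one-hour randomization window, consecutive downloads can never be
more than an hour closer together than the cron interval, so a 23-hour
limit would already be enough; 22 hours just leaves some slack.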
I'm also thinking it would be best to wget a directory listing with
timestamps and then wget only the files that show as new in that listing,
rather than executing many individual wgets for headers or ftp wgets of
only new files. One of you Perl wizards could probably hack that up in
almost no time at all.
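The "compare the listing, fetch only what's new" step might look something
like this (a sketch, not RDJ's actual code; the listing is assumed to have
already been parsed into (filename, remote mtime) pairs, and the names are
hypothetical):

```python
import os

def files_to_fetch(listing, local_dir):
    """Given (name, remote_mtime) pairs parsed from a remote directory
    listing, return the names we should download: files we don't have
    locally, or whose remote timestamp is newer than our local copy."""
    wanted = []
    for name, remote_mtime in listing:
        path = os.path.join(local_dir, name)
        if not os.path.exists(path) or os.path.getmtime(path) < remote_mtime:
            wanted.append(name)
    return wanted

# Example: remote listing says old.cf was updated; fresh.cf is unchanged.
# listing = [("old.cf", 1090000000.0), ("fresh.cf", 1080000000.0)]
# files_to_fetch(listing, "/etc/mail/spamassassin")
```

One directory listing per run replaces a HEAD request per file, which is
exactly the load reduction suggested above; wget's own `-N` timestamping
option does a similar local-vs-remote comparison for HTTP and FTP.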
{^_^} Joanne, who'd develop a hernia by trying to do it in C++. {^_-}