Every two hours, I run a script from cron that does a newsx slurp from three
different locations (in serial, not in parallel; roughly the setup sketched
below the symptom list).  Usually it takes 15 or 20 minutes to complete.
But every now and then, something gets buggered up.  I've never caught it at
the moment it first goes wrong, so I don't know how it starts, but the
symptoms are:
- the script takes a lot longer to run, more than 2 hours.
- "newsq -c" shows two simultaneous connections to one of my "feed" sites.
- the in.hosts file for at least one of the feed sites is corrupted, so the
  feed for that site takes an incredibly long time.
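
For what it's worth, the setup is roughly the following sketch (not the real
script; the site names and paths are placeholders, and the exact newsx
arguments depend on the local configuration):

    #!/bin/sh
    # fetch-feeds.sh: run from cron every two hours, e.g. with a
    # crontab line like:  0 */2 * * *  /usr/local/bin/fetch-feeds.sh
    # Slurp each feed site in turn, in serial, not in parallel.
    for site in feedsite1 feedsite2 feedsite3
    do
        newsx "$site" || echo "newsx failed for $site" >&2
    done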

My guess from these symptoms is that one of the feeds runs longer than usual,
past the point where cron fires off the next run, and the locking that
normally prevents two simultaneous connections to the same site fails for
some reason.  The two simultaneous connections munge the in.hosts file, and
everything goes downhill from there.
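
To be clear about what I mean by locking: I'd expect something equivalent to
the following per-site lock (a sketch only; the lock path is made up, and
newsx presumably does its own version of this internally).  mkdir is atomic,
so a second run for the same site should just give up instead of opening a
second connection:

    #!/bin/sh
    # Wrap a single newsx run in an external per-site lock (sketch).
    site=$1
    lockdir=/var/lock/newsx-$site          # placeholder path
    if mkdir "$lockdir" 2>/dev/null
    then
        trap 'rmdir "$lockdir"' 0 1 2 15   # drop the lock on exit
        newsx "$site"                      # arguments depend on your setup
    else
        echo "newsx already running for $site, not starting another" >&2
    fi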

Usually the only solution is either to kill the multiple newsx processes,
delete the in.hosts file and do a "catchup" run ("newsx -c"), or to shut off
the cron job, delete the file, do a regular feed, and restart the cron job
when it's finished.
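
In case it helps anyone else hitting this, the cleanup I end up doing is
roughly the following (a sketch; the spool path is a placeholder, since the
in.hosts file lives wherever newsx keeps its per-site state on your system):

    #!/bin/sh
    # Manual recovery for one wedged feed site.
    site=$1
    spool=/var/spool/newsx                 # placeholder path
    # kill the stuck newsx processes for this site
    kill `ps ax | grep newsx | grep "$site" | grep -v grep | awk '{print $1}'`
    # throw away the corrupted in.hosts and do a "catchup" run
    rm -f "$spool/$site/in.hosts"
    newsx -c "$site"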

Anybody have any ideas why the locking would fail?  I'm running Linux 2.0.30
with C News CR.G and newsx 0.9.

-- 
Paul Tomblin ([EMAIL PROTECTED])  I don't buy from spammers.
"A little rudeness and disrespect can elevate a meaningless interaction into a
battle of wills and add drama to an otherwise dull day."
     - Calvin discovers Usenet
