I know we're well into this thread now, but...

Most of my goofing around with web page grabs and the like was done
before the CallHTTP stuff was out, or when only the really early, buggy
versions were available.  So back then we just used "wget" for grabbing
web pages on our Unix-based UV servers.  It works like a charm, requires
very little setup, and needs no programming to test and view the
results.

An added advantage is that it runs from the Unix command line, so we use it
for a pantload of other things that don't touch UV at all.  For
instance, we have a web-enabled temperature sensor in our computer room.
A cron job runs every 15 minutes on one of our Unix servers, uses wget
to grab the status page from the sensor, and then runs the results
through some "cut" and "grep" commands to whittle them down to a single
number.  If that number crosses a threshold, it starts sending text
messages and emails to people.
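For anyone who wants to try the same trick, here is a rough sketch of what such a cron job could look like.  The sensor URL, the status-page layout, the threshold value, and the alert address are all placeholders I made up, not the actual setup described above:

```shell
#!/bin/sh
# Hypothetical sketch of a temperature-check cron job.
# URL, page format, threshold, and recipient are assumptions.
URL="http://sensor.example.com/status.html"
THRESHOLD=85

# Grab the status page quietly to stdout, then whittle it down
# to a single number with grep, cut, and tr.
TEMP=$(wget -q -O - "$URL" | grep 'Temperature' | cut -d: -f2 | tr -dc '0-9')

# Alert if the reading is present and over the threshold.
if [ -n "$TEMP" ] && [ "$TEMP" -gt "$THRESHOLD" ]; then
    echo "Computer room temperature is ${TEMP}" \
        | mail -s "Temperature alert" [email protected]
fi
```

A crontab entry along the lines of `*/15 * * * * /usr/local/bin/checktemp.sh` would run it every 15 minutes; the exact grep pattern and cut fields depend on what the sensor's status page actually looks like.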


Dave Barrett,
Lawyers' Professional Indemnity Company
_______________________________________________
U2-Users mailing list
[email protected]
http://listserver.u2ug.org/mailman/listinfo/u2-users