I was going to suggest the same as Nick, but this post put paid to that.

However, starting it up using either an init.d script *or* a crontab entry is
good. I know it sounds a bit strange, but if you put it in cron, then cron
can act as a means of automagically restarting the process if it fails. I
wish I could take credit for that one, but I stole it from the SETI project
(:
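For anyone curious, the trick is just an entry along these lines in the
crontab (the script name and path here are made up; substitute your own
collector):

```shell
# Hypothetical crontab entry: every 5 minutes, check whether the logger
# is still running, and start it again if it is not.
# "wind_logger.pl" and its path are placeholders for the real script.
*/5 * * * * pgrep -f wind_logger.pl > /dev/null || /usr/local/bin/wind_logger.pl
```

If the process is alive, pgrep succeeds and the right-hand side never runs;
if it has died, cron quietly relaunches it within five minutes.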

But, you're missing something in all of this... where's the database to
store it all in? That way, you can generate up-to-date graphs on the fly
when requested!
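Something small like SQLite would do the job; here is a rough sketch in
Python (the table and column names are just my guesses at what you would
want, not anything from your setup):

```python
import sqlite3

# Sketch only: "weather.db" and the schema below are invented for illustration.
conn = sqlite3.connect("weather.db")
conn.execute("""CREATE TABLE IF NOT EXISTS readings (
                    ts     TEXT,   -- timestamp of the reading
                    sensor TEXT,   -- e.g. 'temp' or 'wind'
                    value  REAL)""")

# Each poll (cron-driven or from the long-running script) appends one row:
conn.execute("INSERT INTO readings VALUES (datetime('now'), 'temp', 21.5)")
conn.commit()

# Generating an up-to-date graph is then just a query over whatever
# window you want, fed to your plotting tool of choice:
rows = conn.execute(
    "SELECT ts, value FROM readings WHERE sensor = 'temp' ORDER BY ts"
).fetchall()
```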

I do something similar with the METAR weather data that I get from NOAA
(I think!). I've just got to write the data display part (^:

Cheers,

Steve

On Fri, January 21, 2005 11:56 am, Andrew Errington said:
>> I am thinking that these programs probably do not need to run
>> continuously anyway.
>>
>> i assume they are doing something like:
>>
>> begin
>> poll sensor
>> write data
>> sleep x minutes
>> again
>>
>> would it be better to rewrite the program to just do:
>>
>> begin
>> poll sensor
>> write data
>> end
>>
>> then let cron do the scheduling every x minutes. that way as long as
>> cron is running, your data collection program should run. if the data
>> collection program outputs a value indicating whether or not it has
>> finished successfully, you can get cron to send you an email on failure.
>>
>> just another option...
>
> Nice idea.  The temperature sensor could work like that, but it has its
> own
> scheduling built-in (just provide the interval at the command line), and I
> have set it up for every 5 minutes.  The wind sensor is outputting data
> every three seconds.  My Perl script accumulates this data over 10 minutes
> and outputs the average and max values, so it really needs to be running
> all the time.
>
> Thanks,
>
> Andy
>
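P.S. For the archives, the run-once version of the loop quoted above might
look something like this in Python (the sensor function, log file name, and
reading value are all invented; the real script would talk to the actual
hardware):

```python
import sys
from datetime import datetime

def poll_sensor():
    """Placeholder for the real hardware read; returns a reading or raises."""
    return 21.5  # made-up value standing in for the sensor driver

def main():
    try:
        value = poll_sensor()
    except Exception as exc:
        # A non-zero exit status is what lets cron mail you on failure.
        sys.stderr.write("poll failed: %s\n" % exc)
        return 1
    with open("temperature.log", "a") as log:
        log.write("%s %s\n" % (datetime.now().isoformat(), value))
    return 0

# A real script would end with sys.exit(main()); cron then sees the status.
exit_code = main()
```

Cron runs it every x minutes, and the exit status does the error reporting
for free.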


-- 
Artificial Intelligence is no match for natural stupidity.
