>I think that he was only looking for a sample every couple of seconds, and
>while I don't know how much data he was expecting, if it is a small number
>of bytes per sample, then there is lots of time to consume the data, and
>interact with the user.

Stuart gave me some additional info earlier this week that suggested his
process is similar to one I've already implemented on the Palm.
>
>I am with Jim on this one. For my application, checking the time every time
>through the loop and asking for an event when I need it makes sense to me.
>It runs just fine for me.

In my case, there are few extra CPU cycles to spare for using every message
as a potential timing trigger.  I still believe that with Jim's model you put
yourself at risk of skewing the timestamp of your readings.  I've tried it
that way and it didn't work for me.
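
For what it's worth, the nilEvent pattern I keep referring to looks roughly
like this.  This is only a sketch of the idea, not my actual code:
SAMPLE_TICKS and SampleSerialPort() are made-up names, and the usual
menu/form dispatching is trimmed down to the bare minimum.

    /* Sample every 2 seconds; the EvtGetEvent timeout is set to twice
       the required frequency (half the period), so a nilEvent shows up
       at least twice per sample interval even with no user input. */
    #define SAMPLE_TICKS  (SysTicksPerSecond() * 2)

    static void EventLoop(void)
    {
        EventType event;
        UInt32 nextSample = TimGetTicks() + SAMPLE_TICKS;

        do {
            EvtGetEvent(&event, SAMPLE_TICKS / 2);

            /* Check the clock only on nilEvents; no per-message work. */
            if (event.eType == nilEvent && TimGetTicks() >= nextSample) {
                SampleSerialPort();              /* placeholder */
                nextSample += SAMPLE_TICKS;      /* hold the schedule */
            }

            if (!SysHandleEvent(&event))
                FrmDispatchEvent(&event);
        } while (event.eType != appStopEvent);
    }

Halving the period costs a couple of extra nilEvents per sample, but it
bounds the worst-case skew to half a sample interval instead of a full one.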

-----Original Message-----
From: Ed Deinstadt <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED] <[EMAIL PROTECTED]>
Date: Thursday, July 15, 1999 1:07 PM
Subject: Re: Automated serial output


>
>-----Original Message-----
>From: Dave Lippincott <[EMAIL PROTECTED]>
>To: [EMAIL PROTECTED] <[EMAIL PROTECTED]>
>Cc: [EMAIL PROTECTED] <[EMAIL PROTECTED]>
>Date: Thursday, July 15, 1999 9:33 AM
>Subject: Re: Automated serial output
>
>
>>If you watch for the time to execute your routine by triggering on every
>>message, you are going to slow your app down to a crawl.
>I don't think that the overhead is nearly as bad as you think. I took the
>basis for my app from the GPS example in the Palm Pilot book. It
>pre-calculates the time and checks it every time it is invoked. All it has
>to do is fetch the time in ticks and compare.
>>
>>>A side effect of checking only during nil events is that the user may be
>>>entering data or tapping the screen quickly, enqueueing each event within
>>>the timeout frequency, thus starving your event loop of nil events.
>>The fact is, your app loses control during user input anyway.  Just hold
>>your stylus on the screen in a single position and watch nothing get
>>updated!
>>
>Sure. But you will also not see the event that the system thinks you are
>getting if you don't check for non-nil events.
>>>You'll burn more cycles halving the timeout frequency than you will
>>>performing a simple "is it time now?" check on every received event.
>>How? Your suggestion has the app doing the same thing, only more
>>frequently.
>>What will happen to your timing if you require an event every second, look
>>for the time to execute on every message and have no user input for several
>>seconds?  If you rely on your method and set EvtGetEvent to time out only
>>every second, you risk missing your sampling window by up to a second!
>>
>But you don't do it "every second". When you get an event that you decide to
>use, you recalculate when you should be invoked next time, one cycle at a
>time. This gives lots of room to adjust.
>>There are risks to any method that is not timer/interrupt driven.  Using the
>>method you suggest will skew your sampling rate.  Increasing the nilEvent
>>frequency may burn the CPU for a few extra cycles, but it won't slow your
>>app (by always checking for the correct execution time on every event) or
>>cause you to skew your sampling rate by up to one sampling interval.
>>
>>
>That is certainly true. But the parameters that you have to work with are
>that this is NOT a real-time system. The kernel is a real-time multi-tasking
>system, however that is not exposed to user-level apps. If you need really
>accurate timing in this system, the only option I can see is to make it
>interrupt driven. Having said that... there is a lot that you can do without
>having to go that far. There are many examples of apps which manage to talk
>to the serial port just fine and meet the timing requirements of their
>protocol.
>
>>> enqueueing each event within the timeout frequency, thus starving your
>>>event loop of nil events.
>>You forget that the user will not typically be entering data during the
>>sampling operation (not on a 16 MHz computer).  If the user is interacting
>>with the Palm during this time, your app will not be executing the local
>>event loops much anyway.  Plus, UI elements generate nilEvents.  Plus Plus,
>>the OS generates nilEvents.  Plus Plus Plus, how many users enter data or
>>tap the screen several times a second (that are not playing some sort of
>>action game)?
>
>I think that he was only looking for a sample every couple of seconds, and
>while I don't know how much data he was expecting, if it is a small number
>of bytes per sample, then there is lots of time to consume the data, and
>interact with the user.
>
>I am with Jim on this one. For my application, checking the time every time
>through the loop and asking for an event when I need it makes sense to me.
>It runs just fine for me.
>
>Ed Deinstadt
>
>>
>>-----Original Message-----
>>From: Jim Schram <[EMAIL PROTECTED]>
>>To: [EMAIL PROTECTED] <[EMAIL PROTECTED]>
>>Date: Wednesday, July 14, 1999 7:14 PM
>>Subject: Re: Automated serial output
>>
>>
>>>At 3:33 PM -0400 1999/07/14, Dave Lippincott wrote:
>>>>If you do it this way, you will have to intercept *every* message generated
>>>>by the OS.  That will suck up a lot of CPU time.  Just look for the nilEvent
>>>>message.  When it is received, check the tick count or clock.  If you set
>>>>the EvtGetEvent timeout to some rate that is at least twice the required
>>>>frequency, you will almost certainly get the desired timing period.
>>>
>>>No, check the tick count at every event if you need to guarantee a specific
>>>frequency. You'll burn more cycles halving the timeout frequency than you
>>>will performing a simple "is it time now?" check on every received event.
>>>
>>>A side effect of checking only during nil events is that the user may be
>>>entering data or tapping the screen quickly, enqueueing each event within
>>>the timeout frequency, thus starving your event loop of nil events.
>>>
>>>Regards,
>>>
>>>Jim Schram
>>>3Com/Palm Computing
>>>Partner Engineering

