Carl,

Right, makes sense. I really do use delays sparingly and go for millis(), but 
this was a copy-paste task, and it was only 25 ms of delay, so I figured I 
could let it slide, especially since I am probably only using about 10% of the 
computing power. And I could have, if it wasn't for you meddling programmers. 
`,~) 

Normally I only use a short delay at the end of the loop if the loop is running 
too fast, or for comm purposes, so this was definitely out of the norm for me. 
I still like the original workaround, though; I would use it again in a pinch.
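
A minimal sketch of that millis() pattern, for anyone following along (the 
25 ms interval just echoes the delay from this thread, and the LED toggle is a 
made-up placeholder for the real work):

    const unsigned long INTERVAL_MS = 25;  // illustrative interval
    unsigned long lastRun = 0;

    void setup() {
      pinMode(LED_BUILTIN, OUTPUT);
    }

    void loop() {
      unsigned long now = millis();
      if (now - lastRun >= INTERVAL_MS) {  // unsigned math handles rollover
        lastRun = now;
        digitalWrite(LED_BUILTIN, !digitalRead(LED_BUILTIN));  // placeholder work
      }
      // nothing blocks here, so loop() stays responsive
    }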

Good discussion in any case, and now I have a better grasp on Ticker. 
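
For reference, a minimal Ticker sketch might look like this (this assumes the 
ESP8266/ESP32 Arduino Ticker library; the 25 ms interval and the flag-in-loop 
pattern are just illustrative):

    #include <Ticker.h>

    Ticker ticker;
    volatile bool fired = false;

    void onTick() {
      fired = true;  // keep the callback short; defer real work to loop()
    }

    void setup() {
      ticker.attach_ms(25, onTick);  // run onTick every 25 ms
    }

    void loop() {
      if (fired) {
        fired = false;
        // periodic work goes here, outside the timer callback
      }
    }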

It makes one of my Dad's best pieces of advice ring true:

When all else fails, read the instructions ~ Dad

Good Stuff,

John Vaughters

On Tuesday, January 26, 2021, 12:51:13 PM EST, Carl Nobile <[email protected]> wrote:

Yeah, delays can mess with interrupts even if the delay is not in the interrupt 
itself. Actually, what happens is the interrupt messes with the delay: if an 
interrupt fires in the middle of the delay, the delay will run longer than what 
you set it to.
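
One way to see that effect is to time delay() with micros() while a 
deliberately slow interrupt handler fires (a rough sketch; the pin, the 
trigger, and the busy-wait ISR are all made up for illustration):

    volatile unsigned long isrHits = 0;

    void slowIsr() {
      for (volatile unsigned int i = 0; i < 2000; i++) { }  // artificially slow
      isrHits++;
    }

    void setup() {
      Serial.begin(115200);
      pinMode(2, INPUT_PULLUP);
      attachInterrupt(digitalPinToInterrupt(2), slowIsr, FALLING);
    }

    void loop() {
      unsigned long start = micros();
      delay(25);
      Serial.print("delay(25) actually took ");
      Serial.print(micros() - start);  // can exceed 25000 us when slowIsr fires
      Serial.println(" us");
    }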
Almost all processors will be running interrupts even if you're not using any 
yourself.
There are other people in the group who have more experience with this than I 
do, so this is just my two cents.

~Carl
