On Fri, April 14, 2006 3:18 pm, Peter E. Barck wrote:
>> I wonder why you all divide it by 1000000 (10^6) while your loop
>> only goes to 100000 (10^5)...
>
> Because 1 Microsecond = 1/1000000 of a second. Has nothing to do with
> the loop count.
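The point above is just unit conversion: the 1000000 divisor comes from the definition of a microsecond, not from the loop bound. A rough sketch in Python (the `microseconds` helper is a hypothetical stand-in for REALbasic's Microseconds function, not its real API):

```python
import time

def microseconds() -> float:
    """Hypothetical stand-in for REALbasic's Microseconds:
    elapsed time expressed in microseconds."""
    return time.perf_counter() * 1_000_000

start = microseconds()
total = 0
for i in range(100_000):  # the loop only runs 10^5 times...
    total += i
elapsed_us = microseconds() - start

# ...but we still divide by 10^6, because the timer's unit is
# microseconds and we want seconds. The divisor is fixed by the
# unit, independent of how many iterations the loop performs.
elapsed_s = elapsed_us / 1_000_000
print(f"{elapsed_s:.6f} seconds")
```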
Yeah, but isn't Microseconds only accurate to the millisecond (1/1000 second) under Windows? At least that's what the LR says. If so, and since this thread originally started because Windows was "faster," wouldn't that explain the dramatic timing difference?

-- 
Marc Zeedar
Publisher, REALbasic Developer Magazine
http://www.rbdeveloper.com

_______________________________________________
Unsubscribe or switch delivery mode:
<http://www.realsoftware.com/support/listmanager/>
Search the archives of this list here:
<http://support.realsoftware.com/listarchives/lists.html>
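To illustrate the granularity argument above: if the clock behind a microsecond-valued timer only ticks once per millisecond, every reading is quantized to a 1000-microsecond grid, and sub-millisecond differences between runs vanish or round to a full millisecond. A small Python sketch (the 1 ms resolution figure is the assumption under discussion, taken from the LR claim, not a measured value):

```python
def quantize_us(t_us: float, resolution_us: float = 1000.0) -> float:
    """Round a microsecond reading down to the timer's actual tick
    size (assumed here to be 1000 us = 1 ms, per the LR claim for
    Windows). This models what a coarse clock reports."""
    return (t_us // resolution_us) * resolution_us

# A stretch of code that truly takes 400 us reads as 0 on a 1 ms
# clock, while 1600 us reads as 1000 us. Apparent platform "speed"
# differences of this magnitude can be pure timer granularity.
print(quantize_us(400.0))   # 0.0
print(quantize_us(1600.0))  # 1000.0
```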
