Hello,

I'm not sure whether I should ask this here.

How do I get time measurements in milliseconds? What is the accuracy of the Mac's C library implementation?

I am using the clock() function from time.h and measuring the difference, in seconds, between firings of an NSTimer.

The NSTimer fires every 0.1 s, and in its callback I take the difference between the current clock() call and the previous one, then divide it by the CLOCKS_PER_SEC constant. The result is two orders of magnitude smaller than the timer's interval: I get 0.001 s instead of the NSTimer's 0.1 s.

Any ideas?

_______________________________________________

Cocoa-dev mailing list ([email protected])

Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com

Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/cocoa-dev/archive%40mail-archive.com

This email sent to [EMAIL PROTECTED]
