Dear All,

I was experimenting with timing and timing accuracy on the BBB, and I came 
across a very *interesting* yet strange thing!

Just running clock_gettime() twice and measuring the time difference gives 
you horrendous timing. Here is my simple C code:

#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <time.h>


int main(void)
{
    /* clock_gettime() fills a struct timespec, not a struct timeval */
    struct timespec sT, eT;

    clock_gettime(CLOCK_REALTIME, &sT);
    printf("\n%ld.%09ld\n", sT.tv_sec, sT.tv_nsec);

    clock_gettime(CLOCK_REALTIME, &eT);
    printf("\n%ld.%09ld %ld uSec\n", eT.tv_sec, eT.tv_nsec,
           (eT.tv_sec - sT.tv_sec) * 1000000 + (eT.tv_nsec - sT.tv_nsec) / 1000);


    return 0;
}

Just two gettime calls are causing a *2894 to 3200 micro sec* (uSec) time 
difference. How is such a huge amount possible!??

Can someone throw some light on this? And can it be improved or fixed?

I have tried the following to investigate:

1) Running the same code on my i7:
    it gives an 80 uS to ~300 uS delay.

2) Trying struct timeval and gettimeofday() -- gives worse results!

3) Running NTP:
   no impact.

Thank you, and hope you find it interesting as well :)

-- 
For more options, visit http://beagleboard.org/discuss
--- 
You received this message because you are subscribed to the Google Groups 
"BeagleBoard" group.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/beagleboard/7d0c12ca-1e1a-4754-9b30-8f1a75c1dc6c%40googlegroups.com.