[PHP] Not important - Simple question about microsec()
Hi, I was wondering what the result from microtime() means. I measured the time at the beginning and at the end of my code and got the following result:

Begin: 0.70278800 986354975
End:   0.08970900 986354975

I think the latter part of the value is Unix time in seconds (right?), but what does the former part, like 0.25576700, mean? I read in the manual that it should be 1/100 of a second, but it does not compute in my brain: how come the End time is lower than the Begin time? Probably I'm thinking of it the wrong way, but I hope you can enlighten me :)

Regards,
SED

-- PHP General Mailing List (http://www.php.net/) To unsubscribe, e-mail: [EMAIL PROTECTED] For additional commands, e-mail: [EMAIL PROTECTED] To contact the list administrators, e-mail: [EMAIL PROTECTED]
Re: [PHP] Not important - Simple question about microsec()
On 3 Apr 2001 20:31:28 -0700, SED [EMAIL PROTECTED] wrote:

> I think the latter part of the value is Unix time in seconds (right?),
> but what does the former part, like 0.25576700, mean?

For some reason, microtime() returns its result backwards: the first part is the fractional portion of the second part, so the decimal seconds come first and the whole Unix seconds second. As for the time not being returned properly, are you by any chance running this on Windows? I seem to recall reading something on bugs.php.net about that when I ran into benchmark problems with code which worked properly on Unix but returned results like yours on Windows ("Wow! This sure is a fast computer - it finished before it started!").
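To actually benchmark with microtime(), the two parts of its "msec sec" string have to be added together into one float before subtracting. A minimal sketch (the helper name getmicrotime() is my own, not from the thread):

```php
<?php
// microtime() returns a string of the form "msec sec",
// e.g. "0.70278800 986354975": fractional seconds first,
// then whole Unix seconds. Adding the two parts yields a
// single float timestamp suitable for subtraction.
function getmicrotime() {
    list($usec, $sec) = explode(" ", microtime());
    return ((float) $usec + (float) $sec);
}

$start = getmicrotime();
for ($i = 0; $i < 100000; $i++) {
    // code being benchmarked goes here
}
$end = getmicrotime();
printf("Elapsed: %.6f seconds\n", $end - $start);
?>
```

Comparing the combined floats avoids the "finished before it started" confusion that comes from looking at the fractional parts alone.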