Norman Vine wrote:

> Erik Hofman writes:
> 
>>I noticed that in SimGear/timing/timestamp.cxx for Windows the seconds
>>are initialized to zero, while under Unix they are initialized properly:
>>
> 
> You expect Windows and Unix to do things similarly ?


I expected FlightGear/SimGear to use the same variables in the same way.



> 
>>Isn't this a problem?
>>
> 
> Only after running for about 50 days :-)
> ie.
> secs_min*mins_hour*hours_day*1000*49.7 =~ 2^32, a
> 32 bit unsigned integer overflow


> I'll let others comment on the probability of Win32 staying up that long
> :-)


No comment.


> note that when
>    #ifdef WIN32
> the seconds variable is never used.  I guess this could be better
> documented; however, I haven't heard of it causing any problems,
> and it has been this way for a while!


Yep, I only noticed because I was changing the code to use the SDL
library (only for internal use for now) and then I discovered this
difference.


> 
> In fact, it has been that way since October 1998, when I changed it to use
> the Win32 API timeGetTime() instead of the <Unix'y> ftime(),
> which, not so surprisingly, would only allow a max of 18.3 fps!
> 
> FYI, that's also when I got my first 3D-accelerated GFX card,
> and I was %$#%# that I could still only get 18.3 fps max, and
> then I discovered the ftime() <-> BIOS timer connection in Windows :-)
> 
> 
> Since then, thanks to FGFS, I have learned a lot about WIN32
> (I was primarily an embedded-systems programmer),
> and if I were to do this again I think I would just add
 

<skipped a lot of rambling>


> Cheers
> 
> Norman

Thanks. I'll need it now ;-)

Erik



_______________________________________________
Flightgear-devel mailing list
[EMAIL PROTECTED]
http://mail.flightgear.org/mailman/listinfo/flightgear-devel
