@Arrrrrrrrr

Wow, there are a lot of assumptions here...

> You have to define an interface between client and server anyway, that's the 
> only place you should care.

Beyond the simple transfer of data, I also want all the computations to give the same result (for reasons I explained earlier). I can totally get an integer overflow multiplying two quantities in 32-bit which I won't get in 64-bit. If I don't check against the overflow (which I probably should), I will get different results. And if I do check for the overflow, the client code will fail while the server code will not (and neither will the tests, because I want to run them on a server-class machine, so that they run faster). Both cases will result in a "de-sync" of the client, forcing a disconnect, a complete reset of the client and a re-connect. So, atm, I can't agree with your statement. Possibly, I'm missing something which is obvious to you?
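
To make this concrete, here is a minimal Nim sketch (the values are made up, and I'm assuming the default overflow checks):

```nim
# The same multiplication, compiled for different platforms:
# `int` is 64-bit on the server, 32-bit on a 32-bit client.
let price    = 100_000
let quantity = 50_000
let total    = price * quantity
# 64-bit build: total == 5_000_000_000, all fine.
# 32-bit build: raises an overflow defect (with checks on), or
# silently wraps (with -d:danger) -- either way, client and
# server disagree, and we get exactly the de-sync described above.
echo total
```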

> Are you expecting to have more than a million IDs?

What if I gave every bullet its own ID? In a DOOM-style "death-match" that lasts an hour, that might not be an issue, but what if I want to create a vast, persistent game world, which is designed to spread over thousands of (randomly generated) square kilometers, and run for many years without a reset (which, by the way, is exactly what I plan)? Would you still think I'm not going to reach a million IDs? I won't have a million _active_ IDs _at the same time_, but I will, in total, over time. And it's way simpler to just use uint64 for IDs than to implement some complicated ID-reusing system.
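
With uint64, a dead-simple monotonic allocator (a sketch; `EntityId` and `allocId` are names I just made up) never needs to reuse anything:

```nim
type EntityId = distinct uint64

var nextId: uint64 = 0

proc allocId(): EntityId =
  ## Monotonically increasing, never reused. Even minting a
  ## million IDs per second, uint64 lasts roughly 585,000 years.
  inc nextId
  EntityId(nextId)
```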

> i'm sure you are not going to use every bit of an int.

How can you assume that without knowing precisely what I want to do? If it were a 64-bit int, maybe not, but that's exactly the problem: on a 32-bit client, it won't be 64-bit, but rather 32-bit.

Let's say I have some kind of whole-number currency in my game (credits, gold-pieces, diamonds ...) and I was stupid enough to use an "int" to store it. Once a player on a 32-bit client reached the 2 billion limit, their account would flip into the negative, while on the server, where int can go up to 2^63 - 1, everything would be fine. Have you ever experienced this situation? I have, in two different games so far, which is why I'm aware that int32 is a bad choice for a game currency. I guess you could say that an account balance will be part of the interface, and so its "size" must be defined to something specific.
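
And that is easy enough to fix once you know the pitfall (a sketch, with a made-up `Credits` type):

```nim
# Pin the currency to an explicit 64-bit width, so client and
# server agree no matter what size the platform's `int` is.
type Credits = int64

var balance: Credits = 2_147_483_648  # already beyond int32.high
balance += 1_000
echo balance                          # fine on both 32- and 64-bit
```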

But there are also "transient", computed values, like the "total weight of all equipment" and so on, which are needed for performance, but do not need to be transferred through the interface, as they are derived from other values. If I only check the interface types, I might accidentally use an int for a transient value, since it's not part of the interface.
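
Something like this, where the type never appears in any protocol definition (hypothetical names, just to illustrate):

```nim
type Item = object
  weightGrams: int   # platform-sized `int`: the latent bug

proc totalWeight(items: seq[Item]): int =
  ## Derived on demand and never serialized, so no interface
  ## check catches it. A big enough inventory sums past 2^31,
  ## and the 32-bit client diverges from the 64-bit server.
  for item in items:
    result += item.weightGrams
```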

I might never have 2 billion players, or 2 billion messages, or 2 billion game entities, but I most certainly could have more than 2 billion of some "quantity".

I guess one special case where the size of int wouldn't matter would be when _every single value_ had its own data-type (distinct int32, for example), and all those data-types had hard-coded, programmer-defined limits. If that limit is less than 2^31, then the behavior would be the same everywhere, and if the limit were over 2^31, I would get a compiler error while defining that constant and would be forced to use int64. Coding like that, OTOH, sounds like a PITA. But maybe that is what professional game devs do. As a "corporate Java programmer", I've never looked at a professional "native" game code-base, so I wouldn't know.
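
I imagine it would look roughly like this (a sketch; `StackSize` and its limit are invented):

```nim
type StackSize = distinct int32

# Fits in int32, so 32- and 64-bit builds behave identically.
const MaxStackSize = StackSize(60_000)

# A limit over 2^31 would not even compile:
# const TooBig = StackSize(3_000_000_000)  # error: can't fit into int32

proc `<=`(a, b: StackSize): bool {.borrow.}

assert StackSize(59_999) <= MaxStackSize
```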
