Richard, Whoops... sorry for the previous snafu! This is a good question. What you're observing is an artifact of how arithmetic is implemented on a computer. It's neither natural nor unnatural. Arithmetic is implemented as a group (mod 2^n), where n is the number of bits in the variable. Well, it behaves as one anyway!
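Here's a quick sketch of that mod-2^n behavior in Python; the 2-bit width and the helper name `add_mod` are just for illustration, not anything standard:

```python
def add_mod(a, b, bits=2):
    """Add two integers, keeping only the least significant `bits` bits,
    which is effectively what fixed-width hardware arithmetic does."""
    mask = (1 << bits) - 1      # 0b11 for a 2-bit word
    return (a + b) & mask

# Repeatedly add one in a 2-bit system: 0, 1, 2, 3, then wrap back to 0.
values = []
x = 0
for _ in range(5):
    values.append(x)
    x = add_mod(x, 1)
print(values)  # -> [0, 1, 2, 3, 0]
```

Note that real languages differ in how they surface this: C unsigned types wrap exactly like this, while Python integers are arbitrary-precision, which is why the mask is needed to simulate it.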
It basically boils down to this: at the hardware level, the operation is the same whether the result exceeds the bounds or not. If the result does exceed the bounds, the excess is "discarded". A simple example with a 2-bit number, adding one each time:

00
01
10
11
100

Note that at this point the actual value is 4, but 4 has no representation in a 2-bit system. The least significant 2 bits are 00, which is 0. This is the phenomenon you are describing.

Mark

