No, it wouldn't "work just fine"; `rev` is already an indicator that it doesn't,
but here is maybe a stronger argument: you program with `int`, you get overflow
errors, you switch to `BigInt` in the appropriate places, and things work. You
program with `uint`, you never get an overflow error, only silent wraparound
that is hard to debug, and then you have no idea whether a `BigUint` will save
you, because the wraparound semantics are cast in stone...
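
To make the asymmetry concrete, here is a minimal sketch, assuming Rust with
the num-bigint crate (chosen only because it happens to ship both a `BigInt`
and a `BigUint` type; the language the thread is actually about may differ):

```rust
// Assumes `num-bigint = "0.4"` in Cargo.toml; purely illustrative.
use num_bigint::{BigInt, BigUint};

fn main() {
    // Signed case: i32 overflow is an error you can observe (a debug-build
    // panic in Rust, UB flagged by sanitizers elsewhere). Moving the hot
    // spot to BigInt simply yields the true mathematical value.
    let sum = BigInt::from(i32::MAX) + BigInt::from(1);
    println!("{}", sum); // 2147483648 -- what the programmer actually meant

    // Unsigned case: wraparound is defined behavior, so nothing flags it.
    let wrapped = 3u32.wrapping_sub(5);
    println!("{}", wrapped); // 4294967294, silently

    // And because callers may now depend on that mod-2^32 result, swapping
    // in BigUint is not a drop-in fix: the same subtraction has no value.
    // let oops = BigUint::from(3u32) - BigUint::from(5u32); // would panic
    let _ = BigUint::from(3u32); // construct one so the import is exercised
}
```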

Having said that, using BigInt as the default in a language makes plenty of 
sense. Too bad we cannot have that because of "performance".
