When I first looked into Swift, I noticed that the smallest integer types were called "UInt8" (and "Int8") and not something like "Byte." I know modern computers have followed the bog-standard 8/16/32(/64) architecture for decades, but why hard-code it into the language/library? Why should 36-bit processors with 9-bit bytes, or processors whose smallest type starts at 16 bits, be excluded right off the bat? Did you guys see a problem with how (Objective-)C(++) had to define its base types in a mushy way to accommodate the possibility of non-octet bytes?
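To illustrate the contrast I mean, here is a minimal sketch (assuming nothing beyond the standard integer types):

    // Swift's integer types carry their width in the name; C's
    // char/short/int/long widths are implementation-defined and only
    // pinned down by <stdint.h> on platforms that can provide them.
    let a: Int8 = -128      // exactly 8 bits, two's complement
    let b: UInt16 = 0xFFFF  // exactly 16 bits
    let c: Int = 0          // word-sized: 32 or 64 bits, depending on platform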
BTW, is there an equivalent of CHAR_BIT, i.e. the number of bits per byte, in the library? Or are we supposed to hard-code an "8"?
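For what it's worth, the closest spellings I can find are below; this is a sketch assuming the standard library's FixedWidthInteger and MemoryLayout APIs, and both bake the octet assumption right in:

    // Every fixed-width integer type reports its own bit count.
    let bitsPerByte = UInt8.bitWidth            // 8

    // MemoryLayout measures size in 8-bit bytes, so a CHAR_BIT
    // analogue built on it is effectively hard-coded to 8 anyway.
    let charBit = MemoryLayout<CChar>.size * 8  // 8

Neither of these answers the question for a hypothetical 9-bit platform, which is exactly why I'm asking.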
-- 
Daryle Walker
Mac, Internet, and Video Game Junkie
darylew AT mac DOT com
