I'm not quite sure what you mean. Swift has a type called Int8 that represents numbers from -128 to 127 using 8 bits. I don't see how this "excludes" computers.
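To make that concrete, here's a minimal sketch. `Int8.min`/`Int8.max` and `MemoryLayout` are in the standard library today; note that `bitWidth` (the closest analogue of `CHAR_BIT`) is part of the revised integer protocols (SE-0104), so on an older toolchain you'd fall back to `MemoryLayout<Int8>.size * 8`:

```swift
// Int8 and UInt8 are fixed at exactly 8 bits on every platform Swift
// supports, unlike C's char, whose width (CHAR_BIT) is only guaranteed >= 8.
print(Int8.min, Int8.max)        // -128 127
print(MemoryLayout<Int8>.size)   // 1 (size in bytes)
print(Int8.bitWidth)             // 8 -- closest analogue of CHAR_BIT (SE-0104)
```

So rather than leaving the widths "mushy" the way C does, Swift's fixed-width types state their size in the name, and `Int`/`UInt` cover the platform's native word size.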
On Fri, Jun 17, 2016 at 13:01 Daryle Walker via swift-evolution <swift-evolution@swift.org> wrote:

> When I first looked into Swift, I noticed that the base type was called
> “UInt8” (and “Int8”) and not something like “Byte.” I know modern
> computers have followed the bog-standard 8/16/32(/64) architecture for
> decades, but why hard-code it into the language/library? Why should 36-bit
> processors with 9-bit bytes, or processors that start at 16 bits, be
> excluded right off the bat? Did you guys see a problem with how
> (Objective-)C(++) had to define its base types in a mushy way to
> accommodate the possibility of non-octet bytes?
>
> BTW, is there an equivalent of CHAR_BIT, the number of bits per byte, in
> the library? Or are we supposed to hard-code an “8”?
>
> —
> Daryle Walker
> Mac, Internet, and Video Game Junkie
> darylew AT mac DOT com
>
> _______________________________________________
> swift-evolution mailing list
> swift-evolution@swift.org
> https://lists.swift.org/mailman/listinfo/swift-evolution

--
-Saagar Jha