You raise an interesting point.  To explore this further: we could certainly 
lower much of this to the appropriate integer-width arithmetic in LLVM.  I 
suspect the limitations of the standard library implementation you bring up 
exist because "nonstandard" types such as these don't come up when bridging 
C and ObjC, so generalizing over the entire space of integer widths hasn't 
been as high a priority.  Doing so would also seem to require the ability to 
use, say, integer literals in generics, as C++ allows.

As for the char size issue, we define both sizeof and a platform-dependent 
CChar typealias that you can measure against.
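For instance (a minimal sketch, not part of the original mail; it uses the 
later MemoryLayout and bitWidth spellings rather than the 2016-era sizeof 
function mentioned above):

    // Measure the platform character type instead of hard-coding 8.
    // CChar is the platform-dependent typealias mentioned above.
    let bytesPerCChar = MemoryLayout<CChar>.size   // 1, by definition
    let bitsPerCChar  = CChar.bitWidth             // closest analogue of CHAR_BIT
    let intWidthInCChars = MemoryLayout<Int>.size / bytesPerCChar
    print("CChar is \(bitsPerCChar) bits; Int spans \(intWidthInCChars) CChars")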

~Robert Widmann

On 2016/06/17 at 13:01, Daryle Walker via swift-evolution <[email protected]> 
wrote:

> When I first looked into Swift, I noticed that the base type was called 
> “UInt8” (and “Int8”) and not something like “Byte.”  I know modern computers 
> have followed the bog standard 8/16/32(/64) architecture for decades, but why 
> hard code it into the language/library?  Why should 36-bit processors with 
> 9-bit bytes, or processors that start at 16 bits, be excluded right off the 
> bat?  Did you guys see a problem with how (Objective-)C(++) had to define its 
> base types in a mushy way to accommodate the possibility of non-octet bytes?
> 
> BTW, is there an equivalent of CHAR_BIT, the number of bits per byte, in the 
> library?  Or are we supposed to hard code an “8”?
> 
> ― 
> Daryle Walker
> Mac, Internet, and Video Game Junkie
> darylew AT mac DOT com 
> 
_______________________________________________
swift-evolution mailing list
[email protected]
https://lists.swift.org/mailman/listinfo/swift-evolution
