> On Jun 19, 2016, at 1:04 AM, Chris Lattner wrote:
>
>> On Jun 17, 2016, at 1:01 PM, Daryle Walker via swift-evolution
>> wrote:
>>
>> When I first looked into Swift, I noticed that the base type was called
>> “UInt8” (and “Int8”) and not something like “Byte.” I know modern computers
>> have followed the bog standard […]
Old old old architectures. We're talking Multics days.
~Robert Widmann
On 2016/06/17, at 21:35, David Sweeris via swift-evolution wrote:

> IIRC, a bunch of Ye Olde systems used 6-bit bytes. And I think 36-bit ints
> were used in a few architectures, but don't quote me on that.
IIRC, a bunch of Ye Olde systems used 6-bit bytes. And I think 36-bit ints were
used in a few architectures, but don't quote me on that.
- Dave Sweeris
> On Jun 17, 2016, at 22:48, Félix Cloutier via swift-evolution
> wrote:
>
> Out of curiosity, can you name an architecture that doesn't use 8-bit
> bytes?
Out of curiosity, can you name an architecture that doesn't use 8-bit bytes?
Félix
> On June 17, 2016, at 13:01:33, Daryle Walker via swift-evolution
> wrote:
>
> When I first looked into Swift, I noticed that the base type was called
> “UInt8” (and “Int8”) and not something like “Byte.” […]
You raise an interesting point. To explore this further: we could certainly
lower a lot of it to the appropriate integer-width arithmetic in LLVM. I
suspect the standard-library limitations you bring up exist because
"nonstandard" types such as these don't show up […]
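To make the lowering idea concrete, here is a rough sketch of how 36-bit wrapping arithmetic can be emulated with ordinary 64-bit operations plus a mask, which is essentially what a backend would emit for a nonstandard-width integer like LLVM's `i36`. (`UInt36` is a hypothetical name invented for this example, not a real Swift or compiler type.)

```swift
// Hypothetical 36-bit unsigned integer, emulated on top of UInt64.
// All arithmetic is done at 64 bits and truncated back to 36 bits,
// mirroring how a nonstandard width lowers to masked native operations.
struct UInt36 {
    static let mask: UInt64 = (1 << 36) - 1   // 0xF_FFFF_FFFF
    var raw: UInt64                           // invariant: raw == raw & mask

    init(_ value: UInt64) { raw = value & UInt36.mask }

    static func + (a: UInt36, b: UInt36) -> UInt36 {
        UInt36(a.raw &+ b.raw)                // wraps at 2^36, not 2^64
    }
}

let big = UInt36(UInt36.mask)                 // largest 36-bit value
print((big + UInt36(1)).raw)                  // 0 (wraps around at 2^36)
```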
I'm not quite sure what you mean. Swift has a type called Int8 that
represents numbers from -128 to 127 using 8 bits. I don't see how this
"excludes" computers.
On Fri, Jun 17, 2016 at 13:01 Daryle Walker via swift-evolution <
swift-evolution@swift.org> wrote:
> When I first looked into Swift, I […]
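As a quick sanity check on those bounds, a minimal snippet runnable in a Swift playground or the `swift` REPL:

```swift
// Int8 is an explicit 8-bit signed integer: one byte of storage,
// covering exactly -128...127 (two's complement).
print(Int8.min, Int8.max)          // prints "-128 127"
print(MemoryLayout<Int8>.size)     // prints "1" (byte)
```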
When I first looked into Swift, I noticed that the base type was called “UInt8”
(and “Int8”) and not something like “Byte.” I know modern computers have
followed the bog-standard 8/16/32(/64) architecture for decades, but why
hard-code it into the language/library? Why should 36-bit machines be excluded?
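Worth noting, as a sketch rather than a design proposal: explicit-width names do not rule out width-agnostic code. In today's Swift, algorithms can be written against the `FixedWidthInteger` protocol (standardized after this thread, in SE-0104), and the word-sized `Int` reports its width rather than baking one in:

```swift
// Width-agnostic code: works for any fixed-width integer type,
// whatever its bit width is on the host.
func hexWidth<T: FixedWidthInteger>(_ type: T.Type) -> Int {
    type.bitWidth / 4    // hex digits needed to print a full value
}

print(hexWidth(Int8.self))    // prints "2"
print(hexWidth(Int64.self))   // prints "16"
print(Int.bitWidth)           // 64 on today's common 64-bit platforms
```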