Types where it makes sense, or types for which such semantics would be a good
idea? Because, for example, you could do something like this:
struct HTMLParser : IntegerLiteralConvertible {
    init(integerLiteral value: IntegerLiteralType) {
        htmlMajorVersion = value
        htmlMinorVersion = 0
    }
}
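Filled out as a self-contained sketch (the stored property names are hypothetical; the protocol is spelled ExpressibleByIntegerLiteral as of Swift 3, IntegerLiteralConvertible in Swift 2.x):

```swift
// Sketch only: hypothetical version properties; modern protocol spelling.
struct HTMLParser: ExpressibleByIntegerLiteral {
    var htmlMajorVersion: Int
    var htmlMinorVersion: Int

    init(integerLiteral value: Int) {
        htmlMajorVersion = value
        htmlMinorVersion = 0
    }
}

let parser: HTMLParser = 5   // an HTML 5.0 parser
```

Here `1 + 1` has no sensible meaning for `HTMLParser`, even though the integer literal does.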
Back in the realm of math, I don’t think Sedenions — a 16-dimensional (in the
sense that complex numbers are 2-dimensional) number system — have a
well-defined division operator.
As a more likely example, I don’t think it’d be too much of a stretch to attach
integer literal semantics to matrices:
let x: Matrix = 1 // Sets diagonal to 1
Matrices don’t have a division operator, and you can’t apply any of the
`Arithmetic` operations to two matrices without first checking that their
dimensions match.
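To make that concrete, here’s one way such a literal conversion could look — a sketch only, with a hypothetical fixed-size `Matrix` type where an integer literal scales the identity (using the Swift 3 spelling ExpressibleByIntegerLiteral):

```swift
// Hypothetical 3×3 matrix; an integer literal sets the diagonal.
struct Matrix: ExpressibleByIntegerLiteral {
    let rows = 3, columns = 3
    var elements: [Int]   // row-major storage

    init(integerLiteral value: Int) {
        elements = Array(repeating: 0, count: 9)
        for i in 0..<rows {
            elements[i * columns + i] = value
        }
    }
}

let x: Matrix = 1   // 3×3 identity
```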
Plus, inherently-dimensioned matrix types:
var x = Matrix<_2,_3>() // "_2" and "_3" are dummy types
can’t implement `*` unless their two dimensions happen to be equal —
"Matrix<_2,_3>() * Matrix<_2,_3>()" doesn’t have a valid definition.
- Dave Sweeris
> On Jun 29, 2016, at 7:49 AM, Steve Canon via swift-evolution
> <[email protected]> wrote:
>
> Semi-serious question for integer literals in particular: do we need a
> separate protocol at all? Are there types we want to support where integer
> literals should be supported, but + doesn't make sense? Where 1+1 actually
> isn't 2?
>
> If not, are integer literals really just part of Arithmetic?
>
> - Steve
_______________________________________________
swift-evolution mailing list
[email protected]
https://lists.swift.org/mailman/listinfo/swift-evolution