> On Feb 23, 2017, at 07:33, plx via swift-evolution
> <[email protected]> wrote:
>
> Next quibble: `Magnitude` being defined as the “absolute value” of the
> underlying type seems a bit nonsensical if, e.g., `Number` is intended to be
> adoptable by complex numbers (let alone quaternions, matrices, and whatnot).
> What *would* a conforming complex-number implementation be expected to
> return for, e.g., `(1-i).magnitude`?
sqrt(2), I believe, or as close to it as can be represented by `Magnitude`.
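For concreteness, here's a rough Swift sketch of what I mean (the `Complex`
type below is made up purely for illustration; it's not from the proposal):

```swift
// Minimal illustrative Complex type: |a + bi| = sqrt(a*a + b*b),
// so (1 - i).magnitude comes out to sqrt(2).
struct Complex {
    var real: Double
    var imaginary: Double

    // Euclidean magnitude (absolute value) of the complex number.
    var magnitude: Double {
        return (real * real + imaginary * imaginary).squareRoot()
    }
}

let z = Complex(real: 1, imaginary: -1)
print(z.magnitude) // 1.4142135623730951, i.e. sqrt(2)
```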
Speaking of which, in my own numerical code, `Magnitude` is typealiased to
`Double`, specifically because for complex numbers/quaternions/etc. the
magnitude is almost never reasonably approximated by an integer (at least for
small values). What about having a "BuiltinScalar" protocol and pinning the
`Magnitude` requirement there, so that the non-scalar types we write can
define it as `Double` or some other appropriate real-valued type?
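Sketching what I have in mind (every name here, including `BuiltinScalar` and
this simplified `Number`, is hypothetical, not the proposal's actual design):

```swift
// Simplified stand-in for the proposed protocol.
protocol Number {
    associatedtype Magnitude
    var magnitude: Magnitude { get }
}

// Built-in scalars would pin `Magnitude` down here (to Self, or to an
// unsigned counterpart)...
protocol BuiltinScalar: Number where Magnitude == Self {}

// The standard library's Double already has a `magnitude` property,
// so the conformance comes for free.
extension Double: BuiltinScalar {}

// ...while non-scalar types stay free to pick a real-valued Magnitude:
struct Quaternion: Number {
    var w, x, y, z: Double
    var magnitude: Double {
        return (w*w + x*x + y*y + z*z).squareRoot()
    }
}
```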
> PS: I suspect this ship has *long* sailed, but it would probably be better
> in the long run to go the route of `Addable`, `Multipliable`, etc., and use
> typealiases to define common groupings thereof; note that once you remove
> `Magnitude`, the only difference between:
>
> - `Number` (as defined)
> - `typealias Number = Equatable & ExpressibleByIntegerLiteral & Addable &
> Multipliable`
>
> …would be `init?<T : BinaryInteger>(exactly source: T)`.
That's come up before, and IIRC, the decision was that it wasn't necessary for
reasons that I no longer recall off the top of my head. Thanks for reminding me
about it, though, as I've been meaning to go back and make sure that the
reasoning still holds. (I'm sure it does, but part of the point of reviews is
to have the community double-check the work to make sure we didn't miss
anything, right?)
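For the record, here's roughly what that decomposition would look like (the
`Addable`/`Multipliable` names are from the quoted message; the rest is just
an illustrative sketch, not anything from the proposal):

```swift
// Small single-operator protocols...
protocol Addable {
    static func + (lhs: Self, rhs: Self) -> Self
}
protocol Multipliable {
    static func * (lhs: Self, rhs: Self) -> Self
}

// ...composed into the larger grouping with a typealias:
typealias Number = Equatable & ExpressibleByIntegerLiteral & Addable & Multipliable

// Standard-library types pick up the conformances for free,
// since the operators already exist:
extension Int: Addable, Multipliable {}

// The composition works as a generic constraint:
func square<T: Number>(_ x: T) -> T {
    return x * x
}
```

As noted above, the one requirement a typealias can't carry is
`init?<T : BinaryInteger>(exactly source: T)`, which would still need a
protocol of its own.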
- Dave Sweeris