I'm not sure I understand the distinction you make. You declare a typeclass
by declaring the functions a type must implement to qualify for it, along
with any default implementations, e.g.

class  Eq a  where
  (==), (/=)            :: a -> a -> Bool
  x /= y                =  not (x == y)

The typeclass 'Eq a' requires implementations of two functions, (==) and
(/=), of type a -> a -> Bool (which would look like (a,a) --> Bool in the
proposed Julia function type syntax). The (/=) function has a default
implementation in terms of (==), so an instance only has to supply (==),
though you could define your own (/=) for your own type as well.
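
To make your own type an instance, you only fill in the functions that lack
defaults. A minimal sketch (the Color type here is just a made-up example):

data Color = Red | Green | Blue

instance Eq Color where
  Red   == Red   = True
  Green == Green = True
  Blue  == Blue  = True
  _     == _     = False
  -- (/=) is not defined here, so it falls back to the
  -- default: x /= y = not (x == y)

As for listing what a class provides: in GHCi, :info Eq (or :info on any
class) prints the class's function signatures together with its known
instances, which I think is the closest thing to the query you're describing.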


*Sebastian Good*


On Fri, Nov 21, 2014 at 2:11 PM, Mauro <[email protected]> wrote:

> Sebastian, in Haskell, is there a way to get all functions which are
> constrained by one or several type classes?  I.e. which functions are
> provided by a type-class?  (as opposed to which functions need to be
> implemented to belong to a type-class)
>
> On Fri, 2014-11-21 at 16:54, Jiahao Chen <[email protected]> wrote:
> >> If instead I want to say "this new type acts like an Integer", there's no
> >> canonical place for me to find out what all the functions are I need to
> >> implement.
> >
> > The closest thing we have now is methodswith(Integer)
> > and methodswith(Integer, true) (the latter also gives all the methods
> > that Integer inherits from its supertypes).
> >
> > Thanks,
> >
> > Jiahao Chen
> > Staff Research Scientist
> > MIT Computer Science and Artificial Intelligence Laboratory
> >
> > On Fri, Nov 21, 2014 at 9:54 AM, Sebastian Good <
> > [email protected]> wrote:
> >
> >> I will look into Traits.jl -- interesting package.
> >>
> >> To get traction and some of the great power of composability, the base
> >> library will need to be carefully decomposed into traits, which (as noted
> >> in some of the issue conversations on github) takes you straight to the
> >> great research Haskell is doing in this area.
> >>
> >> *Sebastian Good*
> >>
> >>
> >> On Fri, Nov 21, 2014 at 9:38 AM, John Myles White <
> >> [email protected]> wrote:
> >>
> >>> This sounds a bit like a mix of two problems:
> >>>
> >>> (1) A lack of interfaces:
> >>>
> >>>  - a) A lack of formal interfaces, which will hopefully be addressed by
> >>> something like Traits.jl at some point. (
> >>> https://github.com/JuliaLang/julia/issues/6975)
> >>>
> >>>  - b) A lack of documentation for informal interfaces, such as the
> >>> methods that AbstractArray objects must implement.
> >>>
> >>> (2) A lack of delegation when you make wrapper types:
> >>> https://github.com/JuliaLang/julia/pull/3292
> >>>
> >>> The first has moved forward a bunch thanks to Mauro's work. The second
> >>> has not gotten much further, although Kevin Squire wrote a different
> >>> delegate macro that's noticeably better than the draft I wrote.
> >>>
> >>>  -- John
> >>>
> >>> On Nov 21, 2014, at 2:31 PM, Sebastian Good <
> >>> [email protected]> wrote:
> >>>
> >>> In implementing new kinds of numbers, I've found it difficult to know
> >>> just how many functions I need to implement for the general library to
> >>> "just work" on them. Take as an example a byte-swapped, e.g.
> big-endian,
> >>> integer. This is handy when doing memory-mapped I/O on a file with data
> >>> written in network order. It would be nice to just implement, say,
> >>> Int32BigEndian and have it act like a real number. (Then I could just
> >>> reinterpret a mmaped array and work directly off it) In general, we'd
> >>> convert to Int32 at the earliest opportunity we had. For instance the
> >>> following macro introduces a new type which claims to be derived from
> >>> $base_type, and implements conversions and promotion rules to get it into
> >>> a native form ($n_type) whenever it's used.
> >>>
> >>> macro encoded_bitstype(name, base_type, bits_type, n_type, to_n, from_n)
> >>>     quote
> >>>         immutable $name <: $base_type
> >>>             bits::$bits_type
> >>>         end
> >>>
> >>>         Base.bits(x::$name) = bits(x.bits)
> >>>         Base.bswap(x::$name) = $name(bswap(x.bits))
> >>>
> >>>         Base.convert(::Type{$n_type}, x::$name) = $to_n(x.bits)
> >>>         Base.convert(::Type{$name}, x::$n_type) = $name($from_n(x))
> >>>         Base.promote_rule(::Type{$name}, ::Type{$n_type}) = $n_type
> >>>         Base.promote_rule(::Type{$name}, ::Type{$base_type}) = $n_type
> >>>     end
> >>> end
> >>>
> >>> I can use it like this
> >>>
> >>> @encoded_bitstype(Int32BigEndian, Signed, Int32, Int32, bswap, bswap)
> >>>
> >>> But unfortunately, it doesn't work out of the box because the conversions
> >>> need to be explicit. I noticed that many of the math functions promote
> >>> their arguments to a common type, but the following trick doesn't work,
> >>> presumably because the promotion algorithm doesn't ask to promote types
> >>> that are already identical.
> >>>
> >>>         Base.promote_rule(::Type{$name}, ::Type{$name}) = $n_type
> >>>
> >>> It seems like there are a couple of issues this raises, and I know I've
> >>> seen similar questions on this list as people implement new kinds of
> >>> things, e.g. exotic matrices.
> >>>
> >>> 1. One possibility would be to allow an implicit promotion, perhaps
> >>> expressed as the self-promotion above. I say I'm an Int32BigEndian, or
> >>> CompressedVector, or what have you, and provide a way to turn me into an
> >>> Int32 or Vector implicitly to take advantage of all the functions already
> >>> written on those types. I'm not sure this is a great option for the
> >>> language since it's been explicitly avoided elsewhere, but I'm curious if
> >>> there have been any discussions in this direction.
> >>>
> >>> 2. If instead I want to say "this new type acts like an Integer", there's
> >>> no canonical place for me to find out what all the functions are I need to
> >>> implement. Ultimately, these are like Haskell's typeclasses, Ord, Eq, etc.
> >>> By trial and error, we can determine many of them and implement them this
> >>> way:
> >>>
> >>> macro as_number(name, n_type)
> >>>     quote
> >>>         global +(x::$name, y::$name) = +(convert($n_type, x), convert($n_type, y))
> >>>         global *(x::$name, y::$name) = *(convert($n_type, x), convert($n_type, y))
> >>>         global -(x::$name, y::$name) = -(convert($n_type, x), convert($n_type, y))
> >>>         global -(x::$name) = -convert($n_type, x)
> >>>         global /(x::$name, y::$name) = /(convert($n_type, x), convert($n_type, y))
> >>>         global ^(x::$name, y::$name) = ^(convert($n_type, x), convert($n_type, y))
> >>>         global ==(x::$name, y::$name) = (==)(convert($n_type, x), convert($n_type, y))
> >>>         global <(x::$name, y::$name) = (<)(convert($n_type, x), convert($n_type, y))
> >>>         Base.flipsign(x::$name, y::$name) = Base.flipsign(convert($n_type, x), convert($n_type, y))
> >>>     end
> >>> end
> >>>
> >>> But I don't know if I've found them all, and my guesses may well change
> >>> as implementation details inside the base library change. Gradual typing
> >>> is great, but with such a powerful base library already in place, it would
> >>> be good to have a facility to know which functions are associated with
> >>> which named behaviors.
> >>>
> >>> Since we already have abstract classes in place, e.g. Signed, Number,
> >>> etc., it would be natural to extract a list of functions which operate on
> >>> them, or, even better, allow the type declarer to specify which functions
> >>> *should* operate on that abstract class, typeclass or interface style?
> >>>
> >>> Are there any recommendations in place, or updates to the language
> >>> planned, to address these sorts of topics?
> >>>
> >>>
> >>>
> >>>
> >>>
> >>>
> >>>
> >>
>
>
