Hi Julia users,

As this is my first post, let me say I'm extremely impressed by Julia, 
which I met two weeks ago. For many months I had meant to clean up a 
C++/Python library of mine for public consumption. I couldn't help but 
procrastinate, as the maintenance cost was such a high tax (e.g. 
too-clever hacks; long compilation times, mainly due to having to 
instantiate an explosion of templates non-lazily to make them accessible 
from within Python; the latest version of gcc no longer accepting my 
code, probably for valid reasons; etc.). So big thanks to all the Julia 
developers for making my problem disappear and freeing me from C++. I 
gave in after checking for myself the hard-to-believe efficiency promise 
(speed within 120% of C) on the non-toy DES crypto algorithm. I'm very 
grateful for this awesome, beautiful, fast, fun language that I didn't 
dare to dream of.

The problem of inlining higher-order functions (via callable types, 
a.k.a. functors) is getting a lot of attention, but I will need fast 
solutions soon and don't want to give up the right to cost-free 
abstractions (as offered by C++). So should we hold our breath for 
built-in functors, or is it still worth investigating library solutions?

I found NumericFunctors/NumericFuns (and base/reduce.jl), both based on 
`abstract Functor{N}`; are there others built on 
`Functor{Result, Arg1, ..., ArgN}`? (A sketch of what I mean by the 
latter is below.)
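To make that concrete, here is a hypothetical sketch (my own naming, not 
taken from any existing package) of what such a richer parameterization 
could look like:

abstract TypedFunctor{R, A1, A2}   # result type and argument types as parameters

type int_plus <: TypedFunctor{Int, Int, Int} end
int_plus(x::Int, y::Int) = x + y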

I wanted to share a tiny trick that occurred to me today and doesn't 
seem to be widely known: it unleashes stateless functors by exploiting 
the "dual nature" of type constructors (a type is a first-class value, 
yet it is also callable through its constructor methods). Please let me 
know if it relies on undefined behavior.

abstract Functor{N}                # N is the arity
typealias UnaryFunctor Functor{1}
typealias BinaryFunctor Functor{2}

# The type itself plays the role of the functor; plain constructor
# methods do the actual work.
type double <: UnaryFunctor end
double(x) = 2*x

type plus <: BinaryFunctor end
plus(x, y) = x + y
plus(x::String, y::String) = x*y # string concatenation ;-)

# Composition at the type level: the inner constructor forwards the call.
type compose{Out<:Functor, In<:Functor} <: Functor
    compose(x...) = Out(In(x...))
    # compose(x...) = Out(In(x...)...) # more general but awfully slow
end

doubleplus = compose{double, plus}

# constrained function:
f{F<:BinaryFunctor}(fun::Type{F}, x) = fun(x, x)
f(plus, 1)

# etc.
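A few sanity checks, worked out by hand from the definitions above:

@assert double(3) == 6
@assert plus(1, 2) == 3
@assert plus("foo", "bar") == "foobar"  # via string concatenation
@assert doubleplus(1, 2) == 6           # double(plus(1, 2))
@assert f(plus, 1) == 2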

This can't serve as a drop-in replacement for normal functions, as many 
APIs hardcode `Function` in their type signatures.
I have only barely tested the performance. In the benchmark code at [1], 
statement (1) is 10 times slower than statement (2) on my machine, but 
if `plus` is replaced by the definition above, it is only about 20% 
slower, and it unsurprisingly becomes faster still for `doubleplus`. 
However, for e.g. `reduce(plus, 0, 1:n)` I observed no gain (compared to 
plus(x, y) = x + y; note that reduce(+, ...) is special-cased and 
optimized). Any ideas why?
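For anyone who wants to reproduce that, a minimal timing sketch (the 
choice of n is arbitrary):

n = 10^7
reduce(plus, 0, 1:n)        # warm up the functor-type version
reduce(+, 0, 1:n)           # warm up the special-cased built-in
@time reduce(plus, 0, 1:n)
@time reduce(+, 0, 1:n)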

Note that the native code of `doubleplus` "seems" to be more optimized 
with this finer-grained definition of compose (though it improves the 
benchmark only slightly):

type compose{Out<:Functor, In<:Functor} <: Functor
    # Define a fixed-arity constructor when the arity of In is known,
    # keeping the splatted version only as a fallback.
    if In <: UnaryFunctor
        compose(x) = Out(In(x))
    elseif In <: BinaryFunctor
        compose(x, y) = Out(In(x, y))
    else
        compose(x...) = Out(In(x...))
    end
end
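For the curious, I inspected the generated code through a thin wrapper, 
since `code_native` expects a generic function rather than a type (at 
least on my version):

g(x, y) = doubleplus(x, y)   # thin wrapper around the type constructor
code_native(g, (Int, Int))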

On a related note, my last question: is there a variation of the following 
definition that would compile?

arity{N, F<:Functor{N}}(::Type{F}) = N
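For what it's worth, the closest workaround I can think of dispatches on 
the exact `Functor{N}` supertype (an untested sketch; it would not help 
for `compose`, which subtypes the bare `Functor`):

arity{N}(::Type{Functor{N}}) = N                 # exact match on Functor{N}
arity{F<:Functor}(::Type{F}) = arity(super(F))   # climb to the Functor{N} supertype
# arity(double) should give 1, arity(plus) should give 2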

Thanks,
Rafael

[1] https://github.com/timholy/NumericFunctors.jl; the benchmark code is below:
plus(x, y) = x + y
map_plus(x, y) = map(plus, x, y)

a = rand(1000, 1000)
b = rand(1000, 1000)

# warm up, so that map_plus gets compiled before timing
a + b
map_plus(a, b)

# benchmark
@time for i in 1:10 map_plus(a, b) end  # -- statement (1)
@time for i in 1:10 a + b end           # -- statement (2)
