Re: [julia-users] inner constructor returning an object of a different type

2016-06-11 Thread Rafael Fourquet
In the pre-jb/functions era, when higher-order functions were
suboptimal (because function calls would not specialize on them),
there was a trick using the feature you noticed to overcome this
limitation, i.e. make it fast. Cf. the discussion at
https://groups.google.com/d/topic/julia-users/qscRyNqRrB4/discussion.
Note that the FastAnonymous package by Tim Holy was using this trick.


Re: [julia-users] Re: Uniform syntax

2016-06-05 Thread Rafael Fourquet
I think the OP's question is not about the difference between a macro
and a function, but rather about the syntactic way of defining a
macro: if a macro can be seen as a function taking an expression and
returning another one, why can't we just define a macro with the
standard function syntax (i.e. without the macro keyword), and inform
the compiler that we want to use it as a macro only at the call site,
by invoking it prepended with the @ sign?
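
One can get close to that picture today (a sketch of mine, not an
existing feature proposal): write the transformation as an ordinary
Expr -> Expr function, plus a one-line macro that forwards to it; only
the forwarding stub needs the macro keyword:

# the transformation itself: a plain function from Expr to Expr
twice_impl(ex) = :($ex; $ex)

# a thin macro is still needed to trigger expansion at the call site
macro twice(ex)
    esc(twice_impl(ex))
end

@twice println("hi")   # prints "hi" twice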


Re: [julia-users] Lack of an explicit return in Julia, heartache or happiness?

2016-05-26 Thread Rafael Fourquet
And indeed, as Scott points out, a function can switch from the short
form to the long form merely because the number of characters grows a
bit, which is uncorrelated to its functionalness. Having the short
form and the long form disagree on the "default" returned value would
increase the risk of introducing a bug each time a conversion to the
long form is made.
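
For instance (my own minimal example), the mechanical conversion looks
like this, and relies on both forms returning the last expression:

f(x) = x + 1      # short form

function f(x)     # long form: x + 1 is still the returned value
    x + 1
end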


Re: [julia-users] Lack of an explicit return in Julia, heartache or happiness?

2016-05-26 Thread Rafael Fourquet
I would be sorry to see this so elegant functional aspect of Julia
return to the void. It fits well with the
expression-rather-than-statement feel of Julia.
Regarding the annoyance of having to explicitly "return nothing" for
"procedures" (functions called only for their side-effects): I would
find it annoying to have to explicitly return something in
"functional functions", and I like the explicit "return nothing" to
indicate a procedure.
I don't agree that long form functions are inherently non-functional.
First, as Jeff mentioned, the Lisp languages (considered functional)
have such features (e.g. "begin" in Scheme) to implicitly return the
last expression of a block of mixed imperative/functional code.
This is also true in Haskell do blocks (which implement side-effecting
computations):

greet :: Int -> IO String
greet intensity = do
  name <- getLine
  return $ "Hello " ++ name ++ (take intensity $ repeat '!')

haskell> greet 3
Julia                  -- input read by getLine
"Hello Julia!!!"

It should be noted that the return here may be misleading: it's a
standard function which computes a value, which is then *implicitly*
returned to the caller of greet because it's the last expression of
the do block.

Second, a whole bunch of functions are "pure" (non-mutating) and can
be seen as implicit let forms:

function f(x)
    y = 2*x
    z = x^2
    x + y - z
end

which corresponds e.g. in Haskell to

f x = let
  y = 2*x
  z = x*x
in x+y-z

I don't like to argue too much, but the status-quo camp is
under-represented in this thread!

I genuinely don't understand how annotating the function as ::Void or
::Nothing is preferable to "return nothing". And why not a new keyword
"procedure" then?


Re: [julia-users] How to change REPL mode on startup?

2016-05-15 Thread Rafael Fourquet
John, I tried your branch which works as expected, thank you. I found
that there has been a PR at
https://github.com/JuliaLang/julia/pull/13545 related to REPL hooks,
not sure how much this overlaps with your solution. In any case, I
hope to see this functionality merged.


Re: [julia-users] How to change REPL mode on startup?

2016-04-19 Thread Rafael Fourquet
> Again, I don't know if there is any demand for adding a general
> facility for this.

If you have a PR in mind, I would be a client for such a facility.
IIUC, this would e.g. allow me to automatically change the prompt by
calling a function from .juliarc.jl? (Which I do manually now with
"Base.active_repl.interface.modes[1].prompt = ...".)
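
For reference, a hedged sketch of how I'd wrap this in .juliarc.jl
using atreplinit (assuming the 0.4-era Base.REPL API, and that the
interface must be built first at init time; untested):

atreplinit() do repl
    # sketch: the interface does not exist yet when the hook runs,
    # so build it explicitly before touching the prompt
    repl.interface = Base.REPL.setup_interface(repl)
    repl.interface.modes[1].prompt = "my-julia> "
end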


Re: [julia-users] Newbie Question : Averaging neighbours for every element

2016-02-01 Thread Rafael Fourquet
This in-preparation blogpost may be relevant?
https://github.com/JuliaLang/julialang.github.com/pull/324/files


Re: [julia-users] randperm run time is slow

2016-01-23 Thread Rafael Fourquet
The problem, I think, came essentially from the repeated creation of
RangeGenerator objects, cf.
https://github.com/JuliaLang/julia/pull/14772.
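
The gist of the fix, as I understand it (a sketch, not the PR's actual
code): build the range generator once and reuse it for every draw,
instead of recreating it inside the loop:

# sketch using the internal Base.Random API of the time
import Base.Random: RangeGenerator, GLOBAL_RNG

function randints(n, k)
    g = RangeGenerator(1:k)              # created once, outside the loop
    [rand(GLOBAL_RNG, g) for i in 1:n]   # reused for each draw
end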


Re: [julia-users] randperm run time is slow

2016-01-22 Thread Rafael Fourquet
> Let's capture this as a Julia performance issue on github,
> if we can't figure out an easy way to speed this up right away.

I think I remember having identified a potentially sub-optimal
implementation of this function a few weeks back (perhaps no more than
what Tim suggested) and had planned to investigate further (when time
permits...)


Re: [julia-users] Make a Copy of an Immutable with Specific Fields Changed

2014-09-18 Thread Rafael Fourquet
I think the idiomatic way remains to be designed:
https://github.com/JuliaLang/julia/issues/5333.
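
Until then, the only way is to rebuild the immutable by hand, e.g. (my
toy example, with a hypothetical with_x helper):

immutable Point
    x::Float64
    y::Float64
end

# hypothetical helper: copy p with the x field replaced
with_x(p::Point, x) = Point(x, p.y)

p2 = with_x(Point(1.0, 2.0), 3.0)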


Re: [julia-users] Does Julia have something similar to Python's documentation string?

2014-09-15 Thread Rafael Fourquet

> Docile.jl looks great, but I think that the API should be made into
> comments. One of Julia's goals is to have a simple syntax that even people
> who are not acquainted with programming can easily understand.


Python, despite using docstrings, is a great example of a language
having "a simple syntax that ... understand".


> I believe that a tagged comment is much more readable than a block
> introduced by @doc or doc.


"Much more readable" is maybe a bit exaggerated; can you explain why
you believe so?


Re: [julia-users] Does Julia have something similar to Python's documentation string?

2014-09-13 Thread Rafael Fourquet
> To me the only difference is that I really don't want to write
>
> @doc """
> commentary
> """
> function ...
>
> whereas I already write things along the lines of
>
> # commentary
> function ...

doc """
commentary
"""
function ...

is already better, and then let's get rid of even the doc keyword. It
would also be less of a breaking change: comments are currently
written mainly for developer consumption, not for documenting a public
API, and they would all need to be fixed at once. As both developer
comments and API documentation are needed, I find it useful to have
two distinct means: comments and strings.


Re: [julia-users] Copy a BigFloat?

2014-09-13 Thread Rafael Fourquet
BigFloats are indeed immutable in spirit, but not really:

julia> BigFloat.mutable == !isimmutable(big(0.1))
true

So copy is a no-op (it returns its argument, as for every number), and
deepcopy returns a new instance (deepcopy(a) === a is false), but then
there is no public API to mutate a or its copy. So there is no use for
deepcopy on BigFloats in client code; they behave like Float64.
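
A quick REPL illustration of those two behaviors:

julia> a = big(0.1); copy(a) === a
true

julia> deepcopy(a) === a
false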


Re: [julia-users] why sum(abs(A)) is very slow

2014-08-23 Thread Rafael Fourquet

> There's a complicated limit to when you want to fuse loops – at some point
> multiple iterations becomes better than fused loops and it all depends on
> how much and what kind of work you're doing. In general doing things lazily
> does not cut down on allocation since you have to allocate the
> representation of the operations that you're deferring and close over any
> values that they depend on.
>
> This particular example only works out so well because the iterable is so
> simple that the compiler can eliminate the laziness and do the eager loop
> fused version for you. This will not generally be the case.


Thank you for taking so much time to explain and for your patience!


> You're welcome to experiment (and Julia's type system makes it pretty easy
> to do so), but I think that you'll quickly find that more laziness is not a
> panacea for performance problems.


My question came partly from Python 3, where map is lazy. But having
the choice is good, and in Julia all laziness can already be provided
by imap etc. If someone has a (self-contained) example where lazy
element-wise computation is worse than eager, please post! (I'm
interested in better understanding the limit mentioned above.)
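
To make "having the choice" concrete, a sketch using the package's
imap (assuming Iterators.jl is installed):

using Iterators       # provides a lazy imap

A = randn(10^6)
sum(abs(A))           # eager: allocates the abs(A) temporary
sum(imap(abs, A))     # lazy: one pass, no temporary array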


Re: [julia-users] why sum(abs(A)) is very slow

2014-08-22 Thread Rafael Fourquet

> Obviously it would be even nicer not to have to do that :-)


My naive answer is then: why not make vectorized functions lazy by
default (like iabs above, plus dimension information)? Do you have
links to relevant discussions?


Re: [julia-users] why sum(abs(A)) is very slow

2014-08-22 Thread Rafael Fourquet
> If that was the way things worked, would sum(abs(A)) do the computation
> right away or just wait until you ask for the result? In other words,
> should sum also be lazy if we're doing all vectorized computations that
> way?


sum(abs(A)) returns a scalar, so lazy would buy nothing here (in most
cases at least; let's not be Haskell!).


> What about sum(abs(A),1)? Lazy or eager?


If dim A > 1, the result is an array, so lazy.
In short, be lazy when it gives an opportunity for loop fusion and
saves allocations.
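
What fusion buys in the sum(abs(A), 1) case, written out by hand (my
sketch):

function sumabs_dim1(A::Matrix)
    out = zeros(1, size(A, 2))
    for j in 1:size(A, 2), i in 1:size(A, 1)
        out[j] += abs(A[i, j])   # fused: no abs(A) temporary is allocated
    end
    out
end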


> What about A*B when A and B are matrices?


I was thinking more of operations done element-wise (of the form
map(f, A1, ...), like abs and +). Optimizing a product A*B is less
trivial (C++ expression templates...), so I prefer not to answer!


Re: [julia-users] why sum(abs(A)) is very slow

2014-08-22 Thread Rafael Fourquet
> Could you please explain why the iterator version is so much faster? Is
> it simply from avoiding temporary array allocation?


That's what I understand, and maybe marginally because there is only one
pass over the data.


Re: [julia-users] Re: We have typed functions, don't we?

2014-08-20 Thread Rafael Fourquet

> 'Traditional' Julia: you can pass a function f as an argument to another
> function g.
>
> Rafael's functors: instead you create a new type F whose constructor is f,
> and then you make g a parametric function with a parameter F instead of an
> argument f.


A typo here: the constructor of type F is F, and that's precisely the
point: F is both a type and a callable.


> And the point of this technique is that you can potentially boost
> performance because the compiler can in-line the implicit call to f
> inherent in g{F}(...), whereas the call to f in g(f,...) cannot be inlined.


Yes (but I think it is planned to make the compiler able to inline
normal functions... maybe by desugaring them to functors ;-) ).
The declaration of g must be like g{F}(::Type{F}, ...) to enable
inlining, and simply g(F, ...) or g(F::DataType, ...) otherwise
(AFAIU, g would then have only one specialization for all
F::DataType, as typeof(F) == DataType).
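
In code, the two declaration styles I mean (sketch):

# specializes on each functor type F, so the call F(x) can be inlined
g1{F}(::Type{F}, xs) = [F(x) for x in xs]

# a single specialization serves all types, since typeof(F) == DataType
g2(F::DataType, xs) = [F(x) for x in xs]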

> Have I got it? Are there other advantages to Rafael's functors (simplified
> code, better error-checking...) compared to 'traditional' Julia?


One application is using the inheritance machinery (type
constraints...), e.g. to take Stefan's example from
https://github.com/JuliaLang/julia/issues/1470:

abstract BinaryOperator

function call{op<:BinaryOperator}(::Type{op}, v::Vector, w::Vector)
    # implement generic vectorized operation in terms of functor op
end
function call{op<:BinaryOperator}(::Type{op}, v::Vector, x)
    # implement generic op(vector, scalar)
end

type plus <: BinaryOperator end
plus(a::Number, b::Number) = a+b
plus(args...) = call(plus, args...) # delegate everything else to supertypes

This is less ideal than what is proposed in issue 1470, as one has to
write the delegating code (the last line above) for each new subtype
of BinaryOperator; however, only one such generic line is necessary
per subtype (and it can be macroized :) ).
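
That macroized version could look like this (hypothetical sketch,
assuming the call overloading of issue 1470):

# hypothetical: generate the delegation line for a given functor type
macro delegate_binop(T)
    :($(esc(T))(args...) = call($(esc(T)), args...))
end

# usage: @delegate_binop plus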


Re: [julia-users] We have typed functions, don't we?

2014-08-19 Thread Rafael Fourquet

> This is a really cool family of tricks. Time for me to start replacing some
> ::Function specifiers in my argument lists...


I saw in Julia Base that `Base.Callable`, an alias for Union(Function,
DataType), is used in argument lists.
I'm starting to consider replacing most normal functions by functors.
I can't see any drawbacks for now, and I just found at
https://github.com/JuliaLang/julia/issues/1470 that "There is actually
no requirement that new be called within a constructor" (Jeff), so
this may be safe too.

Also, in my imap implementation example, the field F is not really
needed for functors (but does not seem to incur overhead); it is used
when extending imap to normal functions, with this (outer)
constructor:

imap{X}(f::Function, x::X) = imap{Function, X}(f, x)

Finally, I tested sum(imap(sinc_plus_x, x)) on a more recent computer,
on which it is 7% slower than the hand-written version (same result
with reduce(plus, 0.0, imap(sinc_plus_x, x)) for a functor-aware
version of reduce).


Re: [julia-users] We have typed functions, don't we?

2014-08-18 Thread Rafael Fourquet
I'm glad to report that the general and beautifully functional
solution sum(imap(sinc_plus_x, x)) is in fact as efficient as the
devectorized hand-written one!

It turns out I had made the mistake of forgetting to forward an
expression like Type{F} to the imap iterator constructor (cf. code
below), making it specialize on DataType instead of sinc_plus_x.
(While I can see why this prevented inlining, I still don't understand
why it caused allocations.)

For the record, here is a possible implementation of imap for functors
with 1 argument:

immutable imap{TF, X}
    F::TF
    x::X
end
imap{F, X}(::Type{F}, x::X) = imap{Type{F}, X}(F, x)

Base.start(i::imap) = start(i.x)
Base.next(i::imap, s) = ((v, s) = next(i.x, s); (i.F(v), s))
Base.done(i::imap, s) = done(i.x, s)
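
Usage, repeating for completeness the sinc_plus_x functor from the
2014-08-15 post below:

type sinc_plus_x end
sinc_plus_x(x) = sin(x)/x + x

x = rand(10^6)
sum(imap(sinc_plus_x, x))   # now as fast as the hand-written loop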


Re: [julia-users] We have typed functions, don't we?

2014-08-17 Thread Rafael Fourquet
> Love it! I get your point much better now. I'll be using it in the future
> :).


:) Yes, I was probably a bit succinct.


> Re Iterators, yes, your guess is spot on---it's those pesky tuples (which
> are way better than they used to be, see
> https://github.com/JuliaLang/julia/pull/4042). In high performance code,
> iterators from Iterators are usually a bottleneck.


OK, thanks. I actually used a simpler imap (which handles only
single-argument functions) than the one in Iterators.jl, getting much
better results, but still 48*length(x) bytes allocated, and roughly
10x slower than sumf.


> The recently-added iterator macros can sidestep that problem in some
> circumstances, but not without internal changes. Perhaps one option would
> be to explore using them more widely in Iterators?


Maybe. For my tests I annotated the functions with the :inline hint
(from base/inference.jl, disabled in master, but it's 4 lines to
uncomment), which is supposed to inline the calls to the imap wrapper
functions, with no gain in this case. So I believe the overhead is
caused only by heap allocations (+ gc), and not by lack of inlining.


Re: [julia-users] We have typed functions, don't we?

2014-08-15 Thread Rafael Fourquet
Oh, and by the way, functions specialized on values can be emulated, e.g.

type plusN{N}
    plusN(x) = x+N
end
plusN{10}(1)

And writing a constrained function can be slightly simpler than in my
previous post:

# constrained function:
f{F<:BinaryFunctor}(::Type{F}, x) = F(x, x)
f(plus, 1)

Or, as a functor:

type f{F<:BinaryFunctor}
    f(x) = F(x, x)
end
f{plus}(1)

I just hope that someone can assert this is a safe abuse of constructors.


Re: [julia-users] We have typed functions, don't we?

2014-08-15 Thread Rafael Fourquet
> Hi Rafael, I recently posted an example of using function types, see:

Thanks Adrian, but I couldn't see how it helps to make a
function specialize on its (higher-order) function parameter (and
possibly inline it).


Re: [julia-users] We have typed functions, don't we?

2014-08-15 Thread Rafael Fourquet
> I've not used NumericFunctors so I can't comment on your main question. If
> it's of any use, there's an entirely different approach (more of a dirty
> trick, really) to inlining functions passed in as arguments. Here's a gist
> that shows the trick:
> https://gist.github.com/timholy/bdcee95f9b7725214d8b


Thanks Tim, it is useful! It makes more explicit the relevance of the
constructor trick:

type sinc_plus_x end
sinc_plus_x(x) = sin(x)/x + x

function sumf{F}(::Type{F}, x::AbstractArray)
    s = 0.0
    for xs in x
        s += F(xs)
    end
    s
end

The above sumf is conceptually exactly the same as the one in your gist
(and so achieves the same performance), only it delegates the
meta-programming stuff to the compiler :-)
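
Usage looks like a normal higher-order call, except that the functor
is passed as a type:

x = rand(10^6)
sumf(sinc_plus_x, x)   # specializes on Type{sinc_plus_x}, so F(xs) inlines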

Unfortunately, I failed to find a more general efficient solution, in
the vein of sum(imap(sinc_plus_x, x)), where imap is an iterator
similar to the one in the Iterators.jl package (or Python's imap):
memory allocations are triggered, for no obvious (to me) reason (my
guess would be tuple allocations in the next() implementation).


Re: [julia-users] Equivalent of c++ functor objects, std::bind, and lambdas

2014-08-15 Thread Rafael Fourquet
If the function copy is implemented for z:

z = ...
newfun = let zz = copy(z); (x, y) -> f(zz, x, y) end

I think I understood that lambdas are less efficient than (generic)
functions, so this may be faster:

let zz = copy(z)
    global newfun
    newfun(x, y) = f(zz, x, y)
end


Re: [julia-users] Equivalent of c++ functor objects, std::bind, and lambdas

2014-08-15 Thread Rafael Fourquet
OK, thanks. I guess the heart of the question is overcoming Julia's
built-in pass-by-reference behavior. I would be fine using an explicit
copy function, but is there any way I can avoid defining a copy
function for all my types? That would be annoying.

OK, it seems that deepcopy corresponds better to C++ copy semantics,
and apparently Julia provides a default implementation of deepcopy for
user types.
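
Indeed, e.g. (a quick sketch):

type Foo
    v::Vector{Int}
end

a = Foo([1, 2])
b = deepcopy(a)   # default deepcopy; nothing was defined for Foo
b.v[1] = 99
a.v[1] == 1       # true: a is unaffected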


[julia-users] We have typed functions, don't we?

2014-08-14 Thread Rafael Fourquet
Hi Julia users,

As this is my first post, I must say I'm extremely impressed by Julia,
which I met 2 weeks ago. For many months I've meant to clean up a
C++/Python lib of mine for public consumption. I couldn't help but
procrastinate, as the maintenance cost is so high a tax (e.g.
too-clever hacks, big compilation times (mainly due to having to
instantiate non-lazily an explosion of templates to make them
accessible from within Python), the last version of gcc not accepting
my code anymore, for probably valid reasons, I guess, etc.). So big
thanks to all Julia developers for destroying my problem and freeing
me from C++. I gave in after checking for myself the hard-to-believe
efficiency promise (speed within 120% of C) on the non-toy DES crypto
algorithm. I'm very grateful for this awesome, beautiful, fast, fun
language that I didn't dare to dream of.

The problem of higher-order-function inlining (via callable types, aka
functors) is getting a lot of attention, but I will need fast
solutions soon and don't want to give up the right to use cost-free
abstractions (offered by C++). So should we hold our breath for
built-in functors, or is it still worth investigating library
solutions?

I found NumericFunctors/NumericFuns (and base/reduce.jl) based on
`abstract Functor{N}`; are there others built on `Functor{Result,
Arg1, ..., ArgN}`?

I wanted to share a tiny trick that occurred to me today, which doesn't 
seem to be widely known, unleashing stateless functors (type constructors 
have a dual nature). Please let me know if it relies on undefined 
behavior.

abstract Functor{N}
typealias UnaryFunctor Functor{1}
typealias BinaryFunctor Functor{2}

type double <: UnaryFunctor end
double(x) = 2*x

type plus <: BinaryFunctor end
plus(x, y) = x+y
plus(x::String, y::String) = x*y # ;-)

type compose{Out<:Functor, In<:Functor} <: Functor
    compose(x...) = Out(In(x...))
    # compose(x...) = Out(In(x...)...) # more general but awfully slow
end

doubleplus = compose{double, plus}

# constrained function:
f{F<:BinaryFunctor}(fun::Type{F}, x) = fun(x, x)
f(plus, 1)

# etc.

This can't serve as a drop-in replacement for normal functions, as
many APIs hardcode `Function` in their type signatures.
I only barely tested the performance. In the benchmark code at [1],
statement (1) is 10 times slower than statement (2) on my machine, but
if `plus` gets replaced by the definition above, it's only about 20%
slower. And it then unsurprisingly becomes faster for `doubleplus`.
However, for e.g. `reduce(plus, 0, 1:n)` I observed no gain (compared
to plus(x,y)=x+y, as reduce(+,...) is special-cased and optimized);
any ideas why?

Note that the native code of `doubleplus` seems to be more optimized
with this finer-grained definition of compose (which improves the
benchmark only slightly):

type compose{Out<:Functor, In<:Functor} <: Functor
    if In <: UnaryFunctor
        compose(x) = Out(In(x))
    elseif In <: BinaryFunctor
        compose(x, y) = Out(In(x, y))
    else
        compose(x...) = Out(In(x...))
    end
end

On a related note, my last question: is there a variation of the following 
definition that would compile?

arity{N, F<:Functor{N}}(::Type{F}) = N

Thanks,
Rafael

[1] https://github.com/timholy/NumericFunctors.jl, benchmark code below:

plus(x, y) = x + y
map_plus(x, y) = map(plus, x, y)

a = rand(1000, 1000)
b = rand(1000, 1000)

# warm up and get map_plus compiled
a + b
map_plus(a, b)

# benchmark
@time for i in 1:10 map_plus(a, b) end   # -- statement (1)
@time for i in 1:10 a + b end            # -- statement (2)