I have to admit that I am quite unhappy with some of the changes in
Julia version 0.3.0, especially the 'dot' notation. Here are some examples.


Let x be a vector defined as  x = [0.1, 0.2, 0.3] . Typing  5 + x  then gives a warning:

    julia> 5 + x
    WARNING: x::Number + A::Array is deprecated, use x .+ A instead.

But  5 + x  is universal mathematical notation that should be allowed
regardless of any programming language considerations.
On the other hand, both

    julia> 5 * x;
    julia> 5 .* x;

work without warning. Why is  5 * x  not also deprecated?
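
For what it is worth, the dotted forms the warning asks for do run cleanly
here; below is a small sketch of what I end up writing instead (the comments
describe the elementwise results I get):

    x = [0.1, 0.2, 0.3]

    5 .+ x    # elementwise addition, the form the deprecation warning suggests
    5 .* x    # elementwise multiplication
    5 * x     # scalar times vector, still accepted without a warning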

If I want to write, e.g., Runge's function in a vectorized form,

    julia> runge(x) = 1 ./ (1 .+ 5.*x.^2)

then the result looks quite ugly and is difficult to grasp at first glance.
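
The only less dotted alternative I can think of is to keep a scalar
definition and apply it explicitly; this is just a sketch of how I would
work around it, not a claim that it is the recommended style:

    # scalar version of Runge's function, no dots needed
    runge(t) = 1 / (1 + 5 * t^2)

    # applied to the whole vector explicitly
    y = map(runge, x)
    # or, equivalently, with a comprehension
    y = [runge(t) for t in x]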


As another example, look at the max / maximum 'dichotomy':

    julia> maximum([x, 0.5])
    0.5

    julia> maximum(x, 0.5)
    3-element Array{Float64,1}:
     0.1
     0.2
     0.3

The first answer looks natural, but I have difficulty understanding the
meaning of the second case. On the other hand:

    julia> max(x, 0.5)
    3-element Array{Float64,1}:
     0.5
     0.5
     0.5

while  max(x, [0.5])  leads to a dimension error, and  max([x, 0.5]) 
to a deprecation warning (the reason for which I think I understand).
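
My best guess at the intended distinction is that  maximum  reduces a
collection to a single number while  max  compares elementwise; here is how
I read the cases above (using the same  x , and assuming that  [x, 0.5] 
simply concatenates):

    maximum(x)          # reduction over the vector: 0.3
    maximum([x, 0.5])   # reduction over the concatenated vector: 0.5
    max(x, 0.5)         # elementwise comparison with the scalar: [0.5, 0.5, 0.5]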

I think all of this is quite confusing for someone who wants to use Julia
mostly for technical computing, as the logo promises.
I am sure this has been discussed before and I have probably missed it. Sorry.
