[julia-users] Re: Help for 'mean' (version 0.4)

2015-09-01 Thread Seth
Unreproducible here:

               _
   _       _ _(_)_     |  A fresh approach to technical computing
  (_)     | (_) (_)    |  Documentation: http://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?help" for help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.4.0-pre+7107 (2015-08-31 16:51 UTC)
 _/ |\__'_|_|_|\__'_|  |  Commit 4e44a1c (1 day old master)
|__/                   |  x86_64-apple-darwin14.5.0


help?> mean
search: mean mean! median median! SegmentationFault macroexpand 
module_parent Meta enumerate Enumerate timedwait primesmask


  mean(v[, region])


  Compute the mean of whole array v, or optionally along the dimensions in 
region. Note: Julia does not ignore NaN values in the
  computation. For applications requiring the handling of missing data, the 
DataArray package is recommended.


On Tuesday, September 1, 2015 at 10:12:34 AM UTC-7, cormu...@mac.com wrote:
>
> Using 0.4: 
>
> help?> mean
> search: mean mean! median median! SegmentationFault macroexpand 
> module_parent Meta enumerate Enumerate timedwait primes mask 
> remotecall remotecall_wait remotecall_fetch MethodTable
>
>   0 (zero; BrE: /ˈzɪərəʊ/ or AmE: /ˈziːroʊ/) is both a number and the 
> numerical digit used to represent that number in numerals. It fulfills a 
> central role in 
>   mathematics as the additive identity of the integers, real numbers, 
> and many other algebraic structures. As a digit, 0 is used as a placeholder 
> in 
>   place value systems. Names for the number 0 in English include zero, 
> nought or (US) naught (/ˈnɔːt/), nil, or — in contexts where at least one 
>   adjacent digit distinguishes it from the letter "O" — oh or o 
> (/ˈoʊ/). Informal or slang terms for zero include zilch and zip. Ought and 
> aught (/ˈɔːt/), 
>   as well as cipher, have also been used historically.
>
> I like this, but was expecting information about the `mean` function... :)
>
> I tried a few other words, but couldn't get their definitions. 
> Disappointed not to find `zilch`:
>
> Couldn't find zilch
> Perhaps you meant zip
>
> True enough.
>
> Or perhaps this is just a nice Easter egg, rather than an interesting 
> glitch in the software matrix...
>
>

[julia-users] Re: Julia 0.4 warnings and how to fix

2015-09-01 Thread Michael Francis
Thanks

Does anybody have a pointer to why operators are not imported in 0.4 by 
default?  

On Tuesday, September 1, 2015 at 12:36:14 PM UTC-4, Tim Wheeler wrote:
>
> Thanks! I was trying to figure out how to do Base.== for a long time. It 
> turns out the proper way to do it is as follows (found this in 
> DataArrays.jl):
>
> Base.(:(==))( ... ) 
>
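
For concreteness, a minimal sketch of that 0.4 pattern for a hypothetical Point
type (the quoted-symbol form extends the Base operator without an explicit import):

immutable Point
    x::Int
    y::Int
end

# Extend Base's == for Point; on 0.4 the Base.(:(==)) form works without `import Base: ==`.
Base.(:(==))(a::Point, b::Point) = (a.x == b.x) && (a.y == b.y)

Point(1, 2) == Point(1, 2)   # true; generic code calling == now finds this method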


[julia-users] Re: Type stability (or not) in core stats functions

2015-09-01 Thread Michael Francis
Thanks, that is a good pointer.

In this specific case it's unfortunate that there is a keyword arg in the 
API at all. Having two functions, one with a mean supplied and one without, 
would avoid the issue and remove the branch logic, replacing it with static 
dispatch.

On Tuesday, September 1, 2015 at 1:02:17 PM UTC-4, Jarrett Revels wrote:
>
> Actually, just saw this: https://github.com/JuliaLang/julia/issues/9818. 
> Ignore the messed up 
> @code_typed stuff in my previous reply to this thread.
>
> I believe the type-inference concerns are still there, however, even if 
> @code_typed doesn't correctly report them, so the fixes I listed should 
> still be useful for patching over inferencing problems with keyword 
> arguments.
>
> Best,
> Jarrett
>
> On Tuesday, September 1, 2015 at 12:49:02 PM UTC-4, Jarrett Revels wrote:
>>
>> Related: https://github.com/JuliaLang/julia/issues/9551
>>
>> Unfortunately, as you've seen, type-variadic keyword arguments can really 
>> mess up type-inferencing. It appears that keyword argument types are pulled 
>> from the default arguments rather than those actually passed in at runtime:
>>
>> *julia> f(x; a=1, b=2) = a*x^b*
>> *f (generic function with 1 method)*
>>
>> *julia> f(1)*
>> *1*
>>
>> *julia> f(1, a=(3+im), b=5.15)*
>> *3.0 + 1.0im*
>>
>> *julia> @code_typed f(1, a=(3+im), b=5.15)*
>> *1-element Array{Any,1}:*
>> * :($(Expr(:lambda, Any[:x], 
>> Any[Any[Any[:x,Int64,0]],Any[],Any[Int64],Any[]], :(begin $(Expr(:line, 1, 
>> :none, symbol("")))*
>> *GenSym(0) = (Base.power_by_squaring)(x::Int64,2)::Int64*
>> *return (Base.box)(Int64,(Base.mul_int)(1,GenSym(0)))::Int64*
>> *end::Int64*
>>
>> Obviously, that specific call to f does NOT return an Int64.
>>
>> I know of only two reasonable ways to handle it at the moment:
>>
>> 1. If you're the method author: Restrict every keyword argument to a 
>> declared, concrete type, which ensures that the argument isn't 
>> type-variadic. Yichao basically gave an example of this.
>> 2. If you're the method caller: Manually assert the return type. You can 
>> do this pretty easily in most cases using a wrapper function. 
>> Using `f` from above as an example:
>>
>> *julia> g{X,A,B}(x::X, a::A, b::B) = f(x, a=a, b=b)::promote_type(X, A, 
>> B)*
>> *g (generic function with 2 methods)*
>>
>> *julia> @code_typed g(1,2,3)*
>> *1-element Array{Any,1}:*
>> * :($(Expr(:lambda, Any[:x,:a,:b], 
>> Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Int64,0]],Any[],Any[Int64],Any[:X,:A,:B]],
>>  
>> :(begin  # none, line 1:*
>> *return 
>> (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Int64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Int64)::Int64*
>> *end::Int64*
>>
>> *julia> @code_typed g(1,2,3.0)*
>> *1-element Array{Any,1}:*
>> * :($(Expr(:lambda, Any[:x,:a,:b], 
>> Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Float64,0]],Any[],Any[Int64],Any[:X,:A,:B]],
>>  
>> :(begin  # none, line 1:*
>> *return 
>> (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Float64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Float64)::Float64*
>> *end::Float64*
>>
>> *julia> @code_typed g(1,2,3.0+im)*
>> *1-element Array{Any,1}:*
>> * :($(Expr(:lambda, Any[:x,:a,:b], 
>> Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Complex{Float64},0]],Any[],Any[Int64],Any[:X,:A,:B]],
>>  
>> :(begin  # none, line 1:*
>> *return 
>> (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Complex{Float64},Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Complex{Float64})::Complex{Float64}*
>> *end::Complex{Float64}*
>>
>> Thus, downstream functions can call *f* through *g, *preventing 
>> type-instability from "bubbling up" to the calling methods (as it would if 
>> they called *f* directly).
>>
>> Best,
>> Jarrett
>>
>> On Tuesday, September 1, 2015 at 8:39:11 AM UTC-4, Michael Francis wrote:
>>>
>>> 2) The underlying functions are only stable if the mean passed to them 
>>> is of the correct type, e.g. a number. Essentially this is a type inference 
>>> issue; if the compiler were able to optimize the branches then it would 
>>> likely be OK, but it looks from the LLVM code that this is not the case today. 
>>>
>>> FWIW, using a type-stable version (e.g. directly calling covm) looks to 
>>> be about 18% faster for small (100-element) AbstractArray pairs.
>>>
>>> On Monday, August 31, 2015 at 9:06:58 PM UTC-4, Sisyphuss wrote:

 IMO:
 1) 

Re: [julia-users] How do I know if a type is concrete?

2015-09-01 Thread Mauro
Try isleaftype

otherwise @code_warntype is a great tool which might be applicable to
your case.
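
As a quick sketch of both suggestions on toy types (not Cedric's aliases),
roughly what 0.4 prints:

julia> isleaftype(Vector{Float64})   # concrete: can be instantiated directly
true

julia> isleaftype(AbstractArray)     # abstract supertype, not concrete
false

julia> isleaftype(Array)             # free type parameters left, so not a leaf type
false

julia> f(x) = x > 0 ? 1 : 0.5        # return type depends on the value of x
f (generic function with 1 method)

julia> @code_warntype f(1)           # the inferred return type shows as Union{Float64,Int64}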

On Tue, 2015-09-01 at 20:05, Cedric St-Jean  wrote:
> I have a type
>
>
> typealias TColor RGBA{UfixedBase{Uint8,8}}
> typealias TImage Image{TColor, 2, Array{TColor, 2}}
>
> How do I know if it's a concrete type or not? Is there an isconcrete 
> function somewhere?
>
> Incidentally, I'm asking this because I just updated my packages and had to 
> adjust to the recent Colors changes. My former definition of TColor was:
>
> typealias TColor AlphaColorValue{RGB{UfixedBase{Uint8,8}},UfixedBase{Uint8,8
> }}
>
> and now for whatever reason concatenating images together is ~10X slower. 
> Since those are just arrays, I assume it's a type  problem. Those are more 
> frustrating and frequent than I expected, I must be doing something wrong...
>
> Cédric



[julia-users] Re: Julia 0.4 warnings and how to fix

2015-09-01 Thread Tony Kelman
https://github.com/JuliaLang/julia/issues/8113
https://github.com/JuliaLang/julia/pull/12235


On Tuesday, September 1, 2015 at 10:22:05 AM UTC-7, Michael Francis wrote:
>
> Thanks
>
> Does anybody have a pointer to why operators are not imported in 0.4 by 
> default?  
>
> On Tuesday, September 1, 2015 at 12:36:14 PM UTC-4, Tim Wheeler wrote:
>>
>> Thanks! I was trying to figure out how to do Base.== for a long time. It 
>> turns out the proper way to do it is as follows (found this in 
>> DataArrays.jl):
>>
>> Base.(:(==))( ... ) 
>>
>

[julia-users] Re: Type stability (or not) in core stats functions

2015-09-01 Thread Jarrett Revels
Actually, just saw this: https://github.com/JuliaLang/julia/issues/9818. 
Ignore the messed up 
@code_typed stuff in my previous reply to this thread.

I believe the type-inference concerns are still there, however, even if 
@code_typed doesn't correctly report them, so the fixes I listed should 
still be useful for patching over inferencing problems with keyword 
arguments.

Best,
Jarrett
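
As a concrete illustration of the first fix listed in the quoted reply below
(declare every keyword argument with a concrete type so inference isn't misled
by the defaults), a minimal sketch on a toy function:

# Toy example only; a::Int and b::Int pin the keyword types down,
# so inference no longer guesses them from the default values.
f2(x; a::Int=1, b::Int=2) = a * x^b

f2(2)             # 8
f2(2, a=3, b=2)   # 12
# A keyword value that can't be converted to Int now errors up front
# instead of silently changing the return type behind inference's back.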

On Tuesday, September 1, 2015 at 12:49:02 PM UTC-4, Jarrett Revels wrote:
>
> Related: https://github.com/JuliaLang/julia/issues/9551
>
> Unfortunately, as you've seen, type-variadic keyword arguments can really 
> mess up type-inferencing. It appears that keyword argument types are pulled 
> from the default arguments rather than those actually passed in at runtime:
>
> *julia> f(x; a=1, b=2) = a*x^b*
> *f (generic function with 1 method)*
>
> *julia> f(1)*
> *1*
>
> *julia> f(1, a=(3+im), b=5.15)*
> *3.0 + 1.0im*
>
> *julia> @code_typed f(1, a=(3+im), b=5.15)*
> *1-element Array{Any,1}:*
> * :($(Expr(:lambda, Any[:x], 
> Any[Any[Any[:x,Int64,0]],Any[],Any[Int64],Any[]], :(begin $(Expr(:line, 1, 
> :none, symbol("")))*
> *GenSym(0) = (Base.power_by_squaring)(x::Int64,2)::Int64*
> *return (Base.box)(Int64,(Base.mul_int)(1,GenSym(0)))::Int64*
> *end::Int64*
>
> Obviously, that specific call to f does NOT return an Int64.
>
> I know of only two reasonable ways to handle it at the moment:
>
> 1. If you're the method author: Restrict every keyword argument to a 
> declared, concrete type, which ensures that the argument isn't 
> type-variadic. Yichao basically gave an example of this.
> 2. If you're the method caller: Manually assert the return type. You can 
> do this pretty easily in most cases using a wrapper function. 
> Using `f` from above as an example:
>
> *julia> g{X,A,B}(x::X, a::A, b::B) = f(x, a=a, b=b)::promote_type(X, A, B)*
> *g (generic function with 2 methods)*
>
> *julia> @code_typed g(1,2,3)*
> *1-element Array{Any,1}:*
> * :($(Expr(:lambda, Any[:x,:a,:b], 
> Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Int64,0]],Any[],Any[Int64],Any[:X,:A,:B]],
>  
> :(begin  # none, line 1:*
> *return 
> (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Int64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Int64)::Int64*
> *end::Int64*
>
> *julia> @code_typed g(1,2,3.0)*
> *1-element Array{Any,1}:*
> * :($(Expr(:lambda, Any[:x,:a,:b], 
> Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Float64,0]],Any[],Any[Int64],Any[:X,:A,:B]],
>  
> :(begin  # none, line 1:*
> *return 
> (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Float64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Float64)::Float64*
> *end::Float64*
>
> *julia> @code_typed g(1,2,3.0+im)*
> *1-element Array{Any,1}:*
> * :($(Expr(:lambda, Any[:x,:a,:b], 
> Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Complex{Float64},0]],Any[],Any[Int64],Any[:X,:A,:B]],
>  
> :(begin  # none, line 1:*
> *return 
> (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Complex{Float64},Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Complex{Float64})::Complex{Float64}*
> *end::Complex{Float64}*
>
> Thus, downstream functions can call *f* through *g, *preventing 
> type-instability from "bubbling up" to the calling methods (as it would if 
> they called *f* directly).
>
> Best,
> Jarrett
>
> On Tuesday, September 1, 2015 at 8:39:11 AM UTC-4, Michael Francis wrote:
>>
>> 2) The underlying functions are only stable if the mean passed to them is 
>> of the correct type, e.g. a number. Essentially this is a type inference 
>> issue; if the compiler were able to optimize the branches then it would 
>> likely be OK, but it looks from the LLVM code that this is not the case today. 
>>
>> FWIW, using a type-stable version (e.g. directly calling covm) looks to be 
>> about 18% faster for small (100-element) AbstractArray pairs.
>>
>> On Monday, August 31, 2015 at 9:06:58 PM UTC-4, Sisyphuss wrote:
>>>
>>> IMO:
>>> 1) This is called keyword argument (not named optional argument).
>>> 2) The returned value depends only on `corzm`, and `corm`. If these two 
>>> functions are type stable, then `cor` is type stable.
>>> 3) I'm not sure whether this is the "correct" way to write this function.
>>>
>>> On Monday, August 31, 2015 at 11:48:37 PM UTC+2, Michael Francis wrote:

 The following is taken from statistics.jl line 428 

 function cor(x::AbstractVector, 

Re: [julia-users] Help for 'mean' (version 0.4)

2015-09-01 Thread cormullion
You're right, Matt, my testing does odd things. (In version 0.3, `mean = 0` 
used to give a warning; it doesn't any more.)

I'm pleased that you can get help on 0. A little disappointed that you 
can't get help on all the integers... :) 


[julia-users] Re: How to assign an arry from inside a cycle

2015-09-01 Thread Sisyphuss
Delete the comma after `1:3`, and you will get the right result (see the sketch below).
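
For reference, the corrected loop, and why the comma matters: with the trailing
comma the assignment is parsed as a second iteration specification, so each row
is successively re-assigned the scalars 1.0 through 4.0 and ends up filled with 4.0.

t = Array(Float64, 3, 4)       # 0.3/0.4-style constructor, as in the original post
for i = 1:3                    # no trailing comma: the assignment is the loop body
    t[i, :] = [1., 2., 3., 4.]
end
t                              # each row is now 1.0 2.0 3.0 4.0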


On Tuesday, September 1, 2015 at 8:17:18 PM UTC+2, Igor wrote:
>
> Hello everybody!
>
> How is it possible to write something into an array from a "for" loop? I have 
> the following code:
>
> t = Array(Float64, 3, 4)
> for i = 1:3,
>   t[i, :] = [1., 2., 3., 4.]
> end
>
>
>
> But after this, all the elements inside "t" are equal to "4". How do I make 
> them "1 2 3 4"?
>
> Best regards, Igor.
>


[julia-users] Performing conversion on a tuple of arguments

2015-09-01 Thread Scott Jones
What is the best Julian way of performing a conversion on all of the 
arguments to a method?
Say I have a tuple, passed in as I::Real..., and I wish to convert all of 
the values to type Int?

Thanks,
Scott



[julia-users] How to assign an arry from inside a cycle

2015-09-01 Thread Igor
Hello everybody!

How is it possible to write something into an array from a "for" loop? I have 
the following code:

t = Array(Float64, 3, 4)
for i = 1:3,
  t[i, :] = [1., 2., 3., 4.]
end



But after this, all the elements inside "t" are equal to "4". How do I make 
them "1 2 3 4"?

Best regards, Igor.


[julia-users] Re: Julia 0.4 warnings and how to fix

2015-09-01 Thread Michael Francis
Thanks - I have to say this removal makes no sense to me. In the next 
revision of Julia, 0.5, we will allow accidental shadowing of == (and associated) 
operators. It seems like a regressive step.

On Tuesday, September 1, 2015 at 1:53:47 PM UTC-4, Tony Kelman wrote:
>
> https://github.com/JuliaLang/julia/issues/8113
> https://github.com/JuliaLang/julia/pull/12235
>
>
> On Tuesday, September 1, 2015 at 10:22:05 AM UTC-7, Michael Francis wrote:
>>
>> Thanks
>>
>> Does anybody have a pointer to why operators are not imported in 0.4 by 
>> default?  
>>
>> On Tuesday, September 1, 2015 at 12:36:14 PM UTC-4, Tim Wheeler wrote:
>>>
>>> Thanks! I was trying to figure out how to do Base.== for a long time. It 
>>> turns out the proper way to do it is as follows (found this in 
>>> DataArrays.jl):
>>>
>>> Base.(:(==))( ... ) 
>>>
>>

[julia-users] Tk (Cairo) Canvas not scrollable?

2015-09-01 Thread Dömötör Gulyás
I'm trying to build a scrollable Cairo canvas in a Tk GUI, but I keep 
getting "TclError("unknown option -xscrollcommand")" when trying to add the 
scrollbars. Apparently the canvas is a frame ("winfo class" returns 
"Frame"), which is not scrollable.

Is there a way to add scrollbars to a Cairo canvas, or can something be 
done to give it the proper widget class? I've tried changing the 
constructor from c = TkWidget(parent, "ttk::frame") to c = TkWidget(parent, 
"ttk::canvas"), but that doesn't seem to help, and I see no other obvious 
place to change it.

Anyone have any idea how to go about this? I've also tried to build the GUI 
in Gtk, but that hasn't worked out for other reasons.



Re: [julia-users] Re: MongoDB and Julia

2015-09-01 Thread Ferenc Szalma
Kevin,

I also managed to get Pzion's Mongo.jl to work in Julia v0.3. Now, I am 
trying to make it work in v0.4 but getting an error message while trying to 
insert:

oid = insert(collection, {"name"=>"time series"})



WARNING: deprecated syntax "{a=>b, ...}" at In[36]:1. Use "Dict{Any,Any}(a=>b, 
...)" instead. 

LoadError: MethodError: `convert` has no method matching convert(::Type{Ptr{
Void}}, ::Array{UInt8,1})
This may have arisen from a call to the constructor Ptr{Void}(...),
since type constructors fall back to convert methods.
Closest candidates are:
 call{T}(::Type{T}, ::Any)
 convert{T}(::Type{Ptr{T}}, !Matched::UInt64)
 convert{T}(::Type{Ptr{T}}, !Matched::Int64)
 ...
while loading In[36], in expression starting on line 1
 

 in insert at /Users/szalmaf/.julia/v0.4/Mongo/src/MongoCollection.jl:42 


Did you try Mongo.jl in Julia v0.4? Do you have any suggestions as to how 
to go about getting rid of the LoadError above? It seems like a generic 
problem when switching from v0.3 to v0.4.

Cheers
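
For the deprecation warning itself, the 0.4-style call would look like the sketch
below. This only addresses the {...} syntax; the Ptr{Void} MethodError is a
separate package issue (per the later replies, fixed by checking out the Mongo
and LibBSON masters).

# Julia 0.4 replacement for the deprecated {k=>v, ...} literal:
oid = insert(collection, Dict{Any,Any}("name" => "time series"))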



[julia-users] Julia's text mining capabilities

2015-09-01 Thread Venkat Ramakrishnan
Folks,

How fast is Julia compared to other languages like R and Python in text 
processing?
Any benchmarks?

Any parallel processing facility specifically available for text processing 
in Julia?

Thanks,
Venkat.



Re: [julia-users] How do I know if a type is concrete?

2015-09-01 Thread Cedric St-Jean
Thank you. That doesn't seem to be the problem; I'll post an issue to 
Images.jl.

On Tuesday, September 1, 2015 at 2:18:43 PM UTC-4, Mauro wrote:
>
> Try isleaftype 
>
> otherwise @code_warntype is a great tool which might be applicable to 
> your case. 
>
> On Tue, 2015-09-01 at 20:05, Cedric St-Jean  > wrote: 
> > I have a type 
> > 
> > 
> > typealias TColor RGBA{UfixedBase{Uint8,8}} 
> > typealias TImage Image{TColor, 2, Array{TColor, 2}} 
> > 
> > How do I know if it's a concrete type or not? Is there an isconcrete 
> > function somewhere? 
> > 
> > Incidentally, I'm asking this because I just updated my packages and had 
> to 
> > adjust to the recent Colors changes. My former definition of TColor was: 
> > 
> > typealias TColor 
> AlphaColorValue{RGB{UfixedBase{Uint8,8}},UfixedBase{Uint8,8 
> > }} 
> > 
> > and now for whatever reason concatenating images together is ~10X 
> slower. 
> > Since those are just arrays, I assume it's a type  problem. Those are 
> more 
> > frustrating and frequent than I expected, I must be doing something 
> wrong... 
> > 
> > Cédric 
>
>

[julia-users] Re: Type stability (or not) in core stats functions

2015-09-01 Thread Sisyphuss
I can't read low-level code or tinker with the compiler. Could you try 
giving `mean` the default value `NaN`?



On Tuesday, September 1, 2015 at 7:29:59 PM UTC+2, Michael Francis wrote:
>
> Thanks, that is a good pointer.
>
> In this specific case it's unfortunate that there is a keyword arg in the 
> API at all. Having two functions, one with a mean supplied and one without, 
> would avoid the issue and remove the branch logic, replacing it with static 
> dispatch. 
>
> On Tuesday, September 1, 2015 at 1:02:17 PM UTC-4, Jarrett Revels wrote:
>>
>> Actually, just saw this: https://github.com/JuliaLang/julia/issues/9818. 
>> Ignore the messed up 
>> @code_typed stuff in my previous reply to this thread.
>>
>> I believe the type-inference concerns are still there, however, even if 
>> @code_typed doesn't correctly report them, so the fixes I listed should 
>> still be useful for patching over inferencing problems with keyword 
>> arguments.
>>
>> Best,
>> Jarrett
>>
>> On Tuesday, September 1, 2015 at 12:49:02 PM UTC-4, Jarrett Revels wrote:
>>>
>>> Related: https://github.com/JuliaLang/julia/issues/9551
>>>
>>> Unfortunately, as you've seen, type-variadic keyword arguments can 
>>> really mess up type-inferencing. It appears that keyword argument types are 
>>> pulled from the default arguments rather than those actually passed in at 
>>> runtime:
>>>
>>> *julia> f(x; a=1, b=2) = a*x^b*
>>> *f (generic function with 1 method)*
>>>
>>> *julia> f(1)*
>>> *1*
>>>
>>> *julia> f(1, a=(3+im), b=5.15)*
>>> *3.0 + 1.0im*
>>>
>>> *julia> @code_typed f(1, a=(3+im), b=5.15)*
>>> *1-element Array{Any,1}:*
>>> * :($(Expr(:lambda, Any[:x], 
>>> Any[Any[Any[:x,Int64,0]],Any[],Any[Int64],Any[]], :(begin $(Expr(:line, 1, 
>>> :none, symbol("")))*
>>> *GenSym(0) = (Base.power_by_squaring)(x::Int64,2)::Int64*
>>> *return (Base.box)(Int64,(Base.mul_int)(1,GenSym(0)))::Int64*
>>> *end::Int64*
>>>
>>> Obviously, that specific call to f does NOT return an Int64.
>>>
>>> I know of only two reasonable ways to handle it at the moment:
>>>
>>> 1. If you're the method author: Restrict every keyword argument to a 
>>> declared, concrete type, which ensures that the argument isn't 
>>> type-variadic. Yichao basically gave an example of this.
>>> 2. If you're the method caller: Manually assert the return type. You can 
>>> do this pretty easily in most cases using a wrapper function. 
>>> Using `f` from above as an example:
>>>
>>> *julia> g{X,A,B}(x::X, a::A, b::B) = f(x, a=a, b=b)::promote_type(X, A, 
>>> B)*
>>> *g (generic function with 2 methods)*
>>>
>>> *julia> @code_typed g(1,2,3)*
>>> *1-element Array{Any,1}:*
>>> * :($(Expr(:lambda, Any[:x,:a,:b], 
>>> Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Int64,0]],Any[],Any[Int64],Any[:X,:A,:B]],
>>>  
>>> :(begin  # none, line 1:*
>>> *return 
>>> (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Int64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Int64)::Int64*
>>> *end::Int64*
>>>
>>> *julia> @code_typed g(1,2,3.0)*
>>> *1-element Array{Any,1}:*
>>> * :($(Expr(:lambda, Any[:x,:a,:b], 
>>> Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Float64,0]],Any[],Any[Int64],Any[:X,:A,:B]],
>>>  
>>> :(begin  # none, line 1:*
>>> *return 
>>> (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Float64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Float64)::Float64*
>>> *end::Float64*
>>>
>>> *julia> @code_typed g(1,2,3.0+im)*
>>> *1-element Array{Any,1}:*
>>> * :($(Expr(:lambda, Any[:x,:a,:b], 
>>> Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Complex{Float64},0]],Any[],Any[Int64],Any[:X,:A,:B]],
>>>  
>>> :(begin  # none, line 1:*
>>> *return 
>>> (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Complex{Float64},Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Complex{Float64})::Complex{Float64}*
>>> *end::Complex{Float64}*
>>>
>>> Thus, downstream functions can call *f* through *g, *preventing 
>>> type-instability from "bubbling up" to the calling methods (as it would if 
>>> they called *f* directly).
>>>
>>> Best,
>>> Jarrett
>>>
>>> On Tuesday, September 1, 2015 at 8:39:11 AM UTC-4, Michael Francis wrote:

 2) The underlying functions are only stable if the mean passed to them 
 is of the correct type, e.g. a number. Essentially this is a type 
 inference 
 issue, if the compiler was able to optimize  the branches then it would be 
 likely be 

[julia-users] Re: Performing conversion on a tuple of arguments

2015-09-01 Thread Scott Jones
Thanks everybody!  That helped!

I ended up doing: getindex(c, map(Int, I)...).
Performance isn't so much of an issue; I wanted it to be Julian and 
succinct, to put in a deprecation warning.

Scott
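
In context, the pattern looks roughly like this (the container type and message
are hypothetical, not the actual package code):

# Hypothetical deprecation shim: accept Real indices, warn, and forward to the Int method.
function Base.getindex(c::MyContainer, I::Real...)
    Base.depwarn("getindex with Real indices is deprecated; use Int indices.", :getindex)
    getindex(c, map(Int, I)...)
end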

On Tuesday, September 1, 2015 at 2:59:54 PM UTC-4, Jeffrey Sarnoff wrote:
>
>
> Of the variations I timed, all with the signature test(a::Tuple), this ran 
> fastest:
>
> using FastAnonymous
>
> fa = @anon x->floor(Integer,x)
>
> function test(a::Tuple)
>  [fa(x) for x in a]  # apply the converter to each element; x->fa(x) would only collect closures
> end
>
> I do not know how to access the tuple that is the args as passed directly.
> Good question, it raises others.
>
>
> On Tuesday, September 1, 2015 at 2:34:57 PM UTC-4, Seth wrote:
>>
>> More precisely,
>>
>> foo(a::Real...) = map(x->floor(Integer,x),a)
>>
>>
>>
>> On Tuesday, September 1, 2015 at 11:32:15 AM UTC-7, Seth wrote:
>>>
>>> Assuming you have such a conversion function available (let's use 
>>> floor() for our example), you could do
>>> a = (1.1, 2.2, 3.3, 4.4)
>>> map(x->floor(Integer,x), a)
>>>
>>>
>>> which yields a tuple of ints. I don't claim "best" or even "Julian"; 
>>> there may be a more clever approach. :)
>>>
>>>
>>> On Tuesday, September 1, 2015 at 11:02:00 AM UTC-7, Scott Jones wrote:

 What is the best Julian way of performing a conversion on all of the 
 arguments to a method?
 Say I have a tuple, passed in as I::Real..., and I wish to convert all 
 of the values to type Int?

 Thanks,
 Scott



[julia-users] Re: ANN: JuMP 0.10 released

2015-09-01 Thread Jarrett Revels
Woohoo! That's a lot of new stuff.

-- Jarrett

On Tuesday, September 1, 2015 at 12:41:26 AM UTC-4, Miles Lubin wrote:
>
> The JuMP team is happy to announce the release of JuMP 0.10.
>
> This is a major release with the greatest amount of new functionality 
> since the addition of nonlinear modeling last year. This will likely be the 
> last major release of JuMP to support Julia 0.3. Thanks to the heroic work 
> of Joey Huchette, JuMP now supports *vectorized syntax* and modeling for 
> *semidefinite 
> programming*.
>
> You can now write, for example:
>
> @defVar(m, x[1:5])
> @addConstraint(m, A*x .== 0)
>
> where A is a Julia matrix (dense or sparse). Note that we require dot 
> comparison operators .== (and similarly .<= and .>=) for vectorized 
> constraints. The vectorized syntax extends to quadratic but not general 
> nonlinear expressions.
>
> An important new concept to keep in mind is that this vectorized syntax 
> only applies to sets of variables which are one-based arrays. If you 
> declare variables indexed by more complicated sets, e.g.,
>
> @defVar(m, y[3:5])
> s = [:cat, :dog, :pizza]
> @defVar(m, z[s])
>
> then dot(y,z) and rand(3,3)*z are undefined. A result of this new concept 
> of one-based arrays is that x above now has the type Vector{JuMP.Variable}. 
> In this case, getValue() now returns a Vector{Float64} instead of an 
> opaque JuMP object. We hope users find this new distinction between 
> one-indexed array variables and all other symbolically indexed variables 
> useful and intuitive (if not, let us know).
>
> For semidefinite modeling, you can declare variables as SDP matrices and 
> add LMI (linear matrix inequality) constraints as illustrated in the 
> examples for minimal ellipse and max cut, among others.
>
> We also have a *new syntax for euclidean norms:*
>
> @addConstraint(m, norm2{c[i]*x[i]+b[i],i=1:N} <= 10)
> # or
> @addConstraint(m, norm(c.*x+b) <= 10)
>
> You may be wondering how JuMP compares with Convex.jl given these new 
> additions. Not much has changed philosophically; JuMP directly translates 
> SDP constraints and euclidean norms into the sparse matrix formats as 
> required by conic solvers. Unlike Convex.jl, *JuMP accepts only 
> standard-form SDP and second-order conic constraints and will not perform 
> any automatic transformations* such as modeling nuclear norms, minimum 
> eigenvalue, geometric mean, rational norms, etc. We would recommend using 
> Convex.jl for easy modeling of such functions. Our focus, for now, is on 
> the large-scale performance and stability of the huge amount of new syntax 
> introduced in this release.
>
> Also notable in this release:
> - JuMP models now store a dictionary of attached variables, so that you 
> can look up a variable from a model by name by using the new getVar() 
> method.
> - On Julia 0.4 only, you can now have a filter variable declarations, e.g.,
> @defVar(m, x[i=1:5,j=1:5; i+j >= 3])
> will only create variables for the indices which satisfy the filter 
> condition. (These are not one-based arrays as introduced above.)
> - Dual multipliers are available for nonlinear problems from the solvers 
> which provide them
> - There is improved documentation for querying derivatives from a 
> nonlinear JuMP model
> - *We now try to print warnings for two common performance traps*: 
> calling getValue() in a tight loop and using operator overloading to 
> construct large JuMP expressions. Please let us know if these are useful or 
> annoying or both so that we can tune the warning thresholds.
> - Thanks to Tony Kelman and Jack Dunn, you can now call a large number of 
> external solvers including Bonmin and Couenne through either the .osil or 
> .nl exchange formats.
> - Module precompilation speeds up using JuMP considerably, for those on 
> Julia 0.4
>
> The delay since the last release of JuMP is mostly due to us trying to 
> test and refine the new syntax, but inevitably some bugs have slipped 
> through, so please let us know of any incorrect or confusing behavior.
>
> Also newsworthy is our new paper  
> describing the methods used in JuMP with benchmark comparisons to existing 
> open-source and commercial optimization modeling software.
>
> Miles, Iain, and Joey
>
>
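
Putting the announcement's snippets together, a minimal end-to-end sketch of the
vectorized syntax (the data are made up, and an LP-capable solver such as Clp or
GLPK is assumed to be installed):

using JuMP

m = Model()
@defVar(m, 0 <= x[1:3] <= 1)
A = [1.0 1.0 0.0;
     0.0 1.0 1.0]
b = [1.0, 1.0]
@addConstraint(m, A*x .<= b)   # vectorized constraint; note the dot comparison
@setObjective(m, Max, sum(x))
status = solve(m)
getValue(x)                    # a Vector{Float64} under JuMP 0.10, as noted above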

Re: [julia-users] Help for 'mean' (version 0.4)

2015-09-01 Thread Matt Bauman
You have shadowed Base.mean with a local `mean` that was bound to `0`.  Try 
help for Base.mean and 0.
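
In other words, something like this had happened earlier in the session (a
reconstructed sketch, not the original transcript):

julia> mean = 0          # binds `mean` in Main, shadowing Base.mean
0

help?> mean              # help now describes the value 0, not the function

help?> Base.mean         # the function is still reachable by its qualified name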

On Tuesday, September 1, 2015 at 1:18:17 PM UTC-4, Miguel Bazdresch wrote:
>
> The Segmentation Fault strongly suggests a glitch, not an Easter egg, 
> unfortunately... :)
>
> On Tue, Sep 1, 2015 at 1:12 PM,  wrote:
>
>> Using 0.4: 
>>
>> help?> mean
>> search: mean mean! median median! SegmentationFault macroexpand 
>> module_parent Meta enumerate Enumerate timedwait primes mask 
>> remotecall remotecall_wait remotecall_fetch MethodTable
>>
>>   0 (zero; BrE: /ˈzɪərəʊ/ or AmE: /ˈziːroʊ/) is both a number and the 
>> numerical digit used to represent that number in numerals. It fulfills a 
>> central role in 
>>   mathematics as the additive identity of the integers, real numbers, 
>> and many other algebraic structures. As a digit, 0 is used as a placeholder 
>> in 
>>   place value systems. Names for the number 0 in English include 
>> zero, nought or (US) naught (/ˈnɔːt/), nil, or — in contexts where at least 
>> one 
>>   adjacent digit distinguishes it from the letter "O" — oh or o 
>> (/ˈoʊ/). Informal or slang terms for zero include zilch and zip. Ought and 
>> aught (/ˈɔːt/), 
>>   as well as cipher, have also been used historically.
>>
>> I like this, but was expecting information about the `mean` function... :)
>>
>> I tried a few other words, but couldn't get their definitions. 
>> Disappointed not to find `zilch`:
>>
>> Couldn't find zilch
>> Perhaps you meant zip
>>
>> True enough.
>>
>> Or perhaps this is just a nice Easter egg, rather than an interesting 
>> glitch in the software matrix...
>>
>>
>

[julia-users] How do I know if a type is concrete?

2015-09-01 Thread Cedric St-Jean
I have a type


typealias TColor RGBA{UfixedBase{Uint8,8}}
typealias TImage Image{TColor, 2, Array{TColor, 2}}

How do I know if it's a concrete type or not? Is there an isconcrete 
function somewhere?

Incidentally, I'm asking this because I just updated my packages and had to 
adjust to the recent Colors changes. My former definition of TColor was:

typealias TColor AlphaColorValue{RGB{UfixedBase{Uint8,8}},UfixedBase{Uint8,8
}}

and now for whatever reason concatenating images together is ~10X slower. 
Since those are just arrays, I assume it's a type  problem. Those are more 
frustrating and frequent than I expected, I must be doing something wrong...

Cédric


[julia-users] @parallel and HDF5 is posible to work together?

2015-09-01 Thread paul analyst

Is it possible for @parallel and HDF5 to work together?

julia> addprocs(6)
6-element Array{Any,1}:
 2
 3
 4
 5
 6
 7

julia> @parallel for i=1:l
   dset=fid["punkts"*string(i)]
   f=vec(h5read("F.h5","F",(:,i))); 
   .
   end

WARNING: Module HDF5 not defined on process 2
WARNING: Module HDF5 not defined on process 3
julia> close(fid)

julia> fatal error on 3: WARNING: Module HDF5 not defined on process 6
WARNING: Module HDF5 not defined on process 7
WARNING: Module HDF5 not defined on process 4
fatal error on 6: fatal error on fatal error on 74: : WARNING: Module HDF5 
not defined on process 5
fatal error on 5: ERROR: HDF5 not defined
 in deserialize at serialize.jl:377
 in handle_deserialize at serialize.jl:352
 in deserialize at serialize.jl:506
 in handle_deserialize at serialize.jl:352

Paul
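
They can work together, but each worker needs the module loaded; the usual first
fix is @everywhere using HDF5 after addprocs. A sketch, keeping `l` and the file
names from the original snippet (writes through a shared fid handle are a
separate problem and are best done from the master process):

addprocs(6)
@everywhere using HDF5         # load the module on every worker, not just process 1

@parallel for i = 1:l
    f = vec(h5read("F.h5", "F", (:, i)))   # each worker reads the file itself
    # ... per-column work ...
end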


Re: [julia-users] Julia's text mining capabilities

2015-09-01 Thread Mauro
This may have answers:
https://youtu.be/dgfIIZ5yA4E
(though I haven't watched it yet)

On Tue, 2015-09-01 at 14:42, Venkat Ramakrishnan  
wrote:
> Folks,
>
> How fast is Julia compared to other languages like R and Python in text 
> processing?
> Any benchmarks?
>
> Any parallel processing facility specifically available for text processing 
> in Julia?
>
> Thanks,
> Venkat.



[julia-users] Re: Performing conversion on a tuple of arguments

2015-09-01 Thread Jeffrey Sarnoff

Of the variations I timed, all with the signature test(a::Tuple), this ran 
fastest:

using FastAnonymous

fa = @anon x->floor(Integer,x)

function test(a::Tuple)
 [fa(x) for x in a]  # apply the converter to each element; x->fa(x) would only collect closures
end

I do not know how to access the tuple that is the args as passed directly.
Good question, it raises others.


On Tuesday, September 1, 2015 at 2:34:57 PM UTC-4, Seth wrote:
>
> More precisely,
>
> foo(a::Real...) = map(x->floor(Integer,x),a)
>
>
>
> On Tuesday, September 1, 2015 at 11:32:15 AM UTC-7, Seth wrote:
>>
>> Assuming you have such a conversion function available (let's use floor() 
>> for our example), you could do
>> a = (1.1, 2.2, 3.3, 4.4)
>> map(x->floor(Integer,x), a)
>>
>>
>> which yields a tuple of ints. I don't claim "best" or even "Julian"; 
>> there may be a more clever approach. :)
>>
>>
>> On Tuesday, September 1, 2015 at 11:02:00 AM UTC-7, Scott Jones wrote:
>>>
>>> What is the best Julian way of performing a conversion on all of the 
>>> arguments to a method?
>>> Say I have a tuple, passed in as I::Real..., and I wish to convert all 
>>> of the values to type Int?
>>>
>>> Thanks,
>>> Scott
>>>
>>>

[julia-users] Re: Performing conversion on a tuple of arguments

2015-09-01 Thread Seth
Assuming you have such a conversion function available (let's use floor() 
for our example), you could do
a = (1.1, 2.2, 3.3, 4.4)
map(x->floor(Integer,x), a)


which yields a tuple of ints. I don't claim "best" or even "Julian"; there 
may be a more clever approach. :)


On Tuesday, September 1, 2015 at 11:02:00 AM UTC-7, Scott Jones wrote:
>
> What is the best Julian way of performing a conversion on all of the 
> arguments to a method?
> Say I have a tuple, passed in as I::Real..., and I wish to convert all of 
> the values to type Int?
>
> Thanks,
> Scott
>
>

[julia-users] OSX support for package testing on Travis is now available by default!

2015-09-01 Thread Tony Kelman
Package authors may find this useful: you formerly had to send an email 
requesting the feature be enabled per repository, but now it's available for 
all.
Ref 
https://github.com/travis-ci/docs-travis-ci-com/commit/8a4efe6e6bfb0dcce760eedabd2ffe640d6545d5

Assuming you're using language: julia in your .travis.yml file, this should 
be as simple as adding

os:
  - linux
  - osx





[julia-users] Help for 'mean' (version 0.4)

2015-09-01 Thread cormullion
Using 0.4: 

help?> mean
search: mean mean! median median! SegmentationFault macroexpand 
module_parent Meta enumerate Enumerate timedwait primes mask 
remotecall remotecall_wait remotecall_fetch MethodTable

  0 (zero; BrE: /ˈzɪərəʊ/ or AmE: /ˈziːroʊ/) is both a number and the 
numerical digit used to represent that number in numerals. It fulfills a 
central role in 
  mathematics as the additive identity of the integers, real numbers, 
and many other algebraic structures. As a digit, 0 is used as a placeholder 
in 
  place value systems. Names for the number 0 in English include zero, 
nought or (US) naught (/ˈnɔːt/), nil, or — in contexts where at least one 
  adjacent digit distinguishes it from the letter "O" — oh or o 
(/ˈoʊ/). Informal or slang terms for zero include zilch and zip. Ought and 
aught (/ˈɔːt/), 
  as well as cipher, have also been used historically.

I like this, but was expecting information about the `mean` function... :)

I tried a few other words, but couldn't get their definitions. Disappointed 
not to find `zilch`:

Couldn't find zilch
Perhaps you meant zip

True enough.

Or perhaps this is just a nice Easter egg, rather than an interesting 
glitch in the software matrix...



Re: [julia-users] Help for 'mean' (version 0.4)

2015-09-01 Thread Miguel Bazdresch
The Segmentation Fault strongly suggests a glitch, not an Easter egg,
unfortunately... :)

On Tue, Sep 1, 2015 at 1:12 PM,  wrote:

> Using 0.4:
>
> help?> mean
> search: mean mean! median median! SegmentationFault macroexpand
> module_parent Meta enumerate Enumerate timedwait primes mask
> remotecall remotecall_wait remotecall_fetch MethodTable
>
>   0 (zero; BrE: /ˈzɪərəʊ/ or AmE: /ˈziːroʊ/) is both a number and the
> numerical digit used to represent that number in numerals. It fulfills a
> central role in
>   mathematics as the additive identity of the integers, real numbers,
> and many other algebraic structures. As a digit, 0 is used as a placeholder
> in
>   place value systems. Names for the number 0 in English include zero,
> nought or (US) naught (/ˈnɔːt/), nil, or — in contexts where at least one
>   adjacent digit distinguishes it from the letter "O" — oh or o
> (/ˈoʊ/). Informal or slang terms for zero include zilch and zip. Ought and
> aught (/ˈɔːt/),
>   as well as cipher, have also been used historically.
>
> I like this, but was expecting information about the `mean` function... :)
>
> I tried a few other words, but couldn't get their definitions.
> Disappointed not to find `zilch`:
>
> Couldn't find zilch
> Perhaps you meant zip
>
> True enough.
>
> Or perhaps this is just a nice Easter egg, rather than an interesting
> glitch in the software matrix...
>
>


[julia-users] Re: ANN: JuMP 0.10 released

2015-09-01 Thread Fabrizio Lacalandra
Very nice job! May I suggest starting to think about:

1) @removeconstraint
2) @defvar with indication on priority branching for integer vars, accepted 
by Cplex/Gurobi at least
3) Some kind of special Constraint Programming-like construct, such as 
classic allDiff and more advanced things?
4) Start a discussion on how modularity in the problem construction can be 
enhanced (not sure in which direction)

BTW do we have a wish list of JuMP somewhere ?

Thanks
Fabrizio 

On Tuesday, September 1, 2015 at 6:41:21 AM UTC+2, Miles Lubin wrote:
>
> The JuMP team is happy to announce the release of JuMP 0.10.
>
> This is a major release with the greatest amount of new functionality 
> since the addition of nonlinear modeling last year. This will likely be the 
> last major release of JuMP to support Julia 0.3. Thanks to the heroic work 
> of Joey Huchette, JuMP now supports *vectorized syntax* and modeling for 
> *semidefinite 
> programming*.
>
> You can now write, for example:
>
> @defVar(m, x[1:5])
> @addConstraint(m, A*x .== 0)
>
> where A is a Julia matrix (dense or sparse). Note that we require dot 
> comparison operators .== (and similarly .<= and .>=) for vectorized 
> constraints. The vectorized syntax extends to quadratic but not general 
> nonlinear expressions.
>
> An important new concept to keep in mind is that this vectorized syntax 
> only applies to sets of variables which are one-based arrays. If you 
> declare variables indexed by more complicated sets, e.g.,
>
> @defVar(m, y[3:5])
> s = [:cat, :dog, :pizza]
> @defVar(m, z[s])
>
> then dot(y,z) and rand(3,3)*z are undefined. A result of this new concept 
> of one-based arrays is that x above now has the type Vector{JuMP.Variable}. 
> In this case, getValue() now returns a Vector{Float64} instead of an 
> opaque JuMP object. We hope users find this new distinction between 
> one-indexed array variables and all other symbolically indexed variables 
> useful and intuitive (if not, let us know).
>
> For semidefinite modeling, you can declare variables as SDP matrices and 
> add LMI (linear matrix inequality) constraints as illustrated in the 
> examples for minimal ellipse and max cut, among others.
>
> We also have a *new syntax for euclidean norms:*
>
> @addConstraint(m, norm2{c[i]*x[i]+b[i],i=1:N} <= 10)
> # or
> @addConstraint(m, norm(c.*x+b) <= 10)
>
> You may be wondering how JuMP compares with Convex.jl given these new 
> additions. Not much has changed philosophically; JuMP directly translates 
> SDP constraints and euclidean norms into the sparse matrix formats as 
> required by conic solvers. Unlike Convex.jl, *JuMP accepts only 
> standard-form SDP and second-order conic constraints and will not perform 
> any automatic transformations* such as modeling nuclear norms, minimum 
> eigenvalue, geometric mean, rational norms, etc. We would recommend using 
> Convex.jl for easy modeling of such functions. Our focus, for now, is on 
> the large-scale performance and stability of the huge amount of new syntax 
> introduced in this release.
>
> Also notable in this release:
> - JuMP models now store a dictionary of attached variables, so that you 
> can look up a variable from a model by name by using the new getVar() 
> method.
> - On Julia 0.4 only, you can now have a filter variable declarations, e.g.,
> @defVar(m, x[i=1:5,j=1:5; i+j >= 3])
> will only create variables for the indices which satisfy the filter 
> condition. (These are not one-based arrays as introduced above.)
> - Dual multipliers are available for nonlinear problems from the solvers 
> which provide them
> - There is improved documentation for querying derivatives from a 
> nonlinear JuMP model
> - *We now try to print warnings for two common performance traps*: 
> calling getValue() in a tight loop and using operator overloading to 
> construct large JuMP expressions. Please let us know if these are useful or 
> annoying or both so that we can tune the warning thresholds.
> - Thanks to Tony Kelman and Jack Dunn, you can now call a large number of 
> external solvers including Bonmin and Couenne through either the .osil or 
> .nl exchange formats.
> - Module precompilation speeds up using JuMP considerably, for those on 
> Julia 0.4
>
> The delay since the last release of JuMP is mostly due to us trying to 
> test and refine the new syntax, but inevitably some bugs have slipped 
> through, so please let us know of any incorrect or confusing behavior.
>
> Also newsworthy is our new paper  
> describing the methods used in JuMP with benchmark comparisons to existing 
> open-source and commercial optimization modeling software.
>
> 

[julia-users] Re: Performing conversion on a tuple of arguments

2015-09-01 Thread Seth
More precisely,

foo(a::Real...) = map(x->floor(Integer,x),a)



On Tuesday, September 1, 2015 at 11:32:15 AM UTC-7, Seth wrote:
>
> Assuming you have such a conversion function available (let's use floor() 
> for our example), you could do
> a = (1.1, 2.2, 3.3, 4.4)
> map(x->floor(Integer,x), a)
>
>
> which yields a tuple of ints. I don't claim "best" or even "Julian"; there 
> may be a more clever approach. :)
>
>
> On Tuesday, September 1, 2015 at 11:02:00 AM UTC-7, Scott Jones wrote:
>>
>> What is the best Julian way of performing a conversion on all of the 
>> arguments to a method?
>> Say I have a tuple, passed in as I::Real..., and I wish to convert all of 
>> the values to type Int?
>>
>> Thanks,
>> Scott
>>
>>

Re: [julia-users] Re: MongoDB and Julia

2015-09-01 Thread Tim Lebel
Did you try running Pkg.checkout("Mongo")? I believe that I fixed this, but
the maintainer may not have pushed a new tag.

On Tue, Sep 1, 2015 at 9:14 AM, Ferenc Szalma  wrote:

> Kevin,
>
> I also managed to get Pzion's Mongo.jl to work in Julia v0.3. Now, I am
> trying to make it work in v0.4 but getting an error message while trying to
> insert:
>
> oid = insert(collection, {"name"=>"time series"})
>
>
>
> WARNING: deprecated syntax "{a=>b, ...}" at In[36]:1. Use "Dict{Any,Any}(a=>b,
> ...)" instead.
>
> LoadError: MethodError: `convert` has no method matching convert(::Type{
> Ptr{Void}}, ::Array{UInt8,1})
> This may have arisen from a call to the constructor Ptr{Void}(...),
> since type constructors fall back to convert methods.
> Closest candidates are:
>  call{T}(::Type{T}, ::Any)
>  convert{T}(::Type{Ptr{T}}, !Matched::UInt64)
>  convert{T}(::Type{Ptr{T}}, !Matched::Int64)
>  ...
> while loading In[36], in expression starting on line 1
>
>
>  in insert at /Users/szalmaf/.julia/v0.4/Mongo/src/MongoCollection.jl:42
>
>
> Did you try Mongo.jl in Julia v0.4? Do you have any suggestions as to how
> to go about getting rid of the LoadError above? It seems like a generic
> problem when switching from v0.3 to v0.4.
>
> Cheers
>
>


[julia-users] How parallel count and save the results ?

2015-09-01 Thread paul analyst
How can I do the computation in parallel and save the results?
I have a large matrix A, 10^7 x 10^3, saved in an HDF5 file.
Each column is transformed and written into a matrix B of the same size, 
each column independently.
The calculations are tedious but do not load the processor heavily.
How can I do this process in parallel?
Paul
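
One pattern that fits this shape: read and transform each column on a worker with
pmap, then write from the master process. A rough sketch only; the file and
dataset names and the transform are assumptions, and for a 10^7 x 10^3 matrix the
columns would be processed in batches rather than collected all at once.

addprocs(4)
@everywhere using HDF5
@everywhere transform(col) = col .* 2.0     # stand-in for the real per-column work

ncols = 100                                 # one batch of columns
cols = pmap(i -> transform(vec(h5read("A.h5", "A", (:, i)))), 1:ncols)

h5write("B.h5", "batch1", hcat(cols...))    # the master process does the writing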


[julia-users] Re: IDE for Julia

2015-09-01 Thread Jeffrey Sarnoff
What is your environment?

On Tuesday, September 1, 2015 at 10:17:58 PM UTC-4, Jeffrey Sarnoff wrote:
>
> That happened to me when I forgot to start Atom from the command line and 
> used the menu entry or shortcut instead.
>
> On Tuesday, September 1, 2015 at 9:42:58 PM UTC-4, Oleg Mikulchenko wrote:
>>
>> I agree. BTW, does someone consider Eclipse plugin for Julia? Similar to 
>> pydev. For scientific work I prefer Spyder, but for debugging Pydev is more 
>> powerful. Not that I like Eclipse, just no choice to avoid it :)
>>
>> For some reasons, Python run on Atom with Hydrogen, but Julia doesn't, it 
>> is hanging forever when I put Ctrl+Alt+Enter with highlighted code. Any 
>> advice?  
>>
>> On Tuesday, September 1, 2015 at 7:05:59 AM UTC-7, STAR0SS wrote:
>>>
>>> I think a good IDE should have:
>>>
>>> - A proper console and a good way to send single line and block of codes 
>>> to it (e.g. matlab's code section)
>>> - A decent text editor
>>> - Integrated plots
>>> - Proper window management (docking, etc) so you don't have windows 
>>> everywhere 
>>>
>>> All this meet two others broad and important goals, namely having a good 
>>> plotting solution and being able to make GUI applications in Julia.
>>> For these reasons I feel like something like Julietta was the way to go.
>>>
>>>

[julia-users] Re: IDE for Julia

2015-09-01 Thread Jeffrey Sarnoff
That happened to me when I forgot to start Atom from the command line and 
used the menu entry or shortcut instead.

On Tuesday, September 1, 2015 at 9:42:58 PM UTC-4, Oleg Mikulchenko wrote:
>
> I agree. BTW, does someone consider Eclipse plugin for Julia? Similar to 
> pydev. For scientific work I prefer Spyder, but for debugging Pydev is more 
> powerful. Not that I like Eclipse, just no choice to avoid it :)
>
> For some reasons, Python run on Atom with Hydrogen, but Julia doesn't, it 
> is hanging forever when I put Ctrl+Alt+Enter with highlighted code. Any 
> advice?  
>
> On Tuesday, September 1, 2015 at 7:05:59 AM UTC-7, STAR0SS wrote:
>>
>> I think a good IDE should have:
>>
>> - A proper console and a good way to send single line and block of codes 
>> to it (e.g. matlab's code section)
>> - A decent text editor
>> - Integrated plots
>> - Proper window management (docking, etc) so you don't have windows 
>> everywhere 
>>
>> All this meet two others broad and important goals, namely having a good 
>> plotting solution and being able to make GUI applications in Julia.
>> For these reasons I feel like something like Julietta was the way to go.
>>
>>

[julia-users] Re: ANN: JuMP 0.10 released

2015-09-01 Thread Joey Huchette

On Tuesday, September 1, 2015 at 1:27:25 PM UTC-4, Fabrizio Lacalandra 
wrote:
>
> Very nice job! May I suggest starting to think about:
>
> 1) @removeconstraint
>

Should be pretty easy to add a function that does this in just a few lines 
of code.
 

> 2) @defvar with indication on priority branching for integer vars, 
> accepted by Cplex/Gurobi at least
>

It's not quite at the JuMP level, but you can set the branching priority 
[for 
CPLEX](https://github.com/JuliaOpt/CPLEX.jl/blob/25ebbf1c8444c961045b9b15b1a225432eadb811/src/cpx_solve.jl#L15-L37).
 

> 3) Some kind of special Constraint Programming-like construct, such as 
> classic allDiff and more advanced things?
>

This is definitely on the agenda! It will probably be most natural as a 
separate package, but this seems like a natural choice for the next "big" 
project.
 

> 4) Start a discussion on how modularity in the problem construction can be 
> enhanced (not sure in which direction)
>

We'd definitely appreciate any user feedback on how this should work.
 

>
> BTW do we have a wish list of JuMP somewhere ?
>

No official list, but feel free to open issues for feature requests on the 
Github page.
 

>
> Thanks
> Fabrizio 
>
> On Tuesday, September 1, 2015 at 6:41:21 AM UTC+2, Miles Lubin wrote:
>>
>> The JuMP team is happy to announce the release of JuMP 0.10.
>>
>> This is a major release with the greatest amount of new functionality 
>> since the addition of nonlinear modeling last year. This will likely be the 
>> last major release of JuMP to support Julia 0.3. Thanks to the heroic work 
>> of Joey Huchette, JuMP now supports *vectorized syntax* and modeling for 
>> *semidefinite 
>> programming*.
>>
>> You can now write, for example:
>>
>> @defVar(m, x[1:5])
>> @addConstraint(m, A*x .== 0)
>>
>> where A is a Julia matrix (dense or sparse). Note that we require dot 
>> comparison operators .== (and similarly .<= and .>=) for vectorized 
>> constraints. The vectorized syntax extends to quadratic but not general 
>> nonlinear expressions.
>>
>> An important new concept to keep in mind is that this vectorized syntax 
>> only applies to sets of variables which are one-based arrays. If you 
>> declare variables indexed by more complicated sets, e.g.,
>>
>> @defVar(m, y[3:5])
>> s = [:cat, :dog, :pizza]
>> @defVar(m, z[s])
>>
>> then dot(y,z) and rand(3,3)*z are undefined. A result of this new 
>> concept of one-based arrays is that x above now has the type 
>> Vector{JuMP.Variable}. In this case, getValue() now returns a 
>> Vector{Float64} instead of an opaque JuMP object. We hope users find 
>> this new distinction between one-indexed array variables and all other 
>> symbolically indexed variables useful and intuitive (if not, let us know).
>>
>> For semidefinite modeling, you can declare variables as SDP matrices and 
>> add LMI (linear matrix inequality) constraints as illustrated in the 
>> examples for minimal ellipse and max cut, among others.
>>
>> We also have a *new syntax for euclidean norms:*
>>
>> @addConstraint(m, norm2{c[i]*x[i]+b[i],i=1:N} <= 10)
>> # or
>> @addConstraint(m, norm(c.*x+b) <= 10)
>>
>> You may be wondering how JuMP compares with Convex.jl given these new 
>> additions. Not much has changed philosophically; JuMP directly translates 
>> SDP constraints and euclidean norms into the sparse matrix formats as 
>> required by conic solvers. Unlike Convex.jl, *JuMP accepts only 
>> standard-form SDP and second-order conic constraints and will not perform 
>> any automatic transformations* such as modeling nuclear norms, minimum 
>> eigenvalue, geometric mean, rational norms, etc. We would recommend using 
>> Convex.jl for easy modeling of such functions. Our focus, for now, is on 
>> the large-scale performance and stability of the huge amount of new syntax 
>> introduced in this release.
>>
>> Also notable in this release:
>> - JuMP models now store a dictionary of attached variables, so that you 
>> can look up a variable from a model by name by using the new getVar() 
>> method.
>> - On Julia 0.4 only, you can now have a filter variable declarations, 
>> e.g.,
>> @defVar(m, x[i=1:5,j=1:5; i+j >= 3])
>> will only create variables for the indices which satisfy the filter 
>> condition. (These are not one-based arrays as introduced above.)
>> - Dual multipliers are available for nonlinear problems from the solvers 
>> which provide them
>> - There is improved documentation for querying derivatives from a 
>> nonlinear JuMP model
>> - *We now try to print warnings for two common performance traps*: 
>> calling getValue() in a tight loop and using operator overloading to 
>> construct large JuMP 

Re: [julia-users] Re: Pyplot graphic error

2015-09-01 Thread Juan Carlos Cuevas Bautista
Hi Steve,

After some tests with my plots and some googling, I changed my version of Gnuplot
from 4.6 patchlevel 4 to 4.6 patchlevel 6, and Julia is working 
perfectly now.
I don't know how Julia and Gnuplot are related, but it works.

On Thursday, August 27, 2015 at 8:24:43 AM UTC-4, Juan Carlos Cuevas 
Bautista wrote:
>
> Hi Steve, 
>
> I am not using Ijulia, I am using just the terminal. I used the command 
>
> savefig("exponentialjl.pdf"); 
>
> to save my figure in julia. On the other hand I save the figure like 
> png in python 
> savefig('exponential.png') 
>
> and it's working perfectly. Actually I also use octave and the plots 
> look kind of messy too, specially the legends. 
> I think it can be a problem with gnuplot, but I know that gnuplot is 
> not related with julia, so I don't know why I have 
> this error in julia. 
>
> 2015-08-27 8:04 GMT-04:00 Steven G. Johnson : 
> > Are you using PyPlot from IJulia?  It uses PNG to display the image.   
> What 
> > happens if you savefig('foo.png') in Python? 
>


[julia-users] Tk (Cairo) Canvas not scrollable?

2015-09-01 Thread j verzani
Maybe add a sized Canvas into a TkCanvas and add scroll bars to the latter?

[julia-users] Re: IDE for Julia

2015-09-01 Thread Oleg Mikulchenko
I agree. BTW, has anyone considered an Eclipse plugin for Julia, similar to 
PyDev? For scientific work I prefer Spyder, but for debugging PyDev is more 
powerful. Not that I like Eclipse; there's just no way to avoid it :)

For some reason, Python runs on Atom with Hydrogen, but Julia doesn't; it 
hangs forever when I press Ctrl+Alt+Enter with code highlighted. Any 
advice?

On Tuesday, September 1, 2015 at 7:05:59 AM UTC-7, STAR0SS wrote:
>
> I think a good IDE should have:
>
> - A proper console and a good way to send single line and block of codes 
> to it (e.g. matlab's code section)
> - A decent text editor
> - Integrated plots
> - Proper window management (docking, etc) so you don't have windows 
> everywhere 
>
> All this meet two others broad and important goals, namely having a good 
> plotting solution and being able to make GUI applications in Julia.
> For these reasons I feel like something like Julietta was the way to go.
>
>

Re: [julia-users] Re: MongoDB and Julia

2015-09-01 Thread Ferenc Szalma
Alright. So I had to check out LibBSON and build it separately from Mongo; 
after restarting the kernel, insert works in v0.4, too.

On Tuesday, September 1, 2015 at 7:14:40 PM UTC-4, Ferenc Szalma wrote:
>
> I just tried `delete` and `find` and these two operations work. `insert` 
> however does not work.
>
> On Tuesday, September 1, 2015 at 3:41:27 PM UTC-4, Tim Lebel wrote:
>>
>> Did you try running Pkg.checkout("Mongo")? I believe that I fixed this, 
>> but the maintainer may not have pushed a new tag.
>>
>> On Tue, Sep 1, 2015 at 9:14 AM, Ferenc Szalma  wrote:
>>
>>> Kevin,
>>>
>>> I also managed to get Pzion's Mongo.jl to work in Julia v0.3. Now, I am 
>>> trying to make it work in v0.4 but getting an error message while trying to 
>>> insert:
>>>
>>> oid = insert(collection, {"name"=>"time series"})
>>>
>>>
>>>
>>> WARNING: deprecated syntax "{a=>b, ...}" at In[36]:1. Use 
>>> "Dict{Any,Any}(a=>b, 
>>> ...)" instead. 
>>>
>>> LoadError: MethodError: `convert` has no method matching convert(::Type{
>>> Ptr{Void}}, ::Array{UInt8,1})
>>> This may have arisen from a call to the constructor Ptr{Void}(...),
>>> since type constructors fall back to convert methods.
>>> Closest candidates are:
>>>  call{T}(::Type{T}, ::Any)
>>>  convert{T}(::Type{Ptr{T}}, !Matched::UInt64)
>>>  convert{T}(::Type{Ptr{T}}, !Matched::Int64)
>>>  ...
>>> while loading In[36], in expression starting on line 1
>>>  
>>>
>>>  in insert at /Users/szalmaf/.julia/v0.4/Mongo/src/MongoCollection.jl:42 
>>>
>>>
>>> Did you try Mongo.jl in Julia v0.4? Do you have any suggestions as to 
>>> how to go about getting rid of the LoadError above? It seems like a generic 
>>> problem when switching from v0.3 to v0.4.
>>>
>>> Cheers
>>>
>>>
>>

[julia-users] threads and processes, @async vs. @spawn/@parallel?

2015-09-01 Thread John Brock
I've read through the parallel computing documentation and experimented 
with some toy examples, but I still have some questions about the 
fundamentals of parallel programming in Julia:

   1. It seems that @async performs work in a separate green thread, while 
   @spawn performs work in a separate julia process. Is that right?
   2. Will code executed with several calls to @async actually run on 
   separate cores?
   3. Getting true parallelism with @spawn or @parallel requires launching 
   julia with the -p flag or using addprocs(...). Does the same hold for 
   @async, i.e., will I get true parallelism with several calls to @async if I 
   only have a single julia process?
   4. In what situations should I choose @spawn over @async, and vice versa?
   5. How does scope and serialization work with regards to @async? If the 
   code being executed with @async references some Array, will each thread get 
   a copy of that Array, like if I had called @spawn instead? Or will each 
   thread have access to the same Array, obviating the need for SharedArray 
   when using @async?
   

A lot of this stuff is left ambiguous in the documentation, but I'd be 
happy to submit a pull request with updates if I can get some clear 
answers. Thanks!


-John


Re: [julia-users] Re: MongoDB and Julia

2015-09-01 Thread Ferenc Szalma
I just tried `delete` and `find` and these two operations work. `insert` 
however does not work.

On Tuesday, September 1, 2015 at 3:41:27 PM UTC-4, Tim Lebel wrote:
>
> Did you try running Pkg.checkout("Mongo")? I believe that I fixed this, 
> but the maintainer may not have pushed a new tag.
>
> On Tue, Sep 1, 2015 at 9:14 AM, Ferenc Szalma  > wrote:
>
>> Kevin,
>>
>> I also managed to get Pzion's Mongo.jl to work in Julia v0.3. Now, I am 
>> trying to make it work in v0.4 but getting an error message while trying to 
>> insert:
>>
>> oid = insert(collection, {"name"=>"time series"})
>>
>>
>>
>> WARNING: deprecated syntax "{a=>b, ...}" at In[36]:1. Use 
>> "Dict{Any,Any}(a=>b, 
>> ...)" instead. 
>>
>> LoadError: MethodError: `convert` has no method matching convert(::Type{
>> Ptr{Void}}, ::Array{UInt8,1})
>> This may have arisen from a call to the constructor Ptr{Void}(...),
>> since type constructors fall back to convert methods.
>> Closest candidates are:
>>  call{T}(::Type{T}, ::Any)
>>  convert{T}(::Type{Ptr{T}}, !Matched::UInt64)
>>  convert{T}(::Type{Ptr{T}}, !Matched::Int64)
>>  ...
>> while loading In[36], in expression starting on line 1
>>  
>>
>>  in insert at /Users/szalmaf/.julia/v0.4/Mongo/src/MongoCollection.jl:42 
>>
>>
>> Did you try Mongo.jl in Julia v0.4? Do you have any suggestions as to how 
>> to go about getting rid of the LoadError above? It seems like a generic 
>> problem when switching from v0.3 to v0.4.
>>
>> Cheers
>>
>>
>

Re: [julia-users] C interop: pointer to elementary julia type

2015-09-01 Thread Jameson Nash
use `Ref{Clong}` for the calling convention to pass an auto-allocated boxed
c-long.

it's usually best to avoid `pointer` and `pointer_from_objref`. memory is
not "guaranteed by julia": it will be cleaned up as soon as the gc detects
that you are no longer referencing the julia object. using `pointer` and
`pointer_from_objref` hides memory from the gc, making it easier for you to
create memory bugs in your program.
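
For example, a minimal sketch of that calling convention, reusing the C function
and the `lib` placeholder from your snippet (expected output shown as a comment):

# C side, from the original post:
#   void myfunc(long* x) { x[0] += 1; }

x = Ref{Clong}(0)                               # auto-allocated box holding a C long
ccall((:myfunc, lib), Void, (Ref{Clong},), x)   # ccall unwraps the Ref to a long*
println(x[])                                    # read the updated value back with x[]
# >> 1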


On Tue, Sep 1, 2015 at 5:46 PM Joosep Pata  wrote:

> I want to modify the data stored in an elementary julia object (e.g.
> Int64) using external C code. I can do this if the type is wrapped in an
> array, but is it possible to do something like `pointer(x::Float64) ->
> address of data in x` with the memory being guaranteed by julia?
>
> Here’s a short snippet to describe:
> ~~~
>
> #C side
> void myfunc(long* x) {
>   x[0] += 1;
> }
>
> #julia side, works
> x = Int64[0]
> ccall((:myfunc, lib), Void, (Ptr{Void}, ), convert(Ptr{Void}, pointer(x)))
> println(x)
> >> [1]
>
> #julia side, desired
> x = 0
> ccall((:myfunc, lib), Void, (Ptr{Void}, ), pointer(x))
> println(x)
> >> 1
> ~~~
>
> I tried pointer_from_objref but as I understand this gives me the pointer
> of the whole julia type with meta-data etc. If I write to this address in
> C, I fear I’ll have problems.


Re: [julia-users] OSX support for package testing on Travis is now available by default!

2015-09-01 Thread Tony Kelman
I know, click blame, I put it there. Not every package is using that 
template though. If you see packages still using the old `language: cpp` 
travis template, now there's a really good reason to file PRs to update 
them. Keep an eye out, though, for whether existing packages need any extra 
apt-get binary dependencies; you likely still need to handle those manually 
or with the apt 
addon: http://docs.travis-ci.com/user/apt/#Adding-APT-Packages


On Tuesday, September 1, 2015 at 2:00:33 PM UTC-7, Mauro wrote:
>
> It's now actually in the autogenerated travis.yml file: 
>
> https://github.com/JuliaLang/julia/blob/aa8cd2e80d244a203e3774f8472619afb3ea1fe5/base/pkg/generate.jl#L171
>  
>
> On Tue, 2015-09-01 at 22:02, Elliot Saba  
> wrote: 
> > That's awesome!  Many thanks to everyone who continually chips away at 
> all 
> > the usability problems to make this as easy to use as possible! 
> > -E 
> > 
> > On Tue, Sep 1, 2015 at 12:52 PM, Tony Kelman  > wrote: 
> > 
> >> Package authors may find this useful, you formerly had to send an email 
> >> requesting the feature be enabled per repository but now it's available 
> for 
> >> all. Ref 
> >> 
> https://github.com/travis-ci/docs-travis-ci-com/commit/8a4efe6e6bfb0dcce760eedabd2ffe640d6545d5
>  
> >> 
> >> Assuming you're using language: julia in your .travis.yml file, this 
> >> should be as simple as adding 
> >> 
> >> os: 
> >>   - linux 
> >>   - osx 
> >> 
> >> 
> >> 
> >> 
>
>

[julia-users] C interop: pointer to elementary julia type

2015-09-01 Thread Joosep Pata
I want to modify the data stored in an elementary julia object (e.g. Int64) 
using external C code. I can do this if the type is wrapped in an array, but is 
it possible to do something like `pointer(x::Float64) -> address of data in x` 
with the memory being guaranteed by julia?

Here’s a short snippet to describe:
~~~

#C side
void myfunc(long* x) {
  x[0] += 1;
}

#julia side, works
x = Int64[0]
ccall((:myfunc, lib), Void, (Ptr{Void}, ), convert(Ptr{Void}, pointer(x)))
println(x)
>> [1]

#julia side, desired
x = 0
ccall((:myfunc, lib), Void, (Ptr{Void}, ), pointer(x))
println(x)
>> 1
~~~

I tried pointer_from_objref but as I understand this gives me the pointer of 
the whole julia type with meta-data etc. If I write to this address in C, I 
fear I’ll have problems.

Re: [julia-users] C interop: pointer to elementary julia type

2015-09-01 Thread Joosep Pata
Thanks Jameson,
I want to later use `x` in auto-generated expressions, and if it's a Ref I 
will have to use `x.x` to get the actual data, which makes generating the 
expressions more tricky, so I thought I could perhaps avoid it.

Here's a more complete snippet:
~~~
type Foo
  x::Int64
  y::Vector{Int64}
end

Foo() = Foo(0, zeros(Int64, 10))

bar = :(f.x + f.y[2])

expr = :(
f = Foo()
for i=1:10
  ccall((:myfunc_x, lib), Void, (Ptr{Void}, ), pointer(h.x)) #fill the 
scalar f.x
  ccall((:myfunc_y, lib), Void, (Ptr{Void}, ), pointer(h.y)) #fill the 
array f.y
  r = $(eval(bar)) #do something with f.x and f.y
end
#done with f, OK to clean up
)
~~~ 

Does this seem like an OK approach with Refs and is there a way to avoid 
needing to do `h.x.x` in the expressions?

On Tuesday, 1 September 2015 23:51:34 UTC+2, Jameson wrote:
>
> use `Ref{Clong}` for the calling convention to pass an auto-allocated 
> boxed c-long.
>
> it's usually best to avoid `pointer` and `pointer_from_objref`. memory is 
> not "guaranteed by julia": it will be cleaned up as soon as the gc detects 
> that you are no longer referencing the julia object. using `pointer` and 
> `pointer_from_objref` hides memory from the gc, making it easier for you to 
> create memory bugs in your program.
>
>
> On Tue, Sep 1, 2015 at 5:46 PM Joosep Pata  > wrote:
>
>> I want to modify the data stored in an elementary julia object (e.g. 
>> Int64) using external C code. I can do this if the type is wrapped in an 
>> array, but is it possible to do something like `pointer(x::Float64) -> 
>> address of data in x` with the memory being guaranteed by julia?
>>
>> Here’s a short snippet to describe:
>> ~~~
>>
>> #C side
>> void myfunc(long* x) {
>>   x[0] += 1;
>> }
>>
>> #julia side, works
>> x = Int64[0]
>> ccall((:myfunc, lib), Void, (Ptr{Void}, ), convert(Ptr{Void}, pointer(x)))
>> println(x)
>> >> [1]
>>
>> #julia side, desired
>> x = 0
>> ccall((:myfunc, lib), Void, (Ptr{Void}, ), pointer(x))
>> println(x)
>> >> 1
>> ~~~
>>
>> I tried pointer_from_objref but as I understand this gives me the pointer 
>> of the whole julia type with meta-data etc. If I write to this address in 
>> C, I fear I’ll have problems.
>
>

Re: [julia-users] C interop: pointer to elementary julia type

2015-09-01 Thread Jameson Nash
I still have no idea what you are trying to do, but punning on types like
your example shows is going to get you in deep trouble with both Julia and
C. (using Expr and quoted ASTs will also cause you issues, but for
completely different reasons).

```
type Foo
  x::Clong
  y::Vector{Clong}
end

Foo() = Foo(0, zeros(Clong, 10))

bar(f::Foo) = f.x + f.y[2]

f = Foo()
for i=1:10
  ccall((:myfunc_x, lib), Void, (Ref{Foo}, ), f) #fill the scalar f.x
  ccall((:myfunc_y, lib), Void, (Ref{Clong}, ), f.y) #fill the array f.y
  r = bar(f)
end
```
```
struct Foo {
  long x;
  void *y;
};
void myfunc_x(struct Foo*);
void myfunc_y(long*);
```


On Tue, Sep 1, 2015 at 6:06 PM Joosep Pata  wrote:

> Thanks Jameson,
> I want to later use `x` in auto-generated expressions, and if it's a Ref I
> will have to use `x.x` to get the actual data which makes generating the
> expressions more tricky, I thought I could perhaps avoid it.
>
> Here's a more complete snippet:
> ~~~
> type Foo
>   x::Int64
>   y::Vector{Int64}
> end
>
> Foo() = Foo(0, zeros(Int64, 10))
>
> bar = :(f.x + f.y[2])
>
> expr = :(
> f = Foo()
> for i=1:10
>   ccall((:myfunc_x, lib), Void, (Ptr{Void}, ), pointer(h.x)) #fill the
> scalar f.x
>   ccall((:myfunc_y, lib), Void, (Ptr{Void}, ), pointer(h.y)) #fill the
> array f.y
>   r = $(eval(bar)) #do something with f.x and f.y
> end
> #done with f, OK to clean up
> )
> ~~~
>
> Does this seem like an OK approach with Refs and is there a way to avoid
> needing to do `h.x.x` in the expressions?
>
> On Tuesday, 1 September 2015 23:51:34 UTC+2, Jameson wrote:
>
>> use `Ref{Clong}` for the calling convention to pass an auto-allocated
>> boxed c-long.
>>
>> it's usually best to avoid `pointer` and `pointer_from_objref`. memory is
>> not "guaranteed by julia": it will be cleaned up as soon as the gc detects
>> that you are no longer referencing the julia object. using `pointer` and
>> `pointer_from_objref` hides memory from the gc, making it easier for you to
>> create memory bugs in your program.
>>
>>
>> On Tue, Sep 1, 2015 at 5:46 PM Joosep Pata  wrote:
>>
>>> I want to modify the data stored in an elementary julia object (e.g.
>>> Int64) using external C code. I can do this if the type is wrapped in an
>>> array, but is it possible to do something like `pointer(x::Float64) ->
>>> address of data in x` with the memory being guaranteed by julia?
>>>
>>> Here’s a short snippet to describe:
>>> ~~~
>>>
>>> #C side
>>> void myfunc(long* x) {
>>>   x[0] += 1;
>>> }
>>>
>>> #julia side, works
>>> x = Int64[0]
>>> ccall((:myfunc, lib), Void, (Ptr{Void}, ), convert(Ptr{Void},
>>> pointer(x)))
>>> println(x)
>>> >> [1]
>>>
>>> #julia side, desired
>>> x = 0
>>> ccall((:myfunc, lib), Void, (Ptr{Void}, ), pointer(x))
>>> println(x)
>>> >> 1
>>> ~~~
>>>
>>> I tried pointer_from_objref but as I understand this gives me the
>>> pointer of the whole julia type with meta-data etc. If I write to this
>>> address in C, I fear I’ll have problems.
>>
>>


[julia-users] Re: @parallel and HDF5 is posible to work together?

2015-09-01 Thread Nils Gudat
As the warning says, the module is not defined on the workers. Do

addprocs()
import HDF5
@everywhere using HDF5


Re: [julia-users] OSX support for package testing on Travis is now available by default!

2015-09-01 Thread Mauro
It's now actually in the autogenerated travis.yml file:
https://github.com/JuliaLang/julia/blob/aa8cd2e80d244a203e3774f8472619afb3ea1fe5/base/pkg/generate.jl#L171

On Tue, 2015-09-01 at 22:02, Elliot Saba  wrote:
> That's awesome!  Many thanks to everyone who continually chips away at all
> the usability problems to make this as easy to use as possible!
> -E
>
> On Tue, Sep 1, 2015 at 12:52 PM, Tony Kelman  wrote:
>
>> Package authors may find this useful, you formerly had to send an email
>> requesting the feature be enabled per repository but now it's available for
>> all. Ref
>> https://github.com/travis-ci/docs-travis-ci-com/commit/8a4efe6e6bfb0dcce760eedabd2ffe640d6545d5
>>
>> Assuming you're using language: julia in your .travis.yml file, this
>> should be as simple as adding
>>
>> os:
>>   - linux
>>   - osx
>>
>>
>>
>>



Re: [julia-users] Re: MongoDB and Julia

2015-09-01 Thread Ferenc Szalma
Yes, I also tried running Pkg.checkout("Mongo"). I also checked that the 
original MongoCollection.jl had find(...), while the checked-out version 
has Base.find(...) methods. I also restarted the kernel to make sure it 
picked up the changes. 

Could you test that Mongo works properly with Julia v0.4 on your side? 
It'd also be nice to get a hint on what this convert(::Type{Ptr{Void}}, 
::Array{UInt8,1}) is and what the problem with it is. I am pretty new to Julia.


On Tuesday, September 1, 2015 at 3:41:27 PM UTC-4, Tim Lebel wrote:
>
> Did you try running Pkg.checkout("Mongo")? I believe that I fixed this, 
> but the maintainer may not have pushed a new tag.
>
> On Tue, Sep 1, 2015 at 9:14 AM, Ferenc Szalma  > wrote:
>
>> Kevin,
>>
>> I also managed to get Pzion's Mongo.jl to work in Julia v0.3. Now, I am 
>> trying to make it work in v0.4 but getting an error message while trying to 
>> insert:
>>
>> oid = insert(collection, {"name"=>"time series"})
>>
>>
>>
>> WARNING: deprecated syntax "{a=>b, ...}" at In[36]:1. Use 
>> "Dict{Any,Any}(a=>b, 
>> ...)" instead. 
>>
>> LoadError: MethodError: `convert` has no method matching convert(::Type{
>> Ptr{Void}}, ::Array{UInt8,1})
>> This may have arisen from a call to the constructor Ptr{Void}(...),
>> since type constructors fall back to convert methods.
>> Closest candidates are:
>>  call{T}(::Type{T}, ::Any)
>>  convert{T}(::Type{Ptr{T}}, !Matched::UInt64)
>>  convert{T}(::Type{Ptr{T}}, !Matched::Int64)
>>  ...
>> while loading In[36], in expression starting on line 1
>>  
>>
>>  in insert at /Users/szalmaf/.julia/v0.4/Mongo/src/MongoCollection.jl:42 
>>
>>
>> Did you try Mongo.jl in Julia v0.4? Do you have any suggestions as to how 
>> to go about getting rid of the LoadError above? It seems like a generic 
>> problem when switching from v0.3 to v0.4.
>>
>> Cheers
>>
>>
>

[julia-users] Re: iFastSum (correctly rounded sums) available

2015-09-01 Thread Kristoffer Carlsson
On line 20, changing xs[:] = x[:] to copy!(xs, x) makes it 25% faster.
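
For anyone curious why that helps, a tiny sketch of the difference, assuming 
xs and x are same-length Vector{Float64}s:

xs[:] = x[:]    # x[:] first allocates a temporary copy of x, then copies it into xs
copy!(xs, x)    # copies element-wise straight into xs, with no temporary allocation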


On Tuesday, September 1, 2015 at 11:14:59 AM UTC+2, Jeffrey Sarnoff wrote:
>
>
> https://github.com/J-Sarnoff/IFastSum.jl
>
> On Tuesday, September 1, 2015 at 5:14:20 AM UTC-4, Jeffrey Sarnoff wrote:
>>
>> This implements an algorithm that gives the correctly rounded sum of a 
>> Vector{AbstractFloat}. I wanted to know if it really worked.  It appears 
>> so.  AFAIK, this works properly with any size vector; I tested it with 
>> 5_000, 100_000, and 10_000_000 items.  I think of it as an an order of 
>> magnitude faster than summing BigFloats to get the correctly rounded result.
>>
>>

[julia-users] Re: iFastSum (correctly rounded sums) available

2015-09-01 Thread Jeffrey Sarnoff

https://github.com/J-Sarnoff/IFastSum.jl

On Tuesday, September 1, 2015 at 5:14:20 AM UTC-4, Jeffrey Sarnoff wrote:
>
> This implements an algorithm that gives the correctly rounded sum of a 
> Vector{AbstractFloat}. I wanted to know if it really worked.  It appears 
> so.  AFAIK, this works properly with any size vector; I tested it with 
> 5_000, 100_000, and 10_000_000 items.  I think of it as an an order of 
> magnitude faster than summing BigFloats to get the correctly rounded result.
>
>

[julia-users] Re: iFastSum (correctly rounded sums) available

2015-09-01 Thread Jeffrey Sarnoff
great -- thanks.

On Tuesday, September 1, 2015 at 5:28:49 AM UTC-4, Kristoffer Carlsson 
wrote:
>
> On line 20, changing xs[:] = x[:] to copy!(xs, x) makes it 25% faster.
>
>
> On Tuesday, September 1, 2015 at 11:14:59 AM UTC+2, Jeffrey Sarnoff wrote:
>>
>>
>> https://github.com/J-Sarnoff/IFastSum.jl
>>
>> On Tuesday, September 1, 2015 at 5:14:20 AM UTC-4, Jeffrey Sarnoff wrote:
>>>
>>> This implements an algorithm that gives the correctly rounded sum of a 
>>> Vector{AbstractFloat}. I wanted to know if it really worked.  It appears 
>>> so.  AFAIK, this works properly with any size vector; I tested it with 
>>> 5_000, 100_000, and 10_000_000 items.  I think of it as an an order of 
>>> magnitude faster than summing BigFloats to get the correctly rounded result.
>>>
>>>

[julia-users] iFastSum (correctly rounded sums) available

2015-09-01 Thread Jeffrey Sarnoff
This implements an algorithm that gives the correctly rounded sum of a 
Vector{AbstractFloat}. I wanted to know if it really worked.  It appears 
so.  AFAIK, this works properly with vectors of any size; I tested it with 
5_000, 100_000, and 10_000_000 items.  I think of it as an order of 
magnitude faster than summing BigFloats to get the correctly rounded result.



[julia-users] Re: Type stability (or not) in core stats functions

2015-09-01 Thread Michael Francis
2) The underlying functions are only stable if the mean passed to them is 
of the correct type, e.g. a number. Essentially this is a type inference 
issue: if the compiler were able to optimize the branches away, it would 
likely be ok, but it looks from the LLVM code like this is not the case today. 

FWIW using a type stable version (e.g. directly calling covm) looks to be 
about 18% faster for small (100 element) AbstractArray pairs. 
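
For reference, a sketch of that direct-call path for cor (this relies on the 
un-exported Base.corm helper quoted below from statistics.jl, so it is internal 
and could change between releases):

x = randn(100); y = randn(100)

cor(x, y)                           # keyword-argument path; return type inferred as Any
Base.corm(x, mean(x), y, mean(y))   # same result, but the return type is inferable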

On Monday, August 31, 2015 at 9:06:58 PM UTC-4, Sisyphuss wrote:
>
> IMO:
> 1) This is called keyword argument (not named optional argument).
> 2) The returned value depends only on `corzm`, and `corm`. If these two 
> functions are type stable, then `cor` is type stable.
> 3) I'm not sure whether this is the "correct" way to write this function.
>
> On Monday, August 31, 2015 at 11:48:37 PM UTC+2, Michael Francis wrote:
>>
>> The following is taken from statistics.jl line 428 
>>
>> function cor(x::AbstractVector, y::AbstractVector; mean=nothing)
>> mean == 0 ? corzm(x, y) :
>> mean == nothing ? corm(x, Base.mean(x), y, Base.mean(y)) :
>> isa(mean, (Number,Number)) ? corm(x, mean[1], y, mean[2]) :
>> error("Invalid value of mean.")
>> end
>>
>> due to the 'mean' initially having a type of 'Nothing' I am unable to 
>> inference the return type of the function - the following will return Any 
>> for the return type.
>>
>> rt = {}
>> for x in Base._methods(f,types,-1)
>> linfo = x[3].func.code
>> (tree, ty) = Base.typeinf(linfo, x[1], x[2])
>> push!(rt, ty)
>> end
>>
>> Each of the underlying functions are type stable when called directly. 
>>
>> Code lowered doesn't give much of a pointer to what will actually happen 
>> here, 
>>
>> julia> code_lowered( cor, ( Vector{Float64}, Vector{Float64} ) )
>> 1-element Array{Any,1}:
>>  :($(Expr(:lambda, {:x,:y}, {{},{{:x,:Any,0},{:y,:Any,0}},{}}, :(begin $(
>> Expr(:line, 429, symbol("statistics.jl"), symbol("")))
>> return __cor#195__(nothing,x,y)
>> end
>>
>>
>> If I re-write with a regular optional arg for the mean 
>>
>> code_lowered( cordf, ( Vector{Float64}, Vector{Float64}, Nothing ) )
>> 1-element Array{Any,1}:
>>  :($(Expr(:lambda, {:x,:y,:mean}, {{},{{:x,:Any,0},{:y,:Any,0},{:mean,:
>> Any,0}},{}}, :(begin  # none, line 2:
>> unless mean == 0 goto 0
>> return corzm(x,y)
>> 0: 
>> unless mean == nothing goto 1
>> return corm(x,((top(getfield))(Base,:mean))(x),y,((top(getfield
>> ))(Base,:mean))(y))
>> 1: 
>> unless isa(mean,(top(tuple))(Number,Number)) goto 2
>> return corm(x,getindex(mean,1),y,getindex(mean,2))
>> 2: 
>> return error("Invalid value of mean.")
>> end
>>
>> The LLVM code does not look very clean, If I have a real type for the 
>> mean (say Float64 ) it looks better  88 lines vs 140 
>>
>> julia> code_llvm( cor, ( Vector{Float64}, Vector{Float64}, Nothing ) )
>>
>>
>> define %jl_value_t* @julia_cordf_20322(%jl_value_t*, %jl_value_t*, %
>> jl_value_t*) {
>> top:
>>   %3 = alloca [7 x %jl_value_t*], align 8
>>   %.sub = getelementptr inbounds [7 x %jl_value_t*]* %3, i64 0, i64 0
>>   %4 = getelementptr [7 x %jl_value_t*]* %3, i64 0, i64 2, !dbg !949
>>   store %jl_value_t* inttoptr (i64 10 to %jl_value_t*), %jl_value_t** %.
>> sub, align 8
>>   %5 = getelementptr [7 x %jl_value_t*]* %3, i64 0, i64 1, !dbg !949
>>   %6 = load %jl_value_t*** @jl_pgcstack, align 8, !
>> ...
>
>

[julia-users] [ANN] Conda.jl: using conda package manager for Julia

2015-09-01 Thread Luthaf

Hi Julians!

I am happy to present the Conda.jl package, a binary dependency manager for 
Julia based on the open-source conda package manager.


Some interesting features of the Conda package manager:
 - You can easily add your own software and use your own channel for 
software distribution;
 - You can install packages as a non-root user on Linux;
 - Conda is cross-platform, and you can use it for all your binary 
dependencies, provided the binaries have been uploaded.


I'd love to have your input on the code or the functionality.

Cheers
Guillaume



[julia-users] Re: [ANN] Conda.jl: using conda package manager for Julia

2015-09-01 Thread Jeffrey Sarnoff
It would help to have explicit examples for adding a package from Julia 
package central (I assume this is the default) and adding one from some other 
github location.

On Tuesday, September 1, 2015 at 8:42:31 AM UTC-4, Luthaf wrote:
>
> Hi Julians! 
>
> I am happy to present you the Conda.jl 
>  package, a binary dependencies 
> manager for Julia based on the open-source conda 
>  package manager.
>
> Some interesting features of the Conda package manager:
>  - You can easily add your own software and use your own channel for 
> software distribution;
>  - You can install packages as non root on Linux;
>  - Conda is cross-plateforme, and you can use it for all your binary 
> dependencies, provided the binaries have been uploaded.
>
> I'll love to have your input on the code or the functionalities.
>
> Cheers
> Guillaume
>


Re: [julia-users] Re: Pyplot graphic error

2015-09-01 Thread Steven G. Johnson


On Tuesday, September 1, 2015 at 4:55:10 PM UTC-4, Juan Carlos Cuevas 
Bautista wrote:
>
> After some test in my plots and googling. I changed my version of Gnuplot
> from 4.6. patchlevel 4 to gnuplot 4.6 patchlevel 6 and  julia is working 
> perfectly now.
> I don't know how julia and Gnuplot are related but it works.
>

They aren't directly related, but a lot of different programs use the 
libpng shared library, and it sounds like you were in "DLL hell" with 
conflicting versions installed, so that matplotlib was getting confused by 
gnuplot's libpng. 


[julia-users] RasterIO.jl

2015-09-01 Thread Marcio Sales
Hello all
As a statistician and GIS practitioner, of course I'm interested in this 
package. I've made a fork and started to play with the GDAL functions, and now 
I'm in a position to start contributing. As a start, I've made a function to 
reproject a raster, with a choice to save a new file to disk or just return the 
result as a new Raster object.

I have a question about the Raster object, though. It has a Data field, and I'm 
not sure whether it loads the raster into memory or just keeps a reference to the 
values.

I would prefer to keep just the reference, so I tried to redefine the raster 
type to have a gdaldataset field that keeps the handle to the dataset. This 
way, I could write other functions to fetch data and keep only the data I need 
in memory.

However, as this reference is returned by a C function, I wonder what happens 
to it when I delete or replace the value of the Raster object I created. Is the 
reference destroyed automatically? I tried to use the finalize function but it 
didn't work.

Also, is there a way to write a function that mimics matrix indexing, like 
making raster[i,j] invoke a GDAL function to fetch the values at i,j?
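
To sketch what I have in mind (DemoRaster and fetch_pixel below are illustrative 
stand-ins, not the actual RasterIO.jl API):

import Base: getindex

immutable DemoRaster
    handle::Matrix{Float64}        # stand-in for the GDAL dataset handle
end

fetch_pixel(h, i, j) = h[i, j]     # stand-in for a ccall into a GDAL read function

getindex(r::DemoRaster, i::Integer, j::Integer) = fetch_pixel(r.handle, i, j)

r = DemoRaster(rand(3, 3))
r[2, 2]                            # indexing now goes through fetch_pixel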

Also, two suggestions for the owners:
a) a Raster(filename) constructor instead of the openraster function (if 
this is possible)
b) what about naming the package "JDAL.jl"?

 

[julia-users] Re: threads and processes, @async vs. @spawn/@parallel?

2015-09-01 Thread Steven G. Johnson


On Tuesday, September 1, 2015 at 10:02:30 PM UTC-4, John Brock wrote:
>
> I've read through the parallel computing documentation and experimented 
> with some toy examples, but I still have some questions about the 
> fundamentals of parallel programming in Julia:
>
>1. It seems that @async performs work in a separate green thread, 
>while @spawn performs work in a separate julia process. Is that right?
>
> Yes. 

>
>1. Will code executed with several calls to @async actually run on 
>separate cores?
>
> No. 

>
>1. Getting true parallelism with @spawn or @parallel requires 
>launching julia with the -p flag or using addprocs(...). Does the same 
> hold 
>for @async, i.e., will I get true parallelism with several calls to @async 
>if I only have a single julia process?
>
> @async will only ever give cooperative multitasking.

>
>1. In what situations should I choose @spawn over @async, and vice 
>versa?
>
> @async is much cheaper and is especially useful for asynchronous I/O (it 
is built on libuv, and async I/O is libuv's raison-d'être).  @spawn if you 
want multiple cores to be used simultaneously.

Note, however, that @spawn is built on @async: asynchronous I/O is used to 
track the master process's communication with multiple workers 
"simultaneously".  Basically, @async I/O is useful whenever you want to 
track communication with multiple I/O channels in a single process, with 
one I/O task waking up when read/write is available and sleeping when I/O 
is stalled, without the complexities (locking, race conditions, etc) or 
overhead of pre-emptive (kernel) threads.
 

>
>1. How does scope and serialization work with regards to @async? If 
>the code being executed with @async references some Array, will each 
> thread 
>get a copy of that Array, like if I had called @spawn instead? Or will 
> each 
>thread have access to the same Array, obviating the need for SharedArray 
>when using @async?
>
> @async uses green threads, which all share the same process address space 
(and hence will share the same array) and are all running in the same 
thread (just taking turns cooperatively).  @spawn uses separate julia 
processes (not shared-memory threads!) with their own address spaces (think 
MPI or PVM, not OpenMP).
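
A small sketch of that difference (assuming julia was started with at least one 
worker, e.g. `julia -p 2`; expected output shown as comments):

a = zeros(3)

@sync @async (a[1] = 1.0)       # green thread: mutates the same in-process array
println(a)                      # [1.0,0.0,0.0]

r = @spawn (a[2] = 2.0; a)      # the worker gets a serialized *copy* of a
println(fetch(r))               # [1.0,2.0,0.0]  -- the worker's copy
println(a)                      # [1.0,0.0,0.0]  -- the local array is unchanged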

(In the future, there is likely to be a third type of parallelism in Julia: 
shared-memory threads via @threads, ala openmp or cilk. 
 See https://github.com/JuliaLang/julia/issues/1790 and the jn/threading 
branch of Julia for the current prototype).



[julia-users] Re: Type stability (or not) in core stats functions

2015-09-01 Thread Jarrett Revels
Related: https://github.com/JuliaLang/julia/issues/9551

Unfortunately, as you've seen, type-variadic keyword arguments can really 
mess up type-inferencing. It appears that keyword argument types are pulled 
from the default arguments rather than those actually passed in at runtime:

julia> f(x; a=1, b=2) = a*x^b
f (generic function with 1 method)

julia> f(1)
1

julia> f(1, a=(3+im), b=5.15)
3.0 + 1.0im

julia> @code_typed f(1, a=(3+im), b=5.15)
1-element Array{Any,1}:
 :($(Expr(:lambda, Any[:x], Any[Any[Any[:x,Int64,0]],Any[],Any[Int64],Any[]], :(begin $(Expr(:line, 1, :none, symbol("")))
GenSym(0) = (Base.power_by_squaring)(x::Int64,2)::Int64
return (Base.box)(Int64,(Base.mul_int)(1,GenSym(0)))::Int64
end::Int64

Obviously, that specific call to f does NOT return an Int64.

I know of only two reasonable ways to handle it at the moment:

1. If you're the method author: Restrict every keyword argument to a 
declared, concrete type, which ensures that the argument isn't 
type-variadic. Yichao basically gave an example of this.
2. If you're the method caller: Manually assert the return type. You can do 
this pretty easily in most cases using a wrapper function. 
Using `f` from above as an example:

julia> g{X,A,B}(x::X, a::A, b::B) = f(x, a=a, b=b)::promote_type(X, A, B)
g (generic function with 2 methods)

julia> @code_typed g(1,2,3)
1-element Array{Any,1}:
 :($(Expr(:lambda, Any[:x,:a,:b], Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Int64,0]],Any[],Any[Int64],Any[:X,:A,:B]], :(begin  # none, line 1:
return (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Int64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Int64)::Int64
end::Int64

julia> @code_typed g(1,2,3.0)
1-element Array{Any,1}:
 :($(Expr(:lambda, Any[:x,:a,:b], Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Float64,0]],Any[],Any[Int64],Any[:X,:A,:B]], :(begin  # none, line 1:
return (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Float64,Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Float64)::Float64
end::Float64

julia> @code_typed g(1,2,3.0+im)
1-element Array{Any,1}:
 :($(Expr(:lambda, Any[:x,:a,:b], Any[Any[Any[:x,Int64,0],Any[:a,Int64,0],Any[:b,Complex{Float64},0]],Any[],Any[Int64],Any[:X,:A,:B]], :(begin  # none, line 1:
return (top(typeassert))((top(kwcall))((top(getfield))(Main,:call)::F,2,:a,a::Int64,:b,b::Complex{Float64},Main.f,(top(ccall))(:jl_alloc_array_1d,(top(apply_type))(Base.Array,Any,1)::Type{Array{Any,1}},(top(svec))(Base.Any,Base.Int)::SimpleVector,Array{Any,1},0,4,0)::Array{Any,1},x::Int64),Complex{Float64})::Complex{Float64}
end::Complex{Float64}

Thus, downstream functions can call f through g, preventing 
type-instability from "bubbling up" to the calling methods (as it would if 
they called f directly).

Best,
Jarrett

On Tuesday, September 1, 2015 at 8:39:11 AM UTC-4, Michael Francis wrote:
>
> 2) The underlying functions are only stable if the mean passed to them is 
> of the correct type, e.g. a number. Essentially this is a type inference 
> issue, if the compiler was able to optimize  the branches then it would be 
> likely be ok, it looks from the LLVM code that this is not the case today. 
>
> FWIW using a type stable version (e.g. directly calling covm) looks to be 
> about 18% faster for small (100 element) AbstractArray pairs. 
>
> On Monday, August 31, 2015 at 9:06:58 PM UTC-4, Sisyphuss wrote:
>>
>> IMO:
>> 1) This is called keyword argument (not named optional argument).
>> 2) The returned value depends only on `corzm`, and `corm`. If these two 
>> functions are type stable, then `cor` is type stable.
>> 3) I'm not sure whether this is the "correct" way to write this function.
>>
>> On Monday, August 31, 2015 at 11:48:37 PM UTC+2, Michael Francis wrote:
>>>
>>> The following is taken from statistics.jl line 428 
>>>
>>> function cor(x::AbstractVector, y::AbstractVector; mean=nothing)
>>> mean == 0 ? corzm(x, y) :
>>> mean == nothing ? corm(x, Base.mean(x), y, Base.mean(y)) :
>>> isa(mean, (Number,Number)) ? corm(x, mean[1], y, mean[2]) :
>>> error("Invalid value of mean.")
>>> end
>>>
>>> due to the 'mean' initially having a type of 'Nothing' I am unable to 
>>> inference the return type of the function - the following will return Any 
>>> for the return type.
>>>
>>> rt = {}
>>> for x in Base._methods(f,types,-1)
>>> linfo = x[3].func.code
>>> (tree, ty) = Base.typeinf(linfo, x[1], x[2])
>>> push!(rt, ty)
>>> end
>>>
>>> Each of the underlying 

[julia-users] Re: Julia 0.4 warnings and how to fix

2015-09-01 Thread Tim Wheeler
Thanks! I was trying to figure out how to do Base.== for a long time. It 
turns out the proper way to do it is as follows (found this in 
DataArrays.jl):

Base.(:(==))( ... ) 
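
For example, a minimal sketch with a throwaway type (the Point type here is just 
for illustration):

type Point
    x::Int
    y::Int
end

Base.(:(==))(a::Point, b::Point) = a.x == b.x && a.y == b.y

Point(1,2) == Point(1,2)   # true, no explicit `import Base: ==` needed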


[julia-users] Re: IDE for Julia

2015-09-01 Thread STAR0SS
It's pretty good, but it's like LightTable: there's no console, and the plot 
management is a bit wonky (plots on top of your code?).


Re: [julia-users] IDE for Julia

2015-09-01 Thread Jeffrey Sarnoff
Installing Atom+Hydrogen on Linux

*Carefully* following the directions on 
https://github.com/willwhitney/hydrogen does work.
First, install the most recent version of Atom.
If you have Python installed and python3 is the default, or you don't have 
Python installed at all, you need a minimal python2.7 for the moment.
  at the command prompt: PYTHON=python2.7 apm install hydrogen
  (give it a minute)
[if you installed python2.7 just for this, you can delete it now]
you need python3 (or python2.7, I guess -- I know it works with python3)

when you want to use atom+hydrogen
*at the command prompt*: atom (starting it from the menu does not work!)
load a Julia source file, highlight some code and press Alt+Ctrl+Enter
(the first time takes a while, so highlight e.g. 1 and press 
Alt+Ctrl+Enter, after that it goes quickly)


On Tuesday, September 1, 2015 at 9:12:23 AM UTC-4, Andrei Zh wrote:
>
> @Cedric Unless you are interested specifically in notebooks, I'd suggest 
> trying ESS mode for Emacs which has support for Julia including REPL. It 
> looks something like this 
>  (though 
> this video also uncovers sometimes annoying 
>  bug in ESS mode).
>
> On Tuesday, September 1, 2015 at 5:52:08 AM UTC+3, Cedric St-Jean wrote:
>>
>> Scott, do you have a way to run the notebooks (IJulia) inside Emacs? I 
>> run IJulia in the browser and edit code in Emacs, and would love to combine 
>> both.
>>
>> On Monday, August 31, 2015 at 9:04:28 PM UTC-4, Scott Jones wrote:
>>>
>>> The fact that Mike is working on it would make me confident of it. 
>>>  Currently all of the developers I'm working with have switched to Atom 
>>> (for Julia, C, C++, and Python work) [I've used it, and like it, but so far 
>>> I'm still sticking with Emacs, in part thanks to Yuyichao's (and others) 
>>> nice work on julia-mode.el, and also because my fingers just know Emacs 
>>> without thinking, and I haven't had time to set up Emacs bindings for Atom 
>>> yet, or find a Emacs key binding package for it].
>>>
>>> Scott
>>>
>>> On Monday, August 31, 2015 at 12:26:57 PM UTC-4, Viral Shah wrote:

 It’s mainly Mike Innes. Certainly not to discourage any other efforts, 
 but the number of people I have seen using Atom recently makes me feel 
 like 
 this could be the one. 

 -viral 



 > On 31-Aug-2015, at 7:58 pm, Kevin Squire  
 wrote: 
 > 
 > Hi Viral, just curious who is working on that development?  Your post 
 seems to imply an officially supported effort, but that doesn't mean that 
 development on other IDEs will be discouraged, I presume? :-)  (Not that 
 I'm aware of other IDEs being worked on...) 
 > 
 > Cheers, 
 >   Kevin 
 > 
 > On Monday, August 31, 2015, Viral Shah  wrote: 
 > Also, it is worth pointing out that a lot of the future IDE effort 
 (Juno 2) will be focussed around Atom. 
 > 
 > https://atom.io/packages/language-julia 
 > 
 > https://github.com/JuliaLang/atom-language-julia 
 > https://github.com/JunoLab/atom-julia-client 
 > 
 > -viral 
 > 
 > On Thursday, August 27, 2015 at 9:12:22 PM UTC+5:30, Arch Call wrote: 
 > Deb,  I use Juno all the time.  It works good for me on Windows 10, 
 and Julia version 3.11 
 > 
 > I have used R-Studio extensively in R and it is a great IDE.  Juno is 
 nowhere near as powerful, but Julia is a speed demon -- way faster than R. 
 > 
 > ...Archie 
 > 
 > On Wednesday, August 26, 2015 at 11:12:22 PM UTC-4, Deb Midya wrote: 
 > Hi, 
 > 
 > Thanks in advance. 
 > 
 > I am new to Julia and using Julia-0.3.7 on Windows 8. 
 > 
 > I am looking for an IDE for Julia (like RStudio in R). 
 > 
 > Once again, thank you very much for the time you have given.. 
 > 
 > Regards, 
 > 
 > Deb 



Re: [julia-users] Re: [ANN] Conda.jl: using conda package manager for Julia

2015-09-01 Thread Jeffrey Sarnoff
That is clearer to me; maybe include "Conda is not ... external C library" 
in the README.

On Tuesday, September 1, 2015 at 9:35:40 AM UTC-4, Luthaf wrote:
>
> I don't get what you mean. By package, you mean binary package ? In this 
> case, there is no Julia central channel (a channel is a package source in 
> conda), and you can use your very own by pushing the URL to 
> `Conda.CHANNELS` before anything else.
>
> Or if you mean Julia package, Conda is not a Julia package manager. It is 
> a binary package provider, for use with BinDeps. It allow to distribute 
> C/C++/Fortran libraries with the Julia package.
> This is only useful when building a Julia Package which one is calling an 
> external C library.
>
> Jeffrey Sarnoff a écrit : 
>
> It would help to have explicit examples for adding a package from Julia 
> package central (I assume this the default) and adding one from some other 
> github location.
>
> On Tuesday, September 1, 2015 at 8:42:31 AM UTC-4, Luthaf wrote:
>
> Hi Julians! 
>
> I am happy to present you the Conda.jl 
>  package, a binary dependencies 
> manager for Julia based on the open-source conda 
>  package manager.
>
> Some interesting features of the Conda package manager:
>  - You can easily add your own software and use your own channel for 
> software distribution;
>  - You can install packages as non root on Linux;
>  - Conda is cross-plateforme, and you can use it for all your binary 
> dependencies, provided the binaries have been uploaded.
>
> I'll love to have your input on the code or the functionalities.
>
> Cheers
> Guillaume
>
>

[julia-users] Re: IDE for Julia

2015-09-01 Thread Seth
Have you tried Hydrogen with Atom? It has all of those things 
(subjectively).

On Tuesday, September 1, 2015 at 7:05:59 AM UTC-7, STAR0SS wrote:
>
> I think a good IDE should have:
>
> - A proper console and a good way to send single line and block of codes 
> to it (e.g. matlab's code section)
> - A decent text editor
> - Integrated plots
> - Proper window management (docking, etc) so you don't have windows 
> everywhere 
>
> All this meet two others broad and important goals, namely having a good 
> plotting solution and being able to make GUI applications in Julia.
> For these reasons I feel like something like Julietta was the way to go.
>
>

[julia-users] Julia in Atom - automatic indentation within brackets

2015-09-01 Thread Tom Lee
Has anyone been able to get atom to automatically indent items within (...) 
to line up with the opening ( ?
eg

function foo(bar,
 baz) # <-- Atom (with language-julia) only auto-indents this 
line by one tab :(
stuff
end

I have just discovered the awesomeness of Atom, and right now this is the 
main behavior I am missing from emacs.

Cheers,

Tom


Re: [julia-users] Converting vectors to dictionaries back and forth

2015-09-01 Thread Yichao Yu
On Tue, Sep 1, 2015 at 9:39 AM,   wrote:
> Hi,
>
> I want to be able to convert vectors to dictionaries and the other way
> round. I have coded these two functions:
>
> function vec_to_dict(v, ll)
> d = Dict()
> assert(length(v) == length(ll))
> for i = 1:length(v)
> d[ll[i]] = v[i]
> end
> return d
> end
>
> function dict_to_vec(d)
> v = Array(Float64, 0)
> for k in keys(d)
> push!(v, d[k])
> end
> return v
> end
>
> The problem is that if I convert a vector to a dictionary first, and then
> convert it back to a vector, the vector is not ordered anymore. As an
> example, the output of this code:
>
> v0 = [1.0, 2.0, 3.0]
> ll = ["a", "b", "c"]
> d = vec_to_dict(v0, ll)
> v1 = dict_to_vec(d)
> println(v0)
> println(d)
> println(v1)
>
> is:
>
> [1.0,2.0,3.0]
> {"c"=>3.0,"b"=>2.0,"a"=>1.0}
> [3.0,2.0,1.0]
>
> Is there some way to get the initial vector back with such a transformation
> using a dictionary?

https://github.com/JuliaLang/DataStructures.jl#ordereddicts-and-orderedsets
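
For the example in the original post, a sketch using an OrderedDict (assuming the
DataStructures package is installed):

using DataStructures

v0 = [1.0, 2.0, 3.0]
ll = ["a", "b", "c"]

d = OrderedDict()                 # preserves the insertion order of the keys
for (k, v) in zip(ll, v0)
    d[k] = v
end

v1 = collect(values(d))           # [1.0,2.0,3.0], same order as v0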

>
> Thanks,


[julia-users] Re: Julia in Atom - automatic indentation within brackets

2015-09-01 Thread Jeffrey Sarnoff
I tried some of the available indent-related packages without success.

On Tuesday, September 1, 2015 at 9:18:16 AM UTC-4, Tom Lee wrote:
>
> Has anyone been able to get atom to automatically indent items within 
> (...) to line up with the opening ( ?
> eg
>
> function foo(bar,
>  baz) # <-- Atom (with language-julia) only auto-indents this 
> line by one tab :(
> stuff
> end
>
> I have just discovered the awesomeness of Atom, and right now this is the 
> main behavior I am missing from emacs.
>
> Cheers,
>
> Tom
>


[julia-users] Are (+), (*) the only ops that can use n-ary definitions?

2015-09-01 Thread Jeffrey Sarnoff
import Base:(+),(*),(-)

julia> 2+3+4
9
julia> (+){T<:Integer}(a::T,b::T,c::T) = ((a+b)+c)+1
julia> 2+3+4
10

julia> 2-3-4
-5
julia> (-){T<:Integer}(a::T,b::T,c::T) = ((a-b)-c)-1
julia> 2-3-4
-5

Are (+),(*) the only ops that can be n-ary specialized? What gives them 
that ability? Why only those?
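
For context, the two expressions parse differently, which sketches where the 
asymmetry comes from (Meta.show_sexpr prints the expression tree; output as I'd 
expect from a 0.4 REPL):

julia> Meta.show_sexpr(:(2+3+4))
(:call, :+, 2, 3, 4)

julia> Meta.show_sexpr(:(2-3-4))
(:call, :-, (:call, :-, 2, 3), 4)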



Re: [julia-users] Re: [ANN] Conda.jl: using conda package manager for Julia

2015-09-01 Thread Luthaf
I don't get what you mean. By package, do you mean binary package? In that 
case, there is no central Julia channel (a channel is a package source 
in conda), and you can use your very own by pushing its URL to 
`Conda.CHANNELS` before anything else.


Or if you mean Julia package, Conda is not a Julia package manager. It 
is a binary package provider, for use with BinDeps. It allows distributing 
C/C++/Fortran libraries with a Julia package.
This is only useful when building a Julia package that calls 
an external C library.
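
A sketch of what that looks like (the channel URL is a placeholder, and Conda.add 
as the install entry point is my assumption from the README):

using Conda

push!(Conda.CHANNELS, "https://conda.anaconda.org/my-channel")  # must happen before any install
Conda.add("libpng")    # binary packages then resolve against the configured channels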


Jeffrey Sarnoff a écrit :
It would help to have explicit examples for adding a package from 
Julia package central (I assume this the default) and adding one from 
some other github location.


On Tuesday, September 1, 2015 at 8:42:31 AM UTC-4, Luthaf wrote:

Hi Julians!

I am happy to present you the Conda.jl
 package, a binary
dependencies manager for Julia based on the open-source conda
 package manager.

Some interesting features of the Conda package manager:
 - You can easily add your own software and use your own channel
for software distribution;
 - You can install packages as non root on Linux;
 - Conda is cross-plateforme, and you can use it for all your
binary dependencies, provided the binaries have been uploaded.

I'll love to have your input on the code or the functionalities.

Cheers
Guillaume



[julia-users] Converting vectors to dictionaries back and forth

2015-09-01 Thread amiksvi
Hi,

I want to be able to convert vectors to dictionaries and the other way 
round. I have coded these two functions:

function vec_to_dict(v, ll)
d = Dict()
assert(length(v) == length(ll))
for i = 1:length(v)
d[ll[i]] = v[i]
end
return d
end

function dict_to_vec(d)
v = Array(Float64, 0)
for k in keys(d)
push!(v, d[k])
end
return v
end

The problem is that if I convert a vector to a dictionary first, and then 
convert it back to a vector, the vector is not ordered anymore. As an 
example, the output of this code:

v0 = [1.0, 2.0, 3.0]
ll = ["a", "b", "c"]
d = vec_to_dict(v0, ll)
v1 = dict_to_vec(d)
println(v0)
println(d)
println(v1)

is:

[1.0,2.0,3.0]
{"c"=>3.0,"b"=>2.0,"a"=>1.0}
[3.0,2.0,1.0]

Is there some way to get the initial vector back with such a transformation 
using a dictionary?

Thanks,


Re: [julia-users] IDE for Julia

2015-09-01 Thread Andrei Zh
@Cedric Unless you are interested specifically in notebooks, I'd suggest 
trying ESS mode for Emacs, which has support for Julia including a REPL. It 
looks something like this (though this video also uncovers a sometimes 
annoying bug in ESS mode).

On Tuesday, September 1, 2015 at 5:52:08 AM UTC+3, Cedric St-Jean wrote:
>
> Scott, do you have a way to run the notebooks (IJulia) inside Emacs? I run 
> IJulia in the browser and edit code in Emacs, and would love to combine 
> both.
>
> On Monday, August 31, 2015 at 9:04:28 PM UTC-4, Scott Jones wrote:
>>
>> The fact that Mike is working on it would make me confident of it. 
>>  Currently all of the developers I'm working with have switched to Atom 
>> (for Julia, C, C++, and Python work) [I've used it, and like it, but so far 
>> I'm still sticking with Emacs, in part thanks to Yuyichao's (and others) 
>> nice work on julia-mode.el, and also because my fingers just know Emacs 
>> without thinking, and I haven't had time to set up Emacs bindings for Atom 
>> yet, or find a Emacs key binding package for it].
>>
>> Scott
>>
>> On Monday, August 31, 2015 at 12:26:57 PM UTC-4, Viral Shah wrote:
>>>
>>> It’s mainly Mike Innes. Certainly not to discourage any other efforts, 
>>> but the number of people I have seen using Atom recently makes me feel like 
>>> this could be the one. 
>>>
>>> -viral 
>>>
>>>
>>>
>>> > On 31-Aug-2015, at 7:58 pm, Kevin Squire  wrote: 
>>> > 
>>> > Hi Viral, just curious who is working on that development?  Your post 
>>> seems to imply an officially supported effort, but that doesn't mean that 
>>> development on other IDEs will be discouraged, I presume? :-)  (Not that 
>>> I'm aware of other IDEs being worked on...) 
>>> > 
>>> > Cheers, 
>>> >   Kevin 
>>> > 
>>> > On Monday, August 31, 2015, Viral Shah  wrote: 
>>> > Also, it is worth pointing out that a lot of the future IDE effort 
>>> (Juno 2) will be focussed around Atom. 
>>> > 
>>> > https://atom.io/packages/language-julia 
>>> > 
>>> > https://github.com/JuliaLang/atom-language-julia 
>>> > https://github.com/JunoLab/atom-julia-client 
>>> > 
>>> > -viral 
>>> > 
>>> > On Thursday, August 27, 2015 at 9:12:22 PM UTC+5:30, Arch Call wrote: 
>>> > Deb,  I use Juno all the time.  It works good for me on Windows 10, 
>>> and Julia version 3.11 
>>> > 
>>> > I have used R-Studio extensively in R and it is a great IDE.  Juno is 
>>> nowhere near as powerful, but Julia is a speed demon -- way faster than R. 
>>> > 
>>> > ...Archie 
>>> > 
>>> > On Wednesday, August 26, 2015 at 11:12:22 PM UTC-4, Deb Midya wrote: 
>>> > Hi, 
>>> > 
>>> > Thanks in advance. 
>>> > 
>>> > I am new to Julia and using Julia-0.3.7 on Windows 8. 
>>> > 
>>> > I am looking for an IDE for Julia (like RStudio in R). 
>>> > 
>>> > Once again, thank you very much for the time you have given.. 
>>> > 
>>> > Regards, 
>>> > 
>>> > Deb 
>>>
>>>

Re: [julia-users] Converting vectors to dictionaries back and forth

2015-09-01 Thread amiksvi
Exactly what I needed, thanks a lot.


[julia-users] Re: IDE for Julia

2015-09-01 Thread STAR0SS
I think a good IDE should have:

- A proper console and a good way to send single line and block of codes to 
it (e.g. matlab's code section)
- A decent text editor
- Integrated plots
- Proper window management (docking, etc) so you don't have windows 
everywhere 

All this meets two other broad and important goals, namely having a good 
plotting solution and being able to make GUI applications in Julia.
For these reasons I feel like something like Julietta was the way to go.



Re: [julia-users] Converting vectors to dictionaries back and forth

2015-09-01 Thread Cedric St-Jean
BTW, if I'm not mistaken, your dict_to_vec function can be written as just 
collect(values(d)). 

On Tuesday, September 1, 2015 at 9:47:47 AM UTC-4, ami...@gmail.com wrote:
>
> Exactly what I needed, thanks a lot.
>


[julia-users] Re: Julia on OpenBSD

2015-09-01 Thread Maurizio Tomasi
*@Viral*, yes, indeed I was the submitter of those patches. Unfortunately, 
I realized that part of the C code has a number of #ifdefs that do not 
cover OpenBSD properly. Maybe I should create a bug report in openlibm to 
keep track of this?

*@Yichao*, thank you very much for this tip! It has saved me a lot of time, 
as I have found problems with other dependencies as well (OpenBLAS, PCRE, 
and FFTW, at the moment). With your trick, I have been able to solve them 
all.

At the moment I am stuck with Rmath-julia, which shows similar problems to 
openlibm. In this case, the code seems simpler, and I have already made a 
pull request (https://github.com/JuliaLang/Rmath-julia/pull/9). *@Viral*, 
is there an easy workaround to force the build system to use the HEAD of a 
dependency instead of a tagged commit? I have tried to understand how 
julia/deps/Makefile works, but it is full of GNU Make constructs which I do 
not grasp...

Thanks a lot,
  Maurizio.

 


Re: [julia-users] Julia 0.4 warnings and how to fix

2015-09-01 Thread Yichao Yu
On Sep 1, 2015 11:48 AM, "Michael Francis"  wrote:
>
> I get the following - WARNING: module  should explicitly import
== from Base
>
> Unfortunately you can't simply do
>
> Base.==( ... )

Operators are not imported by default in 0.4 anymore. Importing them directly
or using Base.(:(==)) should both work.

>
>
> I can fix with
>
> import Base: ==
>
>
> what is the idiomatic way of doing this in 0.4
>
> Also is there a guide for upgrading packages to 0.4 ?
>


Re: [julia-users] Julia 0.4 warnings and how to fix

2015-09-01 Thread Seth
Thanks, Yichao! As a related aside, it appears that using import violates 
Style Guide #33 (per https://github.com/johnmyleswhite/Style.jl), so is the 
idiomatic / preferred way of doing this limited to using Base.(:==) ?

On Tuesday, September 1, 2015 at 9:06:42 AM UTC-7, Yichao Yu wrote:
>
>
> On Sep 1, 2015 11:48 AM, "Michael Francis"  > wrote:
> >
> > I get the following - WARNING: module  should explicitly import 
> == from Base
> >
> > Unfortunately you can't simply do 
> >
> > Base.==( ... )
>
> Operators are not imported by default in 0.4 anymore. Import them directly 
> or use Base.(:==) should both work 
>
> >
> >
> > I can fix with 
> >
> > import Base: ==
> >
> >
> > what is the idiomatic way of doing this in 0.4
> >
> > Also is there a guide for upgrading packages to 0.4 ?
> >
>


[julia-users] Julia 0.4 warnings and how to fix

2015-09-01 Thread Michael Francis
I get the following - WARNING: module  should explicitly import == 
from Base

Unfortunately you can't simply do 

Base.==( ... )


I can fix with 

import Base: ==


What is the idiomatic way of doing this in 0.4?

Also, is there a guide for upgrading packages to 0.4?