Re: [julia-users] Println question: adding suffix to $interpolation

2015-02-15 Thread David P. Sanders
This would be a good example to put in the manual (if it is not already there). 

[julia-users] Re: Issues with optimize in Julia

2015-02-15 Thread John Myles White
Here's my two, not very thorough, cents:

(1) The odds of a bug in Optim.jl are very high (>90%).
(2) The odds of a bug in your code are very high (>90%).

It's pretty easy to make a decision about (2). Deciding on (1) is a lot 
harder, since you need a specific optimization problem that Optim should 
solve, but fails to solve.

For resolving (2), you have a couple of sub-problems:

(a) Is your gradient analytically correct? You can check this by comparing 
it against a finite-difference approximation (see the sketch below). If the 
two don't closely match, be suspicious.
(b) Is your log likelihood + gradient numerically correct? Your stress test 
is, in theory, an attempt to test this. But numerical instability implies 
that failures should only occur on instances that are likely to be numerically 
unstable, so you'd want to measure the correlation between the difficulty 
of the problem and the probability of failure.
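
A minimal sketch of such a check, written against the J and g! defined in the 
original post below (the helper name, step size and tolerance here are mine, 
not part of Optim.jl):

function check_gradient(J, g!, w; eps = 1e-6, tol = 1e-4)
    n = length(w)
    g_analytic = zeros(n)
    g!(w, g_analytic)    # analytic gradient, using Optim's (x, storage) order
    max_err = 0.0
    for i in 1:n
        e = zeros(n)
        e[i] = eps
        # central-difference approximation to the i-th partial derivative
        g_numeric = (J(w + e) - J(w - e)) / (2 * eps)
        max_err = max(max_err, abs(g_analytic[i] - g_numeric))
    end
    return max_err < tol
end

check_gradient(J, g!, randn(size(X, 2)))  # should be true for a correct gradient

If this comes back false at well-scaled points, the analytic gradient is the 
first place to look.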

My experience is that the Optim error messages don't make it easy to 
realize when you've made a mistake in your gradients. This is being worked 
on at the moment, but I think someone would need to dedicate a week to 
working on this to get us to a point where the error messages are always 
clear.

 -- John

On Sunday, February 15, 2015 at 3:29:35 PM UTC-8, Ryan Carey wrote:
>
> Hi all,
>
> I've just discovered Julia this last month, and have been greatly enjoying 
> using it, especially because of its MATLAB-like linear algebra notation and 
> all-round concise and intuitive syntax.
>
> I've been playing with its optimisation functions, looking to implement 
> gradient descent for logistic regression, but I hit a couple of stumbling 
> blocks, and was wondering how you've managed these.
>
> Using Optim, I implemented regularized logistic regression with l_bfgs, 
> and although it worked sometimes, when I stress-tested it with some k-fold 
> validation, I got Linesearch errors.
>
> I've got a dataset that's about 600 x 100 (m x n) with weights w and 
> classes y.
>
> my code:
>
> function J(w)
>     m, n = size(X)
>     return sum(-y'*log(logistic(X*w)) - (1-y')*log(1-logistic(X*w))) +
>         reg/(2m) * sum(w.^2)  # note normalizing bias weight
> end
>
> function g!(w, storage)
>     storage[:] = X' * (logistic(X*w) - y) + reg/m * w
> end
>
> out = optimize(J, g!, w, method = :l_bfgs, show_trace = true)
>
>
> the error:
>
> Iter     Function value     Gradient norm
> ...
> 19       -9.034225e+02      2.092807e+02
> 20       -9.034225e+02      2.092807e+02
> 21       -9.034225e+02      2.092807e+02
> 22       -9.034225e+02      2.092807e+02
> 23       -9.034225e+02      2.092807e+02
>
> Linesearch failed to converge
> while loading In[6], in expression starting on line 2
>
>  in hz_linesearch! at 
> /home/ryan/.julia/v0.3/Optim/src/linesearch/hz_linesearch.jl:374
>  in hz_linesearch! at 
> /home/ryan/.julia/v0.3/Optim/src/linesearch/hz_linesearch.jl:188
>  in l_bfgs at /home/ryan/.julia/v0.3/Optim/src/l_bfgs.jl:165
>  in optimize at /home/ryan/.julia/v0.3/Optim/src/optimize.jl:340
>
>
> Perhaps I should override its convergence criteria? Or is there a bug in my 
> code? Anyway, I thought I might have more luck with conjugate gradient 
> descent, so I included types.jl and cg.jl from the Optim package, and tried 
> to make it work too, defining a DifferentiableFunction type:
>
>
> function rosenbrock(g, x::Vector)
>     d1 = 1.0 - x[1]
>     d2 = x[2] - x[1]^2
>     if !(g === nothing)
>         g[1] = -2.0*d1 - 400.0*d2*x[1]
>         g[2] = 200.0*d2
>     end
>     val = d1^2 + 100.0 * d2^2
>     return val
> end
>
> function rosenbrock_gradient!(x::Vector, storage::Vector)
>     storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
>     storage[2] = 200.0 * (x[2] - x[1]^2)
> end
>
> cg(rosenbrock, [0, 0])
>
> d2 = DifferentiableFunction(rosenbrock, rosenbrock_gradient!)
> cg(d2, [0, 0])
>
> ERROR: InexactError()
>
>
> I tried a few variations on the function 'cg' before coming here for help. 
> I notice that there are a couple of other optimization packages out there 
> but this one is by JMW and looks good.
>
> Obviously, if I just wanted to perform linear regression, I could just use 
> a built-in function, but to use more complex models, I would need to be 
> able to do gradient descent.
>
> How have others fared with Optim? Any thoughts on what's going wrong? 
> General tips for how to make gradient descent work with Julia?
>
>
>
>

Re: [julia-users] Println question: adding suffix to $interpolation

2015-02-15 Thread Stefan Karpinski
Parens: println("$(bird)s are birds")
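
For completeness, the fix applied to the loop from the question (output shown 
as comments):

for bird in ["crow", "parrot", "bluejay"]
    println("$(bird)s are birds")
end
# crows are birds
# parrots are birds
# bluejays are birds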

On Sun, Feb 15, 2015 at 6:40 PM, Aura  wrote:

> how can we add a suffix (such as a plural "s") to a variable that we call
> in println?
>
> ex:
>
> *code*
> for bird in ["crow", "parrot", "bluejay"]
>     println("$bird are birds")  # how do we add an "s" to $bird
>     # without having a space in between?
> end
>
> *desired outcome:*
> crows are birds
> parrots are birds
> bluejays are birds
>


[julia-users] Println question: adding suffix to $interpolation

2015-02-15 Thread Aura
how can we add a suffix (such as a plural "s") to a variable that we call 
in println?

ex:

*code*
for bird in ["crow", "parrot", "bluejay"]
    println("$bird are birds")  # how do we add an "s" to $bird
    # without having a space in between?
end

*desired outcome:*
crows are birds
parrots are birds
bluejays are birds


[julia-users] Issues with optimize in Julia

2015-02-15 Thread Ryan Carey


Hi all,

I've just discovered Julia this last month, and have been greatly enjoying 
using it, especially because of its MATLAB-like linear algebra notation and 
all-round concise and intuitive syntax.

I've been playing with its optimisation functions, looking to implement 
gradient descent for logistic regression, but I hit a couple of stumbling 
blocks, and was wondering how you've managed these.

Using Optim, I implemented regularized logistic regression with l_bfgs, and 
although it worked sometimes, when I stress-tested it with some k-fold 
validation, I got Linesearch errors.

I've got a dataset that's about 600 x 100 (m x n) with weights w and 
classes y.

my code:

function J(w)
    m, n = size(X)
    return sum(-y'*log(logistic(X*w)) - (1-y')*log(1-logistic(X*w))) +
        reg/(2m) * sum(w.^2)  # note normalizing bias weight
end

function g!(w, storage)
    storage[:] = X' * (logistic(X*w) - y) + reg/m * w
end

out = optimize(J, g!, w, method = :l_bfgs, show_trace = true)


the error:

Iter     Function value     Gradient norm
...
19       -9.034225e+02      2.092807e+02
20       -9.034225e+02      2.092807e+02
21       -9.034225e+02      2.092807e+02
22       -9.034225e+02      2.092807e+02
23       -9.034225e+02      2.092807e+02

Linesearch failed to converge
while loading In[6], in expression starting on line 2

 in hz_linesearch! at 
/home/ryan/.julia/v0.3/Optim/src/linesearch/hz_linesearch.jl:374
 in hz_linesearch! at 
/home/ryan/.julia/v0.3/Optim/src/linesearch/hz_linesearch.jl:188
 in l_bfgs at /home/ryan/.julia/v0.3/Optim/src/l_bfgs.jl:165
 in optimize at /home/ryan/.julia/v0.3/Optim/src/optimize.jl:340


Perhaps I should override its convergence criteria? Or is there a bug in my 
code? Anyway, I thought I might have more luck with conjugate gradient descent, 
so I included types.jl and cg.jl from the Optim package, and tried to make it 
work too, defining a DifferentiableFunction type:


function rosenbrock(g, x::Vector)
    d1 = 1.0 - x[1]
    d2 = x[2] - x[1]^2
    if !(g === nothing)
        g[1] = -2.0*d1 - 400.0*d2*x[1]
        g[2] = 200.0*d2
    end
    val = d1^2 + 100.0 * d2^2
    return val
end

function rosenbrock_gradient!(x::Vector, storage::Vector)
    storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    storage[2] = 200.0 * (x[2] - x[1]^2)
end

cg(rosenbrock, [0, 0])

d2 = DifferentiableFunction(rosenbrock, rosenbrock_gradient!)
cg(d2, [0, 0])

ERROR: InexactError()


I tried a few variations on the function 'cg' before coming here for help. 
I notice that there are a couple of other optimization packages out there 
but this one is by JMW and looks good.

Obviously, if I just wanted to perform linear regression, I could just use 
a built-in function, but to use more complex models, I would need to be 
able to do gradient descent.

How have others fared with Optim? Any thoughts on what's going wrong? 
General tips for how to make gradient descent work with Julia?





[julia-users] Re: Values from Dict assigned to "variables" (symbols?) named as keys?

2015-02-15 Thread Martin Johansson
Thanks, the second solution was what I was looking for!

Regards, m

On Sunday, February 15, 2015 at 11:41:02 PM UTC+1, Sean Marshallsay wrote:
>
> I'm not too certain what you're asking here.
>
> Do you mean something like this:
>
> julia> d = Dict()
> Dict{Any,Any} with 0 entries
>
> julia> for i in [:var1, :var2, :var3]
>            d[i] = string(i)
>        end
>
> julia> d
> Dict{Any,Any} with 3 entries:
>   :var3 => "var3"
>   :var1 => "var1"
>   :var2 => "var2"
>
> Or something more like this:
>
> julia> d = [:a=>1, :b=>2, :c=>3]
> Dict{Symbol,Int64} with 3 entries:
>   :b => 2
>   :c => 3
>   :a => 1
>
> julia> for (k,v) in d
>            @eval $k = $v
>        end
>
> julia> a
> 1
>
> julia> b
> 2
>
> julia> c
> 3
>
> Or maybe something else entirely?
>
> On Sunday, 15 February 2015 22:17:54 UTC, Martin Johansson wrote:
>>
>> Is there a way to "assign" the values of a Dict to the corresponding keys 
>> such that I get "variables" (symbols?) with names given by the keys? Let's 
>> assume the keys are, for example, strings that would work as variable names.
>>
>> The reason why I would want to do this is to have functions where 
>> explicit variables are used rather than Dicts, primarily for readability. 
>> I suspect that this is not a recommended way of doing things (since I 
>> couldn't find any info along these lines when searching), and if this is 
>> the case please set me straight.
>>
>> Regards, m
>>
>>
>>

[julia-users] Re: Values from Dict assigned to "variables" (symbols?) named as keys?

2015-02-15 Thread Sean Marshallsay
I'm not too certain what you're asking here.

Do you mean something like this:

julia> d = Dict()
Dict{Any,Any} with 0 entries

julia> for i in [:var1, :var2, :var3]
           d[i] = string(i)
       end

julia> d
Dict{Any,Any} with 3 entries:
  :var3 => "var3"
  :var1 => "var1"
  :var2 => "var2"

Or something more like this:

julia> d = [:a=>1, :b=>2, :c=>3]
Dict{Symbol,Int64} with 3 entries:
  :b => 2
  :c => 3
  :a => 1

julia> for (k,v) in d
           @eval $k = $v
       end

julia> a
1

julia> b
2

julia> c
3

Or maybe something else entirely?
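
(A side note: `@eval $k = $v` always evaluates at global scope, so a, b and c 
above end up as global variables. If the goal is just readable function bodies, 
an eval-free alternative is to unpack the Dict explicitly inside the function; 
the function name and keys below are only illustrative.)

julia> function process(d)
           a, b, c = d[:a], d[:b], d[:c]
           a + b + c
       end
process (generic function with 1 method)

julia> process([:a=>1, :b=>2, :c=>3])
6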

On Sunday, 15 February 2015 22:17:54 UTC, Martin Johansson wrote:
>
> Is there a way to "assign" the values of a Dict to the corresponding keys 
> such that I get "variables" (symbols?) with names given by the keys? Let's 
> assume the keys are, for example, strings that would work as variable names.
>
> The reason why I would want to do this is to have functions where 
> explicit variables are used rather than Dicts, primarily for readability. 
> I suspect that this is not a recommended way of doing things (since I 
> couldn't find any info along these lines when searching), and if this is 
> the case please set me straight.
>
> Regards, m
>
>
>

[julia-users] Re: using ccall

2015-02-15 Thread Seth


On Sunday, February 15, 2015 at 10:58:43 AM UTC-8, Patrick O'Leary wrote:
>
>
> On Sunday, February 15, 2015 at 11:54:34 AM UTC-6, Seth wrote:
>>
>> Maybe I'm missing something, but isn't that result what you would expect 
>> by calling foo with (1,2,3,4)? The sum is 10.
>>
>
> The response was not from Josh, who posted the original question. 
> Dominique is showing that at least someone gets the expected answer. The 
> question is what might be different in Josh's situation. 
>

Ah, sorry - didn't look at the headers.

In any case, I'm getting the same results as Dominique:

seth@schroeder:~/tmp/foo$ julia foo.jl
10.0 



[julia-users] Values from Dict assigned to "variables" (symbols?) named as keys?

2015-02-15 Thread Martin Johansson
Is there a way to "assign" the values of a Dict to the corresponding keys 
such that I get "variables" (symbols?) with names given by the keys? Let's 
assume the keys are, for example, strings that would work as variable names.

The reason why I would want to do this is to have functions where 
explicit variables are used rather than Dicts, primarily for readability. 
I suspect that this is not a recommended way of doing things (since I 
couldn't find any info along these lines when searching), and if this is 
the case please set me straight.

Regards, m




Re: [julia-users] Start julia with REPL, loading script, and passing arguments

2015-02-15 Thread Devendra Ghate

Shouldn't `read(STDIN,arg1,arg2)` in the startup.jl work?

Devendra


On Sun, Feb 15, 2015 at 02:31:08PM -0500, Stefan Karpinski wrote:

I'm not sure that we support this at the moment. The part that's not
supported seems to be passing command-line arguments in interactive mode.

On Sat, Feb 14, 2015 at 8:11 PM, Josef Sachs  wrote:


I am trying to start julia
(1) in interactive mode, i.e. with the REPL,
(2) loading a script, and
(3) passing command line arguments

I can accomplish (1) and (2) with
julia -i -L startup.jl
but I can't manage to pass additional arguments without producing an error.

juser@juliabox:~$ julia -i -L startup.jl foo bar
ERROR: could not open file /home/juser/foo
 in include at ./boot.jl:245
 in include_from_node1 at loading.jl:128
 in process_options at ./client.jl:285
 in _start at ./client.jl:354
juser@juliabox:~$ echo $?
1

Is there any way I can accomplish this?



--
Devendra Ghate


Re: [julia-users] Julia on Raspberry Pi 2

2015-02-15 Thread Seth


On Sunday, February 15, 2015 at 11:04:02 AM UTC-8, Sto Forest wrote:
>
> Thanks Steve I'll give that a try and see how far I get. :)
>
>
>
Sto,

I'm going to be attempting the same thing tomorrow with my RPi2. Please 
post any tips you discover. :)

Thanks,

Seth.
 


Re: [julia-users] Start julia with REPL, loading script, and passing arguments

2015-02-15 Thread Stefan Karpinski
I'm not sure that we support this at the moment. The part that's not
supported seems to be passing command-line arguments in interactive mode.

On Sat, Feb 14, 2015 at 8:11 PM, Josef Sachs  wrote:

> I am trying to start julia
> (1) in interactive mode, i.e. with the REPL,
> (2) loading a script, and
> (3) passing command line arguments
>
> I can accomplish (1) and (2) with
> julia -i -L startup.jl
> but I can't manage to pass additional arguments without producing an error.
>
> juser@juliabox:~$ julia -i -L startup.jl foo bar
> ERROR: could not open file /home/juser/foo
>  in include at ./boot.jl:245
>  in include_from_node1 at loading.jl:128
>  in process_options at ./client.jl:285
>  in _start at ./client.jl:354
> juser@juliabox:~$ echo $?
> 1
>
> Is there any way I can accomplish this?
>
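
One possible workaround in the meantime (a sketch, not an officially supported 
option; the variable name JULIA_SCRIPT_ARGS is made up): pass the extra 
arguments through an environment variable that the loaded script reads.

juser@juliabox:~$ JULIA_SCRIPT_ARGS="foo bar" julia -i -L startup.jl

and inside startup.jl:

# split the (possibly unset) environment variable into individual arguments
args = split(get(ENV, "JULIA_SCRIPT_ARGS", ""))

This keeps the REPL (-i), still loads the script (-L), and avoids julia trying 
to open "foo" as a file.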


Re: [julia-users] Julia on Raspberry Pi 2

2015-02-15 Thread Sto Forest
Thanks Steve I'll give that a try and see how far I get. :)


On Sunday, 15 February 2015 01:06:39 UTC, Steve Kelly wrote:
>
> Sto, 
>
> I got Julia running on a BeagleBone Black running Debian Jessie a couple 
> months back using this process: 
> https://github.com/JuliaLang/julia/blob/master/README.arm.md. It depends 
> on a few system libraries to run, so I needed to update from Wheezy to 
> Jessie so it would work. I think some improvements have been made since 
> then so the build is more self contained. I am pretty sure Raspbian is 
> based on Wheezy, but it might be worth a shot with the latest master.
>
> Best,
> Steve
>
> On Sat, Feb 14, 2015 at 3:11 PM, Sto Forest wrote:
>
>> Is there a way to get Julia running on the new Raspberry Pi 2, perhaps 
>> under raspbian ? 
>>
>>
>>
>

[julia-users] Re: using ccall

2015-02-15 Thread Patrick O'Leary

On Sunday, February 15, 2015 at 11:54:34 AM UTC-6, Seth wrote:
>
> Maybe I'm missing something, but isn't that result what you would expect 
> by calling foo with (1,2,3,4)? The sum is 10.
>

The response was not from Josh, who posted the original question. Dominique 
is showing that at least someone gets the expected answer. The question is 
what might be different in Josh's situation. 


[julia-users] Re: using ccall

2015-02-15 Thread Seth


On Saturday, February 14, 2015 at 5:19:45 PM UTC-8, Dominique Orban wrote:
>
>
> $ cat foo.jl
> run(`gcc foo.c -O3 -dynamiclib -o foo`) #-> compile it
> arg1, arg2, arg3, arg4 = 1, 2, 3, 4;
> val = ccall((:foo, "foo"), Cdouble,(Cdouble, Cdouble, Cdouble, Cdouble),
> arg1,arg2,arg3,arg4)
> println(val);
> $ julia foo.jl
> 10.0
>
>>
>>
Maybe I'm missing something, but isn't that result what you would expect by 
calling foo with (1,2,3,4)? The sum is 10.
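
For anyone reproducing this, a self-contained sketch under the assumption that 
foo.c simply sums its four arguments (foo.c itself isn't shown in the thread, 
but that assumption matches the 10.0 result):

# write an assumed foo.c, compile it, and call it via ccall
open("foo.c", "w") do io
    write(io, """
    double foo(double a, double b, double c, double d) {
        return a + b + c + d;
    }
    """)
end
run(`gcc foo.c -O3 -shared -fPIC -o foo.so`)  # the original used -dynamiclib on OS X
val = ccall((:foo, "./foo.so"), Cdouble,
            (Cdouble, Cdouble, Cdouble, Cdouble), 1, 2, 3, 4)
println(val)  # prints 10.0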
 


Re: [julia-users] ccall from gc finalizer (task switch)

2015-02-15 Thread Kirill Ignatiev
On Saturday, 14 February 2015 19:41:25 UTC-5, Jameson wrote:
>
> Just task switches are disallowed (and thus any IO). Perhaps after the 
> threads branch is merged, it'll be worth reconsidering that restriction too. 
> The reason to ban task switches is that gc could happen at any time in the 
> user's code due to an allocation. It's therefore not generally advisable to 
> allow task switches at that time, since that makes it necessary for the user 
> to be much more concerned with synchronization.
>

I see, thanks for explaining.