Re: [julia-users] How do I write `import Base.!` in julia 0.5?

2016-08-08 Thread Kevin Squire
Generally it's not a different syntax; the error generated here is
probably a consequence of recent parser changes to handle dot-operator
overloading (and should probably be reported as a bug, and possibly fixed).

The syntax I suggested is related (I think) to the fact that some operators
need parentheses around them in order to define functions for them:

julia> &(a) = "hi"
ERROR: syntax: invalid assignment location

julia> (&)(a) = "hi"
WARNING: module Main should explicitly import & from Base
& (generic function with 36 methods)

But I agree that it's inconsistent with parsing elsewhere. It would
probably be best to deprecate `import Base.xxx` in favor of `import Base:
xxx` (i.e., removing `import Base.(!)`)

(An early conversation about operator functions needing parentheses is here.)
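For the archive, a tiny sketch of the import-list form in action (written with current `struct` syntax rather than the 0.5-era `type`; the `Flag` type is made up purely for illustration):

```julia
# Both spellings bring the operator into scope for extension; the colon
# form is the one suggested above as the less ambiguous spelling.
import Base: !        # same effect as `import Base.(!)`

struct Flag           # hypothetical wrapper type, only for illustration
    on::Bool
end

!(f::Flag) = Flag(!f.on)   # add a method to Base.! for Flag

(!Flag(true)).on           # false
```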

Cheers,
   Kevin

On Mon, Aug 8, 2016 at 9:59 PM, Fengyang Wang 
wrote:

> On Monday, August 8, 2016 at 10:26:46 AM UTC-4, Kevin Squire wrote:
>>
>> Try
>>
>>   import Base.(!)
>>
>> Cheers,
>>   Kevin
>>
>
> Why do import statements have different syntax? This syntax, I'm pretty
> sure, has either meant getfield or broadcast—in neither case does it
> actually refer to the ! function.
>


Re: [julia-users] How do I write `import Base.!` in julia 0.5?

2016-08-08 Thread Fengyang Wang
On Monday, August 8, 2016 at 10:26:46 AM UTC-4, Kevin Squire wrote:
>
> Try
>
>   import Base.(!)
>
> Cheers,
>   Kevin 
>

Why do import statements have different syntax? This syntax, I'm pretty 
sure, has either meant getfield or broadcast—in neither case does it 
actually refer to the ! function.


[julia-users] Re: PyCall-ing Numba-dependent Libraries

2016-08-08 Thread Christoph Ortner
To reply to my own question, this seems to have worked:

git clone https://github.com/numba/llvmlite
cd llvmlite
LLVM_CONFIG=.../julia/usr/tools/llvm-config python setup.py install
LLVM_CONFIG=.../julia/usr/tools/llvm-config pip install numba


Unfortunately it didn't solve my problem since the library I am really 
interested in (chemview) still crashes. 


Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-08 Thread Kevin Liu
I have no idea where to start and where to finish. Founders' help would be 
wonderful. 

On Tuesday, August 9, 2016 at 12:19:26 AM UTC-3, Kevin Liu wrote:
>
> After which I have to code Felix into Julia, a relational optimizer for 
> statistical inference with Tuffy  
> inside, for enterprise settings.
>
> On Tuesday, August 9, 2016 at 12:07:32 AM UTC-3, Kevin Liu wrote:
>>
>> Can I get tips on bringing Alchemy's optimized Tuffy 
>>  in Java to Julia while showing the 
>> best of Julia? I am going for the most correct way, even if it means coding 
>> Tuffy into C and Julia.
>>
>> On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote:
>>>
>>> I'll try to build it, compare it, and show it to you guys. I offered to 
>>> do this as work. I am waiting to see if they will accept it. 
>>>
>>> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski wrote:

 Kevin, as previously requested by Isaiah, please take this to some 
 other forum or maybe start a blog.

 On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  wrote:

> Symmetry-based learning, Domingos, 2014 
> https://www.microsoft.com/en-us/research/video/symmetry-based-learning/
>
> Approach 2: Deep symmetry networks generalize convolutional neural 
> networks by tying parameters and pooling over an arbitrary symmetry 
> group, 
> not just the translation group. In preliminary experiments, they 
> outperformed convnets on a digit recognition task. 
>
> On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote:
>>
>> Minsky died of a cerebral hemorrhage at the age of 88.[40] 
>>  Ray 
>> Kurzweil  says he was 
>> contacted by the cryonics organization Alcor Life Extension 
>> Foundation 
>>  seeking 
>> Minsky's body.[41] 
>>  
>> Kurzweil 
>> believes that Minsky was cryonically preserved by Alcor and will be 
>> revived 
>> by 2045.[41] 
>>  
>> Minsky 
>> was a member of Alcor's Scientific Advisory Board 
>> .[42] 
>>  In 
>> keeping with their policy of protecting privacy, Alcor will neither 
>> confirm 
>> nor deny that Alcor has cryonically preserved Minsky.[43] 
>>  
>>
>> We better do a good job. 
>>
>> On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>>>
>>> *So, I think in the next 20 years (2003), if we can get rid of all 
>>> of the traditional approaches to artificial intelligence, like neural 
>>> nets 
>>> and genetic algorithms and rule-based systems, and just turn our sights 
>>> a 
>>> little bit higher to say, can we make a system that can use all those 
>>> things for the right kind of problem? Some problems are good for neural 
>>> nets; we know that others, neural nets are hopeless on them. Genetic 
>>> algorithms are great for certain things; I suspect I know what they're 
>>> bad 
>>> at, and I won't tell you. (Laughter)*  - Minsky, founder of CSAIL 
>>> MIT
>>>
>>> *Those programmers tried to find the single best way to represent 
>>> knowledge - Only Logic protects us from paradox.* - Minsky (see 
>>> attachment from his lecture)
>>>
>>> On Friday, August 5, 2016 at 8:12:03 AM UTC-3, Kevin Liu wrote:

 Markov Logic Network is being used for the continuous development 
 of drugs to cure cancer at MIT's CanceRX , on 
 DARPA's largest AI project to date, Personalized Assistant that 
 Learns (PAL) , progenitor of Siri. One of 
 Alchemy's largest applications to date was to learn a semantic network 
 (knowledge graph as Google calls it) from the web. 

 Some on Probabilistic Inductive Logic Programming / Probabilistic 
 Logic Programming / Statistical Relational Learning from CSAIL 
 
  (my 
 understanding is Alchemy does PILP from entailment, proofs, and 
 interpretation)

 The MIT Probabilistic Computing Project (where there is Picture, an 
 extension of Julia, for computer vision; Watch the video from Vikash) 
 

 Probabilistic programming could do for Bayesian ML what Theano has 
 done for 

Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-08 Thread Kevin Liu
After which I have to code Felix into Julia, a relational optimizer for 
statistical inference with Tuffy  
inside, for enterprise settings.

On Tuesday, August 9, 2016 at 12:07:32 AM UTC-3, Kevin Liu wrote:
>
> Can I get tips on bringing Alchemy's optimized Tuffy 
>  in Java to Julia while showing the 
> best of Julia? I am going for the most correct way, even if it means coding 
> Tuffy into C and Julia.
>
> On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote:
>>
>> I'll try to build it, compare it, and show it to you guys. I offered to 
>> do this as work. I am waiting to see if they will accept it. 
>>
>> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski wrote:
>>>
>>> Kevin, as previously requested by Isaiah, please take this to some other 
>>> forum or maybe start a blog.
>>>
>>> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  wrote:
>>>
 Symmetry-based learning, Domingos, 2014 
 https://www.microsoft.com/en-us/research/video/symmetry-based-learning/

 Approach 2: Deep symmetry networks generalize convolutional neural 
 networks by tying parameters and pooling over an arbitrary symmetry group, 
 not just the translation group. In preliminary experiments, they 
 outperformed convnets on a digit recognition task. 

 On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote:
>
> Minsky died of a cerebral hemorrhage at the age of 88.[40] 
>  Ray 
> Kurzweil  says he was 
> contacted by the cryonics organization Alcor Life Extension Foundation 
>  seeking 
> Minsky's body.[41] 
>  
> Kurzweil 
> believes that Minsky was cryonically preserved by Alcor and will be 
> revived 
> by 2045.[41] 
>  
> Minsky 
> was a member of Alcor's Scientific Advisory Board 
> .[42] 
>  In 
> keeping with their policy of protecting privacy, Alcor will neither 
> confirm 
> nor deny that Alcor has cryonically preserved Minsky.[43] 
>  
>
> We better do a good job. 
>
> On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>>
>> *So, I think in the next 20 years (2003), if we can get rid of all of 
>> the traditional approaches to artificial intelligence, like neural nets 
>> and 
>> genetic algorithms and rule-based systems, and just turn our sights a 
>> little bit higher to say, can we make a system that can use all those 
>> things for the right kind of problem? Some problems are good for neural 
>> nets; we know that others, neural nets are hopeless on them. Genetic 
>> algorithms are great for certain things; I suspect I know what they're 
>> bad 
>> at, and I won't tell you. (Laughter)*  - Minsky, founder of CSAIL MIT
>>
>> *Those programmers tried to find the single best way to represent 
>> knowledge - Only Logic protects us from paradox.* - Minsky (see 
>> attachment from his lecture)
>>
>> On Friday, August 5, 2016 at 8:12:03 AM UTC-3, Kevin Liu wrote:
>>>
>>> Markov Logic Network is being used for the continuous development of 
>>> drugs to cure cancer at MIT's CanceRX , on 
>>> DARPA's largest AI project to date, Personalized Assistant that 
>>> Learns (PAL) , progenitor of Siri. One of 
>>> Alchemy's largest applications to date was to learn a semantic network 
>>> (knowledge graph as Google calls it) from the web. 
>>>
>>> Some on Probabilistic Inductive Logic Programming / Probabilistic 
>>> Logic Programming / Statistical Relational Learning from CSAIL 
>>> 
>>>  (my 
>>> understanding is Alchemy does PILP from entailment, proofs, and 
>>> interpretation)
>>>
>>> The MIT Probabilistic Computing Project (where there is Picture, an 
>>> extension of Julia, for computer vision; Watch the video from Vikash) 
>>> 
>>>
>>> Probabilistic programming could do for Bayesian ML what Theano has 
>>> done for neural networks. 
>>>  - Ferenc Huszár
>>>
>>> Picture doesn't appear to be open-source, even though its Paper is 
>>> available. 
>>>
>>> I'm in the process of comparing the Picture Paper and Alchemy code 
>>> and 

[julia-users] precompiling all packages

2016-08-08 Thread Chang Kwon
Is there a way to precompile all packages at once? Each time I run 
Pkg.update(), I would also like to precompile all packages, so that I don't 
have to wait for precompilation when I actually use them.

Chang
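There was no single built-in command for this at the time, as far as I know; a rough sketch of the idea, assuming the 0.4/0.5-era API where `Pkg.installed()` lists packages and `Base.compilecache` precompiles one (the `precompile_all` name and the error handling are my own):

```julia
# Precompile every installed package after a Pkg.update(); packages that
# cannot be precompiled are skipped rather than aborting the loop.
function precompile_all(pkgs = keys(Pkg.installed()))
    for pkg in pkgs
        try
            Base.compilecache(pkg)
        catch err
            println("skipping $pkg: $err")
        end
    end
end

# usage: precompile_all()
```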


Re: [julia-users] Is the master algorithm on the roadmap?

2016-08-08 Thread Kevin Liu
Can I get tips on bringing Alchemy's optimized Tuffy 
 in Java to Julia while showing the best 
of Julia? I am going for the most correct way, even if it means coding 
Tuffy into C and Julia.

On Sunday, August 7, 2016 at 8:34:37 PM UTC-3, Kevin Liu wrote:
>
> I'll try to build it, compare it, and show it to you guys. I offered to do 
> this as work. I am waiting to see if they will accept it. 
>
> On Sunday, August 7, 2016 at 6:15:50 PM UTC-3, Stefan Karpinski wrote:
>>
>> Kevin, as previously requested by Isaiah, please take this to some other 
>> forum or maybe start a blog.
>>
>> On Sat, Aug 6, 2016 at 10:53 PM, Kevin Liu  wrote:
>>
>>> Symmetry-based learning, Domingos, 2014 
>>> https://www.microsoft.com/en-us/research/video/symmetry-based-learning/
>>>
>>> Approach 2: Deep symmetry networks generalize convolutional neural 
>>> networks by tying parameters and pooling over an arbitrary symmetry group, 
>>> not just the translation group. In preliminary experiments, they 
>>> outperformed convnets on a digit recognition task. 
>>>
>>> On Friday, August 5, 2016 at 4:56:45 PM UTC-3, Kevin Liu wrote:

 Minsky died of a cerebral hemorrhage at the age of 88.[40] 
  Ray Kurzweil 
  says he was contacted by 
 the cryonics organization Alcor Life Extension Foundation 
  seeking 
 Minsky's body.[41] 
  
 Kurzweil 
 believes that Minsky was cryonically preserved by Alcor and will be 
 revived 
 by 2045.[41] 
  Minsky 
 was a member of Alcor's Scientific Advisory Board 
 .[42] 
  In 
 keeping with their policy of protecting privacy, Alcor will neither 
 confirm 
 nor deny that Alcor has cryonically preserved Minsky.[43] 
  

 We better do a good job. 

 On Friday, August 5, 2016 at 4:45:42 PM UTC-3, Kevin Liu wrote:
>
> *So, I think in the next 20 years (2003), if we can get rid of all of 
> the traditional approaches to artificial intelligence, like neural nets 
> and 
> genetic algorithms and rule-based systems, and just turn our sights a 
> little bit higher to say, can we make a system that can use all those 
> things for the right kind of problem? Some problems are good for neural 
> nets; we know that others, neural nets are hopeless on them. Genetic 
> algorithms are great for certain things; I suspect I know what they're 
> bad 
> at, and I won't tell you. (Laughter)*  - Minsky, founder of CSAIL MIT
>
> *Those programmers tried to find the single best way to represent 
> knowledge - Only Logic protects us from paradox.* - Minsky (see 
> attachment from his lecture)
>
> On Friday, August 5, 2016 at 8:12:03 AM UTC-3, Kevin Liu wrote:
>>
>> Markov Logic Network is being used for the continuous development of 
>> drugs to cure cancer at MIT's CanceRX , on 
>> DARPA's largest AI project to date, Personalized Assistant that 
>> Learns (PAL) , progenitor of Siri. One of 
>> Alchemy's largest applications to date was to learn a semantic network 
>> (knowledge graph as Google calls it) from the web. 
>>
>> Some on Probabilistic Inductive Logic Programming / Probabilistic 
>> Logic Programming / Statistical Relational Learning from CSAIL 
>> 
>>  (my 
>> understanding is Alchemy does PILP from entailment, proofs, and 
>> interpretation)
>>
>> The MIT Probabilistic Computing Project (where there is Picture, an 
>> extension of Julia, for computer vision; Watch the video from Vikash) 
>> 
>>
>> Probabilistic programming could do for Bayesian ML what Theano has 
>> done for neural networks. 
>>  - Ferenc Huszár
>>
>> Picture doesn't appear to be open-source, even though its Paper is 
>> available. 
>>
>> I'm in the process of comparing the Picture Paper and Alchemy code 
>> and would like to have an open-source PILP from Julia that combines the 
>> best of both. 
>>
>> On Wednesday, August 3, 2016 at 5:01:02 PM UTC-3, Christof Stocker 
>> wrote:
>>>
>>> This sounds like it could be a great contribution. I shall keep a 
>>> curious eye on your progress
>>>
>>> On Wednesday, August 3, 2016 

[julia-users] Re: Save a REPL defined function to a file

2016-08-08 Thread Ralph Smith
Your definition is entered into your history file, probably when you end 
the session.  On Linux, the file is ${HOME}/.julia_history - perhaps 
someone else can report on OSX or Windows.  This does not appear to be 
mentioned in the main documentation.
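So one crude way to recover a definition is to grep it back out of that history file; a sketch (the demo file stands in for ~/.julia_history so the example is self-contained):

```shell
HIST=demo_history.txt                           # stand-in for ~/.julia_history
printf 'double(x) = 2*x\nsin(1.0)\n' > "$HIST"  # pretend session history
grep 'double(' "$HIST" > myfile.jl              # pull the definition back out
cat myfile.jl                                   # -> double(x) = 2*x
```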

On Monday, August 8, 2016 at 6:56:30 PM UTC-4, Naelson Douglas wrote:
>
> Is there a way for me to take a function I defined on REPL and then save 
> it to a file?
>
> Example
> julia > double(x) = 2*x
> double (generic function with 1 method)
> julia > savetofile(double, "myfile.jl")
>
> After that I would open myfile.jl and see that "double(x) = 2*x" is written
> there.
>
>
>

Re: [julia-users] Re: chol() more strict in v0.5?

2016-08-08 Thread Andreas Noack
Okay, I did it this time:

https://github.com/JuliaLang/Compat.jl/pull/268

On Monday, August 8, 2016 at 6:37:53 PM UTC-4, Giuseppe Ragusa wrote:
>
> Me too! I have been trying to update a package to `v0.5` and I do not 
> really see a clean way to support both 0.4 and 0.5 without an entry like 
> this in Compat.jl. 
>
> On Sunday, August 7, 2016 at 10:02:44 PM UTC+2, Andreas Noack wrote:
>>
>> It would be great with an entry for this in Compat.jl, e.g. something like
>>
>> cholfact(A::HermOrSym, args...) = cholfact(A.data, A.uplo, args...)
>>
>> On Sun, Aug 7, 2016 at 2:44 PM, Chris <7hunde...@gmail.com> wrote:
>>
>>> mmh, could you explain your comment a little more?
>>>
>>> David, thanks for the tip.
>>
>>
>>

[julia-users] Re: I cant view Plots in Juno - Atom IDE

2016-08-08 Thread Erick J Zagal
Thank you :)

On Monday, August 8, 2016 at 3:03:42 AM UTC-5, Chris Rackauckas wrote:
>
> Use the command `gui()`
>
> On Sunday, August 7, 2016 at 9:32:29 PM UTC-7, Erick J Zagal wrote:
>>
>> I have this code:
>>
>> using Plots
>>
>> x = linspace(0, 10, 200)
>> y = sin(x)
>> plot(x, y, color=:blue, linewidth=2, label="sine")
>>
>> When I run this in the console, it shows the plot, but trying it in Juno 
>> only shows [Plots.jl] Initializing backend: plotly
>> and no plot appears.
>>
>> Help
>>
>

Re: [julia-users] Plots with scale bars

2016-08-08 Thread Tom Breloff
Yes, likely this could be done easily in a recipe if it's something you'd
want to do repeatedly.
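Until a recipe exists, most of the work is just computing the bar geometry; a minimal sketch of the helper such a recipe might wrap (the function name and layout are my own; the actual drawing would be `plot!` on each segment plus `annotate!` for the labels, with the axes hidden via the usual attributes, e.g. `ticks=nothing`):

```julia
# Endpoints, in data coordinates, for an L-shaped scale bar anchored at
# (x0, y0); xlen and ylen are the bar lengths in data units.
function scalebar_segments(x0, y0, xlen, ylen)
    horiz = ([x0, x0 + xlen], [y0, y0])   # x scale bar
    vert  = ([x0, x0], [y0, y0 + ylen])   # y scale bar
    return horiz, vert
end
```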

On Monday, August 8, 2016, Islam Badreldin 
wrote:

>
> Hi Tom,
>
> It'd be cool if you can add a couple of examples, yes!
>
> I'm not familiar with plot recipes yet, but do you think that such a
> scenario can be handled 'neatly' in a recipe while hiding all manual
> tweaking from the end user?
>
> Thanks,
> Islam
>
> _
> From: Tom Breloff  >
> Sent: Monday, August 8, 2016 1:03 PM
> Subject: Re: [julia-users] Plots with scale bars
> To:  >
>
>
> There's a bunch of ways to do this with Plots. Draw the lines and add
> annotations, or add an inset subplot with labels? I'm not at a computer...
> maybe I'll throw together an example later.
>
> On Monday, August 8, 2016, Islam Badreldin  > wrote:
>
>>
>>
>> Hello,
>>
>> Is there a simple way in Julia to add scale bars with labels to plots and
>> to hide the x-y axes as well? The way to do in MATLAB involves a lot of
>> manual tweaking as described here
>> http://www.mathworks.com/matlabcentral/answers/151248-add-a-scale-bar-to-my-plot
>>
>> I'm hoping to find a more elegant way in Julia!
>>
>> Thanks,
>> Islam
>>
>
>
>


Re: [julia-users] Plots with scale bars

2016-08-08 Thread Islam Badreldin

Hi Tom,
It'd be cool if you can add a couple of examples, yes!
I'm not familiar with plot recipes yet, but do you think that such a scenario 
can be handled 'neatly' in a recipe while hiding all manual tweaking from the 
end user?
Thanks,
Islam

_
From: Tom Breloff 
Sent: Monday, August 8, 2016 1:03 PM
Subject: Re: [julia-users] Plots with scale bars
To:  


There's a bunch of ways to do this with Plots. Draw the lines and add 
annotations, or add an inset subplot with labels? I'm not at a computer... maybe 
I'll throw together an example later. 

On Monday, August 8, 2016, Islam Badreldin  wrote:


Hello,
Is there a simple way in Julia to add scale bars with labels to plots and to 
hide the x-y axes as well? The way to do in MATLAB involves a lot of manual 
tweaking as described here:
http://www.mathworks.com/matlabcentral/answers/151248-add-a-scale-bar-to-my-plot
I'm hoping to find a more elegant way in Julia!
Thanks,
Islam




[julia-users] Save a REPL defined function to a file

2016-08-08 Thread Naelson Douglas
Is there a way for me to take a function I defined on REPL and then save it 
to a file?

Example
julia > double(x) = 2*x
double (generic function with 1 method)
julia > savetofile(double, "myfile.jl")

After that I would open myfile.jl and see that "double(x) = 2*x" is written
there.




Re: [julia-users] Re: chol() more strict in v0.5?

2016-08-08 Thread Giuseppe Ragusa
Me too! I have been trying to update a package to `v0.5` and I do not 
really see a clean way to support both 0.4 and 0.5 without an entry like 
this in Compat.jl. 
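For context, a sketch of what the proposed Compat entry forwards to, written here with the current post-rename names (`LinearAlgebra` and `cholesky`) rather than the 0.4/0.5 `cholfact`:

```julia
using LinearAlgebra

# Factoring through the Symmetric wrapper is what cholfact(A::HermOrSym)
# enables: the wrapper records which triangle (A.uplo) holds the data.
A = Symmetric([4.0 2.0; 2.0 3.0])
F = cholesky(A)

F.U' * F.U ≈ A    # true: the factorization reproduces the matrix
```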

On Sunday, August 7, 2016 at 10:02:44 PM UTC+2, Andreas Noack wrote:
>
> It would be great with an entry for this in Compat.jl, e.g. something like
>
> cholfact(A::HermOrSym, args...) = cholfact(A.data, A.uplo, args...)
>
> On Sun, Aug 7, 2016 at 2:44 PM, Chris <7hunde...@gmail.com > 
> wrote:
>
>> mmh, could you explain your comment a little more?
>>
>> David, thanks for the tip.
>
>
>

[julia-users] Re: ANN: JuMP 0.14 released

2016-08-08 Thread Uwe Fechner
Ok, I replaced all occurrences of GradientNumber in my code with Dual (not 
DualNumber!),
and now the code works again.

But this is NOT an implementation detail. It could be an implementation 
detail if a function for converting Dual or GradientNumber to a real number 
were part of the package, but it is not.

Therefore it would be nice, if this info could be added to the upgrading 
guide.

Best regards:

Uwe
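For anyone landing here with the same error, the helper from the original post after the rename (the Dual method is shown as a comment so the fallbacks stand on their own; it needs ForwardDiff >= 0.2 loaded):

```julia
# Real and Vector fallbacks, unchanged from the original helper.
my_value(value::Real) = value
my_value(val_vector::Vector) = [my_value(v) for v in val_vector]

# With ForwardDiff >= 0.2 loaded, the old GradientNumber method becomes:
#     my_value(value::ForwardDiff.Dual) = ForwardDiff.value(value)
```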

On Monday, August 8, 2016 at 10:18:10 PM UTC+2, Kristoffer Carlsson wrote:
>
> It is more of an implementation detail but there is DualNumber now.
>
> On Monday, August 8, 2016 at 7:27:42 PM UTC+2, Uwe Fechner wrote:
>>
>> Well, but in the upgrading guide there is no replacement for 
>> GradientNumber mentioned.
>>
>> Any idea?
>>
>> Uwe
>>
>> On Monday, August 8, 2016 at 7:14:45 PM UTC+2, Miles Lubin wrote:
>>>
>>> ForwardDiff 0.2 introduced some breaking changes, you will need to 
>>> update your code (GradientNumber is no longer defined). See the upgrading 
>>> guide .
>>>
>>> On Monday, August 8, 2016 at 11:10:50 AM UTC-6, Uwe Fechner wrote:

 Hello,
 I updated, and now I get the following error:
 julia> include("Plotting.jl")
 INFO: Recompiling stale cache file 
 /home/ufechner/.julia/lib/v0.4/JuMP.ji for module JuMP.
 INFO: Recompiling stale cache file 
 /home/ufechner/.julia/lib/v0.4/ReverseDiffSparse.ji for module 
 ReverseDiffSparse.
 INFO: Recompiling stale cache file 
 /home/ufechner/.julia/lib/v0.4/ForwardDiff.ji for module ForwardDiff.
 INFO: Recompiling stale cache file 
 /home/ufechner/.julia/lib/v0.4/HDF5.ji for module HDF5.
 ERROR: LoadError: LoadError: LoadError: LoadError: UndefVarError: 
 GradientNumber not defined
 while loading /home/ufechner/00PythonSoftware/FastSim/src/Projects.jl, 
 in expression starting on line 433
 while loading /home/ufechner/00PythonSoftware/FastSim/src/Model.jl, in 
 expression starting on line 19
 while loading /home/ufechner/00PythonSoftware/FastSim/src/Optimizer.jl, 
 in expression starting on line 13
 while loading /home/ufechner/00PythonSoftware/FastSim/src/Plotting.jl, 
 in expression starting on line 22

 The code, that fails is the following:
 """
 Helper function to convert the value of an optimization results, but 
 also
 simple real values.
 """
 my_value(value::ForwardDiff.GradientNumber) = ForwardDiff.value(value)
 my_value(value::Real) = value
 my_value(val_vector::Vector) = [my_value(value) for value in val_vector]

 Any idea how to fix this?

 Uwe

 On Monday, August 8, 2016 at 4:57:16 PM UTC+2, Miles Lubin wrote:
>
> The JuMP team is happy to announce the release of JuMP 0.14. The 
> release should clear most, if not all, deprecation warnings on Julia 0.5 
> and is compatible with ForwardDiff 0.2. The full release notes are 
> here 
> ,
>  
> and I'd just like to highlight a few points:
>
> - *All JuMP users read this*: As previously announced 
> , we 
> will be deprecating the sum{}, prod{}, and norm{} syntax in favor of 
> using 
> Julia 0.5's new syntax for generator statements, e.g., sum(x[i] for i 
> in 1:N) instead of sum{x[i], i in 1:N}. In this release, the new 
> syntax is available for testing if using Julia 0.5. No deprecation 
> warnings 
> are printed yet. In JuMP 0.15, which will drop support for Julia 0.4, we 
> will begin printing deprecation warnings for the old syntax.
>
> - *Advanced JuMP users read this*: We have introduced a new syntax 
> for "anonymous" objects, which means that when declaring an optimization 
> variable, constraint, expression, or parameter, you may omit the name of 
> the object within the macro. The macro will instead return the object 
> itself which you can assign to a variable if you'd like. Example:
>
> # instead of @variable(m, l[i] <= x[i=1:N] <= u[i]):
> x = @variable(m, [i=1:N], lowerbound=l[i], upperbound=u[i]) 
>
> This syntax should be comfortable for advanced use cases of JuMP 
> (e.g., within a library) and should obviate some confusions about JuMP's 
> variable scoping rules.
>
> - We also have a new input form for nonlinear expressions that has the 
> potential to extend JuMP's scope as an AD tool. Previously all nonlinear 
> expressions needed to be input via macros, which isn't convenient if the 
> expression is generated programmatically. You can now set nonlinear 
> objectives and add nonlinear constraints by providing a Julia Expr 
> object directly with JuMP variables spliced in. This means that you can 
> now 
> generate expressions via symbolic manipulation 

[julia-users] Re: ANN: JuMP 0.14 released

2016-08-08 Thread Kristoffer Carlsson
It is more of an implementation detail but there is DualNumber now.

On Monday, August 8, 2016 at 7:27:42 PM UTC+2, Uwe Fechner wrote:
>
> Well, but in the upgrading guide there is no replacement for 
> GradientNumber mentioned.
>
> Any idea?
>
> Uwe
>
> On Monday, August 8, 2016 at 7:14:45 PM UTC+2, Miles Lubin wrote:
>>
>> ForwardDiff 0.2 introduced some breaking changes, you will need to update 
>> your code (GradientNumber is no longer defined). See the upgrading guide 
>> .
>>
>> On Monday, August 8, 2016 at 11:10:50 AM UTC-6, Uwe Fechner wrote:
>>>
>>> Hello,
>>> I updated, and now I get the following error:
>>> julia> include("Plotting.jl")
>>> INFO: Recompiling stale cache file 
>>> /home/ufechner/.julia/lib/v0.4/JuMP.ji for module JuMP.
>>> INFO: Recompiling stale cache file 
>>> /home/ufechner/.julia/lib/v0.4/ReverseDiffSparse.ji for module 
>>> ReverseDiffSparse.
>>> INFO: Recompiling stale cache file 
>>> /home/ufechner/.julia/lib/v0.4/ForwardDiff.ji for module ForwardDiff.
>>> INFO: Recompiling stale cache file 
>>> /home/ufechner/.julia/lib/v0.4/HDF5.ji for module HDF5.
>>> ERROR: LoadError: LoadError: LoadError: LoadError: UndefVarError: 
>>> GradientNumber not defined
>>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Projects.jl, 
>>> in expression starting on line 433
>>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Model.jl, in 
>>> expression starting on line 19
>>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Optimizer.jl, 
>>> in expression starting on line 13
>>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Plotting.jl, 
>>> in expression starting on line 22
>>>
>>> The code, that fails is the following:
>>> """
>>> Helper function to convert the value of an optimization results, but also
>>> simple real values.
>>> """
>>> my_value(value::ForwardDiff.GradientNumber) = ForwardDiff.value(value)
>>> my_value(value::Real) = value
>>> my_value(val_vector::Vector) = [my_value(value) for value in val_vector]
>>>
>>> Any idea how to fix this?
>>>
>>> Uwe
>>>
>>> On Monday, August 8, 2016 at 4:57:16 PM UTC+2, Miles Lubin wrote:

 The JuMP team is happy to announce the release of JuMP 0.14. The 
 release should clear most, if not all, deprecation warnings on Julia 0.5 
 and is compatible with ForwardDiff 0.2. The full release notes are here 
 ,
  
 and I'd just like to highlight a few points:

 - *All JuMP users read this*: As previously announced 
 , we 
 will be deprecating the sum{}, prod{}, and norm{} syntax in favor of using 
 Julia 0.5's new syntax for generator statements, e.g., sum(x[i] for i 
 in 1:N) instead of sum{x[i], i in 1:N}. In this release, the new 
 syntax is available for testing if using Julia 0.5. No deprecation 
 warnings 
 are printed yet. In JuMP 0.15, which will drop support for Julia 0.4, we 
 will begin printing deprecation warnings for the old syntax.

 - *Advanced JuMP users read this*: We have introduced a new syntax for 
 "anonymous" objects, which means that when declaring an optimization 
 variable, constraint, expression, or parameter, you may omit the name of 
 the object within the macro. The macro will instead return the object 
 itself which you can assign to a variable if you'd like. Example:

 # instead of @variable(m, l[i] <= x[i=1:N] <= u[i]):
 x = @variable(m, [i=1:N], lowerbound=l[i], upperbound=u[i]) 

 This syntax should be comfortable for advanced use cases of JuMP (e.g., 
 within a library) and should obviate some confusions about JuMP's variable 
 scoping rules.

 - We also have a new input form for nonlinear expressions that has the 
 potential to extend JuMP's scope as an AD tool. Previously all nonlinear 
 expressions needed to be input via macros, which isn't convenient if the 
 expression is generated programmatically. You can now set nonlinear 
 objectives and add nonlinear constraints by providing a Julia Expr 
 object directly with JuMP variables spliced in. This means that you can 
 now 
 generate expressions via symbolic manipulation and add them directly to a 
 JuMP model. See the example in the documentation 
 .

 Finally, I'd like to thank Joaquim Dias Garcia, Oscar Dowson, Mehdi 
 Madani, and Jarrett Revels for contributions to this release which are 
 cited in the release notes.

 Miles, Iain, and Joey




[julia-users] PyCall-ing Numba-dependent Libraries

2016-08-08 Thread Christoph Ortner
Has anybody managed to @pyimport a package that uses Numba? I've only found 
this discussion of how it fails. 

It seems related to LLVM versions, which in principle sounds easy enough to 
fix: either change the LLVM version in Julia or in Python. Maybe this is easy 
enough to do, but I tried and failed. I'd be interested to hear if anybody 
has managed. 







Re: [julia-users] GlobalRef(Module, func) vs. :(Module.func)

2016-08-08 Thread Andrei Zh
Thanks, this makes sense. Just for clarification: 
 

> It strongly depends on what you want to do and whether you care about what 
> they represent.
>

I want to apply transformations (mostly) to algebraic expressions and 
function calls in order to simplify them, replace argument names, find 
derivatives, etc. Thus I don't really care about the difference between 
GlobalRef and dot notation as long as they contain the same amount of 
information and I can evaluate them to actual function objects. But since 
there's a choice between two options, I was trying to understand which one is 
more suitable for me and whether I can convert one to the other (again, in 
the context of my simple transformations). 
 
Now, based on what you've said about optimization and 
`Base.uncompressed_ast()`, I believe these two options - GlobalRef and dot 
notation - are interchangeable for my use case. 
 

> The devdoc should have fairly complete description of what these nodes 
> are. 
>

What they are - yes, but not when to use which one. Anyway, I didn't mean to 
offend anyone; I was just trying to get my head around AST internals. 
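A quick sanity check that the two spellings resolve to the same function object, at least for plain module-qualified names:

```julia
ex_dot  = :(Base.sin)              # surface syntax: a getfield-style Expr
ex_gref = GlobalRef(Base, :sin)    # the form emitted by lowering

# Evaluating either yields the identical function object.
eval(ex_dot) === eval(ex_gref)     # true
```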

 

>  
>
>>
>>
>>
>> On Monday, August 8, 2016 at 1:55:51 AM UTC+3, Yichao Yu wrote:
>>>
>>>
>>>
>>> On Mon, Aug 8, 2016 at 3:57 AM, Andrei Zh  wrote:
>>>
 While parsing Julia expressions, I noticed that sometimes calls to 

>>>
>>> This shouldn't happen.
>>>  
>>>
 global functions resolve to `GlobalRef` and sometimes to 

>>>
>>> and GlobalRef should only happen during lowering.
>>>  
>>>
 getfield(Module, func). Could somebody please clarify:

 1. Why do we need both? 

>>>
>>> GlobalRef is essentially an optimization. It's more restricted and 
>>> easier to interpret/codegen and is emitted by lowering/type inference when 
>>> it is legal to do so.
>>>  
>>>
 2. Is it safe to replace one by the other (assuming only modules and 
 functions are involved)? 

>>>
>>> Not always. There are certainly cases where a GlobalRef currently can't 
>>> be used (method definition for example) I'm not sure if there's cases the 
>>> other way around.
>>>
>>>
>

[julia-users] Re: JuliaIO

2016-08-08 Thread Simon Danisch
I replied on github ;)

Best,
Simon

On Monday, August 8, 2016 at 7:54:14 PM UTC+2, David Anthoff wrote:
>
> Who is maintaining JuliaIO packages? It would be great if someone with 
> push rights could follow up on 
> https://github.com/JuliaIO/GZip.jl/issues/57. 
>
>  
>
> Thanks,
>
> David
>
>  
>
> --
>
> David Anthoff
>
> University of California, Berkeley
>
>  
>
> http://www.david-anthoff.com
>
>  
>


[julia-users] JuliaIO

2016-08-08 Thread David Anthoff
Who is maintaining JuliaIO packages? It would be great if someone with push
rights could follow up on https://github.com/JuliaIO/GZip.jl/issues/57. 

 

Thanks,

David

 

--

David Anthoff

University of California, Berkeley

 

http://www.david-anthoff.com

 



[julia-users] Re: ANN: JuMP 0.14 released

2016-08-08 Thread Uwe Fechner
Well, the upgrading guide doesn't mention a replacement for GradientNumber.

Any idea?

Uwe

On Monday, August 8, 2016 at 7:14:45 PM UTC+2, Miles Lubin wrote:
>
> ForwardDiff 0.2 introduced some breaking changes, you will need to update 
> your code (GradientNumber is no longer defined). See the upgrading guide 
> .
>
> On Monday, August 8, 2016 at 11:10:50 AM UTC-6, Uwe Fechner wrote:
>>
>> Hello,
>> I updated, and now I get the following error:
>> julia> include("Plotting.jl")
>> INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/JuMP.ji 
>> for module JuMP.
>> INFO: Recompiling stale cache file 
>> /home/ufechner/.julia/lib/v0.4/ReverseDiffSparse.ji for module 
>> ReverseDiffSparse.
>> INFO: Recompiling stale cache file 
>> /home/ufechner/.julia/lib/v0.4/ForwardDiff.ji for module ForwardDiff.
>> INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/HDF5.ji 
>> for module HDF5.
>> ERROR: LoadError: LoadError: LoadError: LoadError: UndefVarError: 
>> GradientNumber not defined
>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Projects.jl, in 
>> expression starting on line 433
>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Model.jl, in 
>> expression starting on line 19
>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Optimizer.jl, 
>> in expression starting on line 13
>> while loading /home/ufechner/00PythonSoftware/FastSim/src/Plotting.jl, in 
>> expression starting on line 22
>>
>> The code, that fails is the following:
>> """
>> Helper function to convert the value of an optimization results, but also
>> simple real values.
>> """
>> my_value(value::ForwardDiff.GradientNumber) = ForwardDiff.value(value)
>> my_value(value::Real) = value
>> my_value(val_vector::Vector) = [my_value(value) for value in val_vector]
>>
>> Any idea how to fix this?
>>
>> Uwe
>>
>> On Monday, August 8, 2016 at 4:57:16 PM UTC+2, Miles Lubin wrote:
>>>
>>> The JuMP team is happy to announce the release of JuMP 0.14. The release 
>>> should clear most, if not all, deprecation warnings on Julia 0.5 and is 
>>> compatible with ForwardDiff 0.2. The full release notes are here 
>>> ,
>>>  
>>> and I'd just like to highlight a few points:
>>>
>>> - *All JuMP users read this*: As previously announced 
>>> , we 
>>> will be deprecating the sum{}, prod{}, and norm{} syntax in favor of using 
>>> Julia 0.5's new syntax for generator statements, e.g., sum(x[i] for i 
>>> in 1:N) instead of sum{x[i], i in 1:N}. In this release, the new syntax 
>>> is available for testing if using Julia 0.5. No deprecation warnings are 
>>> printed yet. In JuMP 0.15, which will drop support for Julia 0.4, we will 
>>> begin printing deprecation warnings for the old syntax.
>>>
>>> - *Advanced JuMP users read this*: We have introduced a new syntax for 
>>> "anonymous" objects, which means that when declaring an optimization 
>>> variable, constraint, expression, or parameter, you may omit the name of 
>>> the object within the macro. The macro will instead return the object 
>>> itself which you can assign to a variable if you'd like. Example:
>>>
>>> # instead of @variable(m, l[i] <= x[i=1:N] <= u[i]):
>>> x = @variable(m, [i=1:N], lowerbound=l[i], upperbound=u[i]) 
>>>
>>> This syntax should be comfortable for advanced use cases of JuMP (e.g., 
>>> within a library) and should obviate some confusions about JuMP's variable 
>>> scoping rules.
>>>
>>> - We also have a new input form for nonlinear expressions that has the 
>>> potential to extend JuMP's scope as an AD tool. Previously all nonlinear 
>>> expressions needed to be input via macros, which isn't convenient if the 
>>> expression is generated programmatically. You can now set nonlinear 
>>> objectives and add nonlinear constraints by providing a Julia Expr 
>>> object directly with JuMP variables spliced in. This means that you can now 
>>> generate expressions via symbolic manipulation and add them directly to a 
>>> JuMP model. See the example in the documentation 
>>> .
>>>
>>> Finally, I'd like to thank Joaquim Dias Garcia, Oscar Dowson, Mehdi 
>>> Madani, and Jarrett Revels for contributions to this release which are 
>>> cited in the release notes.
>>>
>>> Miles, Iain, and Joey
>>>
>>>
>>>
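
For reference, the dispatch pattern being fixed can be sketched self-containedly. In ForwardDiff 0.2 the old `GradientNumber` was replaced by a new dual-number type (`ForwardDiff.Dual`, as I understand the upgrading guide; treat that name as an assumption and check the guide). The stand-in type below is hypothetical and written for a current Julia (`struct`; on 0.4/0.5 use `immutable`); it only demonstrates the same multiple-dispatch fallback:

```julia
# Hypothetical stand-in for ForwardDiff's dual-number type, used only to
# illustrate the my_value dispatch pattern from the thread.
struct FakeDual <: Real
    value::Float64     # primal value
    partial::Float64   # derivative part
end

my_value(v::FakeDual) = v.value                  # extract the primal value
my_value(v::Real) = v                            # plain reals pass through
my_value(vs::Vector) = [my_value(v) for v in vs] # map over vectors

println(my_value(FakeDual(2.0, 1.0)))   # 2.0
println(my_value(3))                    # 3
```

With ForwardDiff 0.2 installed, the corresponding fix would be to replace `ForwardDiff.GradientNumber` with the new dual type in the method signature and keep the rest of the code unchanged.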

[julia-users] Re: ANN: JuMP 0.14 released

2016-08-08 Thread Miles Lubin
ForwardDiff 0.2 introduced some breaking changes, you will need to update 
your code (GradientNumber is no longer defined). See the upgrading guide 
.

On Monday, August 8, 2016 at 11:10:50 AM UTC-6, Uwe Fechner wrote:
>
> Hello,
> I updated, and now I get the following error:
> julia> include("Plotting.jl")
> INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/JuMP.ji 
> for module JuMP.
> INFO: Recompiling stale cache file 
> /home/ufechner/.julia/lib/v0.4/ReverseDiffSparse.ji for module 
> ReverseDiffSparse.
> INFO: Recompiling stale cache file 
> /home/ufechner/.julia/lib/v0.4/ForwardDiff.ji for module ForwardDiff.
> INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/HDF5.ji 
> for module HDF5.
> ERROR: LoadError: LoadError: LoadError: LoadError: UndefVarError: 
> GradientNumber not defined
> while loading /home/ufechner/00PythonSoftware/FastSim/src/Projects.jl, in 
> expression starting on line 433
> while loading /home/ufechner/00PythonSoftware/FastSim/src/Model.jl, in 
> expression starting on line 19
> while loading /home/ufechner/00PythonSoftware/FastSim/src/Optimizer.jl, in 
> expression starting on line 13
> while loading /home/ufechner/00PythonSoftware/FastSim/src/Plotting.jl, in 
> expression starting on line 22
>
> The code, that fails is the following:
> """
> Helper function to convert the value of an optimization results, but also
> simple real values.
> """
> my_value(value::ForwardDiff.GradientNumber) = ForwardDiff.value(value)
> my_value(value::Real) = value
> my_value(val_vector::Vector) = [my_value(value) for value in val_vector]
>
> Any idea how to fix this?
>
> Uwe
>
> On Monday, August 8, 2016 at 4:57:16 PM UTC+2, Miles Lubin wrote:
>>
>> The JuMP team is happy to announce the release of JuMP 0.14. The release 
>> should clear most, if not all, deprecation warnings on Julia 0.5 and is 
>> compatible with ForwardDiff 0.2. The full release notes are here 
>> ,
>>  
>> and I'd just like to highlight a few points:
>>
>> - *All JuMP users read this*: As previously announced 
>> , we 
>> will be deprecating the sum{}, prod{}, and norm{} syntax in favor of using 
>> Julia 0.5's new syntax for generator statements, e.g., sum(x[i] for i in 
>> 1:N) instead of sum{x[i], i in 1:N}. In this release, the new syntax is 
>> available for testing if using Julia 0.5. No deprecation warnings are 
>> printed yet. In JuMP 0.15, which will drop support for Julia 0.4, we will 
>> begin printing deprecation warnings for the old syntax.
>>
>> - *Advanced JuMP users read this*: We have introduced a new syntax for 
>> "anonymous" objects, which means that when declaring an optimization 
>> variable, constraint, expression, or parameter, you may omit the name of 
>> the object within the macro. The macro will instead return the object 
>> itself which you can assign to a variable if you'd like. Example:
>>
>> # instead of @variable(m, l[i] <= x[i=1:N] <= u[i]):
>> x = @variable(m, [i=1:N], lowerbound=l[i], upperbound=u[i]) 
>>
>> This syntax should be comfortable for advanced use cases of JuMP (e.g., 
>> within a library) and should obviate some confusions about JuMP's variable 
>> scoping rules.
>>
>> - We also have a new input form for nonlinear expressions that has the 
>> potential to extend JuMP's scope as an AD tool. Previously all nonlinear 
>> expressions needed to be input via macros, which isn't convenient if the 
>> expression is generated programmatically. You can now set nonlinear 
>> objectives and add nonlinear constraints by providing a Julia Expr 
>> object directly with JuMP variables spliced in. This means that you can now 
>> generate expressions via symbolic manipulation and add them directly to a 
>> JuMP model. See the example in the documentation 
>> .
>>
>> Finally, I'd like to thank Joaquim Dias Garcia, Oscar Dowson, Mehdi 
>> Madani, and Jarrett Revels for contributions to this release which are 
>> cited in the release notes.
>>
>> Miles, Iain, and Joey
>>
>>
>>

[julia-users] Re: ANN: JuMP 0.14 released

2016-08-08 Thread Uwe Fechner
Hello,
I updated, and now I get the following error:
julia> include("Plotting.jl")
INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/JuMP.ji 
for module JuMP.
INFO: Recompiling stale cache file 
/home/ufechner/.julia/lib/v0.4/ReverseDiffSparse.ji for module 
ReverseDiffSparse.
INFO: Recompiling stale cache file 
/home/ufechner/.julia/lib/v0.4/ForwardDiff.ji for module ForwardDiff.
INFO: Recompiling stale cache file /home/ufechner/.julia/lib/v0.4/HDF5.ji 
for module HDF5.
ERROR: LoadError: LoadError: LoadError: LoadError: UndefVarError: 
GradientNumber not defined
while loading /home/ufechner/00PythonSoftware/FastSim/src/Projects.jl, in 
expression starting on line 433
while loading /home/ufechner/00PythonSoftware/FastSim/src/Model.jl, in 
expression starting on line 19
while loading /home/ufechner/00PythonSoftware/FastSim/src/Optimizer.jl, in 
expression starting on line 13
while loading /home/ufechner/00PythonSoftware/FastSim/src/Plotting.jl, in 
expression starting on line 22

The code, that fails is the following:
"""
Helper function to convert the value of an optimization results, but also
simple real values.
"""
my_value(value::ForwardDiff.GradientNumber) = ForwardDiff.value(value)
my_value(value::Real) = value
my_value(val_vector::Vector) = [my_value(value) for value in val_vector]

Any idea how to fix this?

Uwe

On Monday, August 8, 2016 at 4:57:16 PM UTC+2, Miles Lubin wrote:
>
> The JuMP team is happy to announce the release of JuMP 0.14. The release 
> should clear most, if not all, deprecation warnings on Julia 0.5 and is 
> compatible with ForwardDiff 0.2. The full release notes are here 
> ,
>  
> and I'd just like to highlight a few points:
>
> - *All JuMP users read this*: As previously announced 
> , we 
> will be deprecating the sum{}, prod{}, and norm{} syntax in favor of using 
> Julia 0.5's new syntax for generator statements, e.g., sum(x[i] for i in 
> 1:N) instead of sum{x[i], i in 1:N}. In this release, the new syntax is 
> available for testing if using Julia 0.5. No deprecation warnings are 
> printed yet. In JuMP 0.15, which will drop support for Julia 0.4, we will 
> begin printing deprecation warnings for the old syntax.
>
> - *Advanced JuMP users read this*: We have introduced a new syntax for 
> "anonymous" objects, which means that when declaring an optimization 
> variable, constraint, expression, or parameter, you may omit the name of 
> the object within the macro. The macro will instead return the object 
> itself which you can assign to a variable if you'd like. Example:
>
> # instead of @variable(m, l[i] <= x[i=1:N] <= u[i]):
> x = @variable(m, [i=1:N], lowerbound=l[i], upperbound=u[i]) 
>
> This syntax should be comfortable for advanced use cases of JuMP (e.g., 
> within a library) and should obviate some confusions about JuMP's variable 
> scoping rules.
>
> - We also have a new input form for nonlinear expressions that has the 
> potential to extend JuMP's scope as an AD tool. Previously all nonlinear 
> expressions needed to be input via macros, which isn't convenient if the 
> expression is generated programmatically. You can now set nonlinear 
> objectives and add nonlinear constraints by providing a Julia Expr object 
> directly with JuMP variables spliced in. This means that you can now 
> generate expressions via symbolic manipulation and add them directly to a 
> JuMP model. See the example in the documentation 
> .
>
> Finally, I'd like to thank Joaquim Dias Garcia, Oscar Dowson, Mehdi 
> Madani, and Jarrett Revels for contributions to this release which are 
> cited in the release notes.
>
> Miles, Iain, and Joey
>
>
>

Re: [julia-users] Plots with scale bars

2016-08-08 Thread Tom Breloff
There's a bunch of ways to do this with Plots. Draw the lines and add
annotations, or add an inset subplot with labels? I'm not at a computer...
maybe I'll throw together an example later.

On Monday, August 8, 2016, Islam Badreldin 
wrote:

>
>
> Hello,
>
> Is there a simple way in Julia to add scale bars with labels to plots and
> to hide the x-y axes as well? The way to do in MATLAB involves a lot of
> manual tweaking as described here
> http://www.mathworks.com/matlabcentral/answers/151248-add-a-scale-bar-to-my-plot
>
> I'm hoping to find a more elegant way in Julia!
>
> Thanks,
> Islam
>


[julia-users] Plots with scale bars

2016-08-08 Thread Islam Badreldin


Hello,

Is there a simple way in Julia to add scale bars with labels to plots and 
to hide the x-y axes as well? The way to do in MATLAB involves a lot of 
manual tweaking as described here
http://www.mathworks.com/matlabcentral/answers/151248-add-a-scale-bar-to-my-plot

I'm hoping to find a more elegant way in Julia!

Thanks,
Islam
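
A hedged sketch of one approach with Plots.jl: hide the frame, then draw the scale bars as ordinary line segments with text annotations. Attribute names such as `framestyle`, and the `range`/broadcast syntax, come from a newer Plots/Julia than the ones current in this thread, so treat them as assumptions; the bar lengths and labels ("2 s", "0.5 mV") are made-up placeholders.

```julia
using Plots

x = range(0, 10, length=200)      # on Julia 0.4/0.5: linspace(0, 10, 200)
plot(x, sin.(x), legend=false, framestyle=:none)   # no axes, no frame

# Horizontal scale bar spanning 2 x-units, plus a vertical one of 0.5 y-units.
plot!([0, 2], [-1.3, -1.3], color=:black, linewidth=3)
plot!([0, 0], [-1.3, -0.8], color=:black, linewidth=3)
annotate!(1.0, -1.45, text("2 s", 8))
annotate!(-0.5, -1.05, text("0.5 mV", 8))
```

Because the bars are just plot series, they scale, pan, and export with the rest of the figure, which avoids most of the manual tweaking the MATLAB approach needs.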


Re: [julia-users] How do I write `import Base.!` in julia 0.5?

2016-08-08 Thread Scott T
I checked Compat, but it doesn't work with import or using:

julia> @compat import Base.:!
ERROR: syntax: invalid "import" statement: expected identifier

julia> import @compat Base.:!
ERROR: ArgumentError: @compat not found in path.
Run Pkg.add("@compat") to install the @compat package
 in require(::Symbol) at ./loading.jl:346

On Monday, 8 August 2016 16:56:34 UTC+1, Jacob Quinn wrote:
>
> There's also a Compat.jl entry for this, see the first bullet point in the 
> documentation on the README. That way, you can just do 
>
> @compat Base.:!
>
> and it will be valid for both 0.4/0.5.
>
> https://github.com/JuliaLang/Compat.jl
>
> -Jacob
>
>
> On Mon, Aug 8, 2016 at 9:04 AM, Scott T  > wrote:
>
>> Great, thanks. I found that 
>> import Base: !
>> also works and is also compatible with 0.4/0.5. 
>>
>> On Monday, 8 August 2016 15:26:46 UTC+1, Kevin Squire wrote:
>>>
>>> Try
>>>
>>>   import Base.(!)
>>>
>>> Cheers,
>>>   Kevin 
>>>
>>> On Monday, August 8, 2016, Scott T  wrote:
>>>
 In 0.4 I would write: import Base.!
 The syntax Base.:! is not yet supported.

 In 0.5:

 julia> import Base.:!
 ERROR: syntax: invalid "import" statement: expected identifier

 julia> import Base.!
 ERROR: syntax: invalid operator ".!"

 What am I missing here?

>>>
>
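
To round this out, a small sketch of why one imports `!` at all: the `import Base: !` form (which works on both 0.4 and 0.5) brings the operator into scope so you can add methods for your own types. Written for a current Julia (`struct`; on 0.4/0.5 use `immutable`), with `Flag` a made-up toy type:

```julia
import Base: !    # bring Base's `!` into scope so we can extend it

struct Flag
    on::Bool
end

!(f::Flag) = Flag(!f.on)   # negating a Flag flips its state

println(!Flag(true))   # Flag(false)
```

Without the import, the method definition would create a new, local `!` that shadows Base's instead of extending it.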

Re: [julia-users] How do I write `import Base.!` in julia 0.5?

2016-08-08 Thread Jacob Quinn
There's also a Compat.jl entry for this, see the first bullet point in the
documentation on the README. That way, you can just do

@compat Base.:!

and it will be valid for both 0.4/0.5.

https://github.com/JuliaLang/Compat.jl

-Jacob


On Mon, Aug 8, 2016 at 9:04 AM, Scott T  wrote:

> Great, thanks. I found that
> import Base: !
> also works and is also compatible with 0.4/0.5.
>
> On Monday, 8 August 2016 15:26:46 UTC+1, Kevin Squire wrote:
>>
>> Try
>>
>>   import Base.(!)
>>
>> Cheers,
>>   Kevin
>>
>> On Monday, August 8, 2016, Scott T  wrote:
>>
>>> In 0.4 I would write: import Base.!
>>> The syntax Base.:! is not yet supported.
>>>
>>> In 0.5:
>>>
>>> julia> import Base.:!
>>> ERROR: syntax: invalid "import" statement: expected identifier
>>>
>>> julia> import Base.!
>>> ERROR: syntax: invalid operator ".!"
>>>
>>> What am I missing here?
>>>
>>


[julia-users] [HELP] I am a beginner and I need some guidance in writing Optimised Code in Julia rather than writing Julia Programs in C Style.

2016-08-08 Thread Rishabh Raghunath
Hello guys, 
I am a beginner in the Julia programming language, and I absolutely love 
it! 

If you can, please evaluate my code and point out the things I could 
implement better. 
Being good at C, I feel I am bringing the C style into my Julia program. 
I want it to be optimised for Julia and to do things the right way in 
Julia. 
It would be really helpful if anyone could tell me how to get things done, 
and which useful functions I could use to implement the attached program 
more efficiently. 

Also, how do I get my input directly as an integer (or any other specific 
datatype) rather than reading it as a string and then converting it? 
Is there an alternative to getting integers this way: 
a=parse(Int64, readline(STDIN))
Is this even the right way to do it? 
It would be helpful if you could highlight the non-ideal code and give me 
replacements for the same. 

*I've attached my first Julia program with this post*

Thanks a ton!


juliatest.jl
Description: Binary data
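
On the specific question of reading an integer directly: Julia reads lines as strings, so parsing is the idiomatic route, and `tryparse` makes it robust to bad input. A sketch for a current Julia (lowercase `stdin`; on 0.4/0.5 the stream is `STDIN` and `tryparse` returns a `Nullable`, so the check differs):

```julia
# Keep prompting until the line parses as an Int.
function readint(io::IO=stdin)
    while true
        line = readline(io)
        n = tryparse(Int, strip(line))
        n !== nothing && return n        # got a valid integer
        println("Please enter an integer:")
    end
end

# Works with any IO, so it can be exercised without a terminal:
println(readint(IOBuffer("not a number\n42\n")))  # prompts once, then 42
```

Calling `readint()` with no argument reads from the terminal, which is the interactive use case from the question.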


[julia-users] Re: I get this error while trying to run *.jl file from Terminal ERROR: LoadError: UndefVarError: input not defined

2016-08-08 Thread Rishabh Raghunath
Cool!! Thanks a ton, Ján Adamčák, for helping me out!!

On Monday, August 8, 2016 at 2:39:07 AM UTC+5:30, Rishabh Raghunath wrote:
>
>
>
> Hello,
> I am a beginner in the Julia language .. 
> I get this error while I try to run .jl Julia program file from the 
> terminal. However It works perfectly fine in the Juno IDE. This following 
> is the error message I receive while trying to run the program from the 
> Linux terminal:
>
>
> ERROR: LoadError: UndefVarError: input not defined
> in legacy at /home/rishabh/Desktop/ProjectJulia/juliatest.jl:120
> while loading /home/rishabh/Desktop/ProjectJulia/juliatest.jl, in 
> expression starting on line 138
>
> How do I resolve this and run .jl file in the terminal rather than in Juno 
> ?
> Thanks In advance
>



[julia-users] Re: I get this error while trying to run *.jl file from Terminal ERROR: LoadError: UndefVarError: input not defined

2016-08-08 Thread Ján Adamčák
I am only a C coder, too. I don't think I am a good teacher for you; I can 
only fix broken things... 

Good Luck ;)

On Monday, 8 August 2016 at 16:43:04 UTC+2, Rishabh Raghunath wrote:
>
> Awesome, thanks a lot!! It worked. 
> If you can, please evaluate my code and point out the things I could 
> implement better. 
> Being good at C, I feel I am bringing the C style into my Julia program. 
> I want it to be optimised and to do things the right way in Julia. 
> How do I get my input directly as an integer (or any specific datatype) 
> rather than as a string that I then convert to the datatype I want? 
> It would be helpful if you could highlight the non-ideal code and give me 
> a replacement for the same. 
> Thanks
>
> On Monday, August 8, 2016 at 2:39:07 AM UTC+5:30, Rishabh Raghunath wrote:
>>
>>
>>
>> Hello,
>> I am a beginner in the Julia language .. 
>> I get this error while I try to run .jl Julia program file from the 
>> terminal. However It works perfectly fine in the Juno IDE. This following 
>> is the error message I receive while trying to run the program from the 
>> Linux terminal:
>>
>>
>> ERROR: LoadError: UndefVarError: input not defined
>> in legacy at /home/rishabh/Desktop/ProjectJulia/juliatest.jl:120
>> while loading /home/rishabh/Desktop/ProjectJulia/juliatest.jl, in 
>> expression starting on line 138
>>
>> How do I resolve this and run .jl file in the terminal rather than in 
>> Juno ?
>> Thanks In advance
>>
>

Re: [julia-users] How do I write `import Base.!` in julia 0.5?

2016-08-08 Thread Scott T
Great, thanks. I found that 
import Base: !
also works and is also compatible with 0.4/0.5. 

On Monday, 8 August 2016 15:26:46 UTC+1, Kevin Squire wrote:
>
> Try
>
>   import Base.(!)
>
> Cheers,
>   Kevin 
>
> On Monday, August 8, 2016, Scott T  
> wrote:
>
>> In 0.4 I would write: import Base.!
>> The syntax Base.:! is not yet supported.
>>
>> In 0.5:
>>
>> julia> import Base.:!
>> ERROR: syntax: invalid "import" statement: expected identifier
>>
>> julia> import Base.!
>> ERROR: syntax: invalid operator ".!"
>>
>> What am I missing here?
>>
>

[julia-users] ANN: JuMP 0.14 released

2016-08-08 Thread Miles Lubin
The JuMP team is happy to announce the release of JuMP 0.14. The release
should clear most, if not all, deprecation warnings on Julia 0.5 and is
compatible with ForwardDiff 0.2. The full release notes are here
,
and I'd just like to highlight a few points:

- *All JuMP users read this*: As previously announced
, we
will be deprecating the sum{}, prod{}, and norm{} syntax in favor of using
Julia 0.5's new syntax for generator statements, e.g., sum(x[i] for i in
1:N) instead of sum{x[i], i in 1:N}. In this release, the new syntax is
available for testing if using Julia 0.5. No deprecation warnings are
printed yet. In JuMP 0.15, which will drop support for Julia 0.4, we will
begin printing deprecation warnings for the old syntax.

- *Advanced JuMP users read this*: We have introduced a new syntax for
"anonymous" objects, which means that when declaring an optimization
variable, constraint, expression, or parameter, you may omit the name of
the object within the macro. The macro will instead return the object
itself which you can assign to a variable if you'd like. Example:

# instead of @variable(m, l[i] <= x[i=1:N] <= u[i]):
x = @variable(m, [i=1:N], lowerbound=l[i], upperbound=u[i])

This syntax should be comfortable for advanced use cases of JuMP (e.g.,
within a library) and should obviate some confusions about JuMP's variable
scoping rules.

- We also have a new input form for nonlinear expressions that has the
potential to extend JuMP's scope as an AD tool. Previously all nonlinear
expressions needed to be input via macros, which isn't convenient if the
expression is generated programmatically. You can now set nonlinear
objectives and add nonlinear constraints by providing a Julia Expr object
directly with JuMP variables spliced in. This means that you can now
generate expressions via symbolic manipulation and add them directly to a
JuMP model. See the example in the documentation
.

Finally, I'd like to thank Joaquim Dias Garcia, Oscar Dowson, Mehdi Madani,
and Jarrett Revels for contributions to this release which are cited in the
release notes.

Miles, Iain, and Joey


Re: [julia-users] I cant view Plots in Juno - Atom IDE

2016-08-08 Thread Chris Rackauckas
I agree. I probably answer this question at least once a day. I don't find 
it intuitive either; I just know the answer. This should either be covered 
early in the documentation, or displaying should be on by default. That 
would match things like R or MATLAB, where people are used to scripts just 
generating the plots and throwing them in the plot pane (and having the 
option to turn that off would still be nice, and is already implemented).

On Monday, August 8, 2016 at 4:35:29 AM UTC-7, Tom Breloff wrote:
>
> I think this behavior should be changed to match the REPL... There have 
> been too many questions like this
>
> On Monday, August 8, 2016, Chris Rackauckas  > wrote:
>
>> Use the command `gui()`
>>
>> On Sunday, August 7, 2016 at 9:32:29 PM UTC-7, Erick J Zagal wrote:
>>>
>>> I have this code:
>>>
>>> using Plots
>>>
>>> x = linspace(0, 10, 200)
>>> y = sin(x)
>>> plot(x, y, color=:blue, linewidth=2, label="sine")
>>>
>>> When I run this in the console it shows the plot, but in Juno it only 
>>> shows [Plots.jl] Initializing backend: plotly
>>> and no plot appears. 
>>> I've added a screenshot.
>>>
>>> Help
>>>
>>

[julia-users] Re: I get this error while trying to run *.jl file from Terminal ERROR: LoadError: UndefVarError: input not defined

2016-08-08 Thread Rishabh Raghunath
Awesome, thanks a lot!! It worked. 
If you can, please evaluate my code and point out the things I could 
implement better. 
Being good at C, I feel I am bringing the C style into my Julia program. 
I want it to be optimised and to do things the right way in Julia. 
How do I get my input directly as an integer (or any specific datatype) 
rather than as a string that I then convert to the datatype I want? 
It would be helpful if you could highlight the non-ideal code and give me 
a replacement for the same. 
Thanks

On Monday, August 8, 2016 at 2:39:07 AM UTC+5:30, Rishabh Raghunath wrote:
>
>
>
> Hello,
> I am a beginner in the Julia language .. 
> I get this error while I try to run .jl Julia program file from the 
> terminal. However It works perfectly fine in the Juno IDE. This following 
> is the error message I receive while trying to run the program from the 
> Linux terminal:
>
>
> ERROR: LoadError: UndefVarError: input not defined
> in legacy at /home/rishabh/Desktop/ProjectJulia/juliatest.jl:120
> while loading /home/rishabh/Desktop/ProjectJulia/juliatest.jl, in 
> expression starting on line 138
>
> How do I resolve this and run .jl file in the terminal rather than in Juno 
> ?
> Thanks In advance
>


Re: [julia-users] How do I write `import Base.!` in julia 0.5?

2016-08-08 Thread Kevin Squire
Try

  import Base.(!)

Cheers,
  Kevin

On Monday, August 8, 2016, Scott T  wrote:

> In 0.4 I would write: import Base.!
> The syntax Base.:! is not yet supported.
>
> In 0.5:
>
> julia> import Base.:!
> ERROR: syntax: invalid "import" statement: expected identifier
>
> julia> import Base.!
> ERROR: syntax: invalid operator ".!"
>
> What am I missing here?
>


[julia-users] Re: I get this error while trying to run *.jl file from Terminal ERROR: LoadError: UndefVarError: input not defined

2016-08-08 Thread Ján Adamčák
Hi, you are using the function 
input()

, but this function does not exist in Julia. You can use 
readline(STDIN)

instead of input(), or you can insert 
function input() 
  readline(STDIN) 
end
at the top of your file.

Have a nice day with Julia ;)


On Monday, 8 August 2016 at 15:58:49 UTC+2, Rishabh Raghunath wrote:
>
> Thanks for replying
> I've attached the file with this reply..
> It works in Juno.. but not via the terminal and get the before said error
>
> On Monday, August 8, 2016 at 2:39:07 AM UTC+5:30, Rishabh Raghunath wrote:
>>
>>
>>
>> Hello,
>> I am a beginner in the Julia language .. 
>> I get this error while I try to run .jl Julia program file from the 
>> terminal. However It works perfectly fine in the Juno IDE. This following 
>> is the error message I receive while trying to run the program from the 
>> Linux terminal:
>>
>>
>> ERROR: LoadError: UndefVarError: input not defined
>> in legacy at /home/rishabh/Desktop/ProjectJulia/juliatest.jl:120
>> while loading /home/rishabh/Desktop/ProjectJulia/juliatest.jl, in 
>> expression starting on line 138
>>
>> How do I resolve this and run .jl file in the terminal rather than in 
>> Juno ?
>> Thanks In advance
>>
>

[julia-users] How do I write `import Base.!` in julia 0.5?

2016-08-08 Thread Scott T
In 0.4 I would write: import Base.!
The syntax Base.:! is not yet supported.

In 0.5:

julia> import Base.:!
ERROR: syntax: invalid "import" statement: expected identifier

julia> import Base.!
ERROR: syntax: invalid operator ".!"

What am I missing here?


[julia-users] Re: I get this error while trying to run *.jl file from Terminal ERROR: LoadError: UndefVarError: input not defined

2016-08-08 Thread Rishabh Raghunath
Thanks for replying
I've attached the file with this reply..
It works in Juno.. but not via the terminal and get the before said error

On Monday, August 8, 2016 at 2:39:07 AM UTC+5:30, Rishabh Raghunath wrote:
>
>
>
> Hello,
> I am a beginner in the Julia language .. 
> I get this error while I try to run .jl Julia program file from the 
> terminal. However It works perfectly fine in the Juno IDE. This following 
> is the error message I receive while trying to run the program from the 
> Linux terminal:
>
>
> ERROR: LoadError: UndefVarError: input not defined
> in legacy at /home/rishabh/Desktop/ProjectJulia/juliatest.jl:120
> while loading /home/rishabh/Desktop/ProjectJulia/juliatest.jl, in 
> expression starting on line 138
>
> How do I resolve this and run .jl file in the terminal rather than in Juno 
> ?
> Thanks In advance
>


test.jl
Description: Binary data


[julia-users] Re: DifferentialEquations

2016-08-08 Thread Henri Girard
OK, I found it... It's the system-of-equations example.

On Monday, 8 August 2016 at 12:57:34 UTC+2, Henri Girard wrote:
>
> Hi,
> I read somewhere in your doc about the way to solve DEs in a matrix... 
> Sorry, but I can't remember which example; I have been looking back but 
> can't find it again. 
> Could you guide me to this ?
> Regards
> Henri
>


Re: [julia-users] I cant view Plots in Juno - Atom IDE

2016-08-08 Thread Tom Breloff
I think this behavior should be changed to match the REPL... There have
been too many questions like this

On Monday, August 8, 2016, Chris Rackauckas  wrote:

> Use the command `gui()`
>
> On Sunday, August 7, 2016 at 9:32:29 PM UTC-7, Erick J Zagal wrote:
>>
>> I have this code:
>>
>> using Plots
>>
>> x = linspace(0, 10, 200)
>> y = sin(x)
>> plot(x, y, color=:blue, linewidth=2, label="sine")
>>
>> When I run this in the console it shows the plot, but in Juno it only
>> shows [Plots.jl] Initializing backend: plotly
>> and no plot appears.
>> I've added a screenshot.
>>
>> Help
>>
>


[julia-users] DifferentialEquations

2016-08-08 Thread Henri Girard
Hi,
I read somewhere in your doc about the way to solve DEs in a matrix... 
Sorry, but I can't remember which example; I have been looking back but 
can't find it again. 
Could you guide me to this?
Regards
Henri


[julia-users] Re: I cant view Plots in Juno - Atom IDE

2016-08-08 Thread Chris Rackauckas
Use the command `gui()`

On Sunday, August 7, 2016 at 9:32:29 PM UTC-7, Erick J Zagal wrote:
>
> I have this code:
>
> using Plots
>
> x = linspace(0, 10, 200)
> y = sin(x)
> plot(x, y, color=:blue, linewidth=2, label="sine")
>
> When I run this in the console it shows the plot, but in Juno it only 
> shows [Plots.jl] Initializing backend: plotly
> and no plot appears. 
> I've added a screenshot.
>
> Help
>