[julia-users] Re: Runtime inference of a function's return type

2015-06-23 Thread Fábio Cardeal
That's handy, thanks! And alright, I will keep that in mind.


[julia-users] Runtime inference of a function's return type

2015-06-23 Thread Daniel Høegh
You can use Julia's type inference as: 

Base.return_types(max,(Int, Float64))

But remember, if the function accesses any non-constant global variables, the 
inferred return type will most likely be Any.
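For example (the exact output is version-dependent, since return_types is a reflection utility rather than a stable API):

```julia
# Ask the compiler's inference what max(::Int, ::Float64) can return.
# One entry is produced per applicable method.
rts = Base.return_types(max, (Int, Float64))
# For this call, inference concludes the result is a Float64.
```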

[julia-users] matlab-like textscan function?

2015-06-23 Thread Garrett Jenkinson
http://www.mathworks.com/help/matlab/ref/textscan.html

Basically I am looking for something that does the opposite of the @printf 
macro, i.e., it reads in lines from a text file that has a specific format 
specifier. readdlm does not seem to do what I want (or maybe I'm not using 
it properly). As a specific example, say I have a file that is in a 
"bedgraph" format (.bed) that would look something like this:

chr1 1 4 1.5
chr1 6 10 2.7
chr2 70 230 6.4

I would like to specify that the file is formatted as "chr%u %u %u %f" and 
to receive four vectors (3 Ints and 1 Float):

[1,1,2]
[1,6,70]
[4,10,230]
[1.5,2.7,6.4]

Does anyone know of a function that has this feature? If not, is there an 
easy way to do this with existing functions/macros?

Thanks!
Garrett

P.S. Sorry if this is a duplicate, my previous post didn't seem to go 
through.
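(Editor's note: for what it's worth, one way to get close with only Base functions is a hand-rolled reader. This is just a sketch — read_bedgraph is a made-up name, the "chr" prefix is stripped manually rather than via a format string, and it uses current Julia syntax such as parse(Int, s) rather than the 0.3-era int(s).)

```julia
# Sketch: parse bedgraph-style lines into four column vectors.
function read_bedgraph(io::IO)
    chrs, starts, stops = Int[], Int[], Int[]
    vals = Float64[]
    for line in eachline(io)
        isempty(strip(line)) && continue
        f = split(line)                                # split on whitespace
        push!(chrs,   parse(Int, replace(f[1], "chr" => "")))
        push!(starts, parse(Int, f[2]))
        push!(stops,  parse(Int, f[3]))
        push!(vals,   parse(Float64, f[4]))
    end
    return chrs, starts, stops, vals
end

c, s, e, v = read_bedgraph(IOBuffer("chr1 1 4 1.5\nchr1 6 10 2.7\nchr2 70 230 6.4\n"))
```

The same function works on a file via open(read_bedgraph, "file.bed").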


[julia-users] Runtime inference of a function's return type

2015-06-23 Thread Fábio Cardeal
Is there a good way to find a function's return type with only the types of 
its arguments?
I mean, I can do something like:


 codetyp(code) = code.args[3].typ

 function returntype(f::Function, types::Tuple)
     if isgeneric(f)
         mapreduce(codetyp, Union, code_typed(f, types))::Type
     else
         Any
     end
 end

 function returntype(f, types::Tuple)
     returntype(call, (typeof(f), types...))
 end



 julia> returntype(identity, (Int,))
 Int64

 julia> returntype(returntype, (Function, Tuple))
 Type{T} 


But is this really reliable? Is there a way to do this other than directly 
accessing the function's internals?
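(Editor's note for later readers: the internals above are 0.3-specific. The same "union over inferred method returns" idea can be written on top of Base.return_types — itself inference-based, so still no stability guarantee. A sketch, with returntype2 a made-up name:)

```julia
# Join the inferred return types of all applicable methods into one type.
returntype2(f, types::Tuple) =
    reduce(typejoin, Base.return_types(f, types); init=Union{})
```

For instance, returntype2(identity, (Int,)) yields Int64.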


[julia-users] matlab-like textscan function?

2015-06-23 Thread Garrett Jenkinson
Sorry if this is an overly basic question, but I searched around the 
documentation and the user group questions and have not been able to find 
an answer. I am wondering if there is a way in Julia to read from a formatted 
text file, in the same way as matlab's textscan function:

http://www.mathworks.com/help/matlab/ref/textscan.html

readdlm does not seem to do what I am looking for (or maybe I'm using it 
wrong!). Suppose I have data coming from a bed file, which is formatted 
like this:

chr1  500  34543   1.433
chr1  46546  3543   4.68
chr2  4543  34456  6.3545

It would be nice to specify the format "chr%u %u %u %f" and to get four 
vectors (three with Ints and one with Floats):

[1,1,2]
[500,46546,4543]
[34543,3543,34456]
[1.433,4.68,6.3545]

Is there a function to do this? If not, is there a simple way to do this 
with the functions that are available? 

Thanks!
Garrett

P.S. I know that @printf basically allows the opposite of this to be done 
(i.e., to write out to a file by specifying a format). Basically, my 
question is whether there exists an equivalent @readf to read in something that 
was produced by @printf?


Re: [julia-users] Re: Any function to generate code in String from Expr?

2015-06-23 Thread jiyinyiyong
I think it's exactly the function I need! Thanks.

Turns out I asked this question in a bad way...

On Wed, Jun 24, 2015 at 10:56 AM Isaiah Norton 
wrote:

> a = string(parse("1 + 1"))
>
> On Tue, Jun 23, 2015 at 10:52 PM, jiyinyiyong 
> wrote:
>
>> That would be nice. But it turns out to be "not defined" when I call `a =
>> parse("1 + 1"); inverse(a)`. Is that a package or something?
>>
>> On Wed, Jun 24, 2015 at 4:44 AM Steven G. Johnson 
>> wrote:
>>
>>>
>>> On Sunday, June 21, 2015 at 1:31:22 AM UTC-4, Jiyin Yiyong wrote:
>>>
 As described in http://blog.leahhanson.us/julia-introspects.html Julia
 parses code to AST.
 But is there function provided to generate code back?

>>>
>>> If you just want a parseable string from an AST, i.e. the "inverse" of
>>> the ast=parse(string) function, you can just use string(ast).
>>>
>>
>


Re: [julia-users] Re: Any function to generate code in String from Expr?

2015-06-23 Thread Isaiah Norton
a = string(parse("1 + 1"))

On Tue, Jun 23, 2015 at 10:52 PM, jiyinyiyong  wrote:

> That would be nice. But it turns out to be "not defined" when I call `a =
> parse("1 + 1"); inverse(a)`. Is that a package or something?
>
> On Wed, Jun 24, 2015 at 4:44 AM Steven G. Johnson 
> wrote:
>
>>
>> On Sunday, June 21, 2015 at 1:31:22 AM UTC-4, Jiyin Yiyong wrote:
>>
>>> As described in http://blog.leahhanson.us/julia-introspects.html Julia
>>> parses code to AST.
>>> But is there function provided to generate code back?
>>>
>>
>> If you just want a parseable string from an AST, i.e. the "inverse" of
>> the ast=parse(string) function, you can just use string(ast).
>>
>


Re: [julia-users] Re: Any function to generate code in String from Expr?

2015-06-23 Thread jiyinyiyong
That would be nice. But it turns out to be "not defined" when I call `a =
parse("1 + 1"); inverse(a)`. Is that a package or something?

On Wed, Jun 24, 2015 at 4:44 AM Steven G. Johnson 
wrote:

>
> On Sunday, June 21, 2015 at 1:31:22 AM UTC-4, Jiyin Yiyong wrote:
>
>> As described in http://blog.leahhanson.us/julia-introspects.html Julia
>> parses code to AST.
>> But is there function provided to generate code back?
>>
>
> If you just want a parseable string from an AST, i.e. the "inverse" of the
> ast=parse(string) function, you can just use string(ast).
>


Re: [julia-users] Re: When are function arguments going to be inlined?

2015-06-23 Thread Colin Bowers
Yes, that is pretty much how I would do it, although, as I said in my
previous post, I would set `UtilityFunction` to an abstract type, and then
define my actual utility function immutable, say `MyCustomUtilityFunc`, as
a subtype of `UtilityFunction`. That way you can easily add different types
of utility functions later without having to change your existing code. By
the way, just for the record, a fair test between the two approaches would
be as follows:

abstract UtilityFunction
immutable MyCustomUtilityFunction <: UtilityFunction
    sigmac::Float64
    sigmal::Float64
    psi::Float64
end
u4(sigmac, sigmal, psi, consump, labor) = consump.^(1-sigmac)/(1-sigmac) +
    psi*(1-labor).^(1-sigmal)/(1-sigmal)
u(UF::MyCustomUtilityFunction, consump, labor) =
    consump.^(1-UF.sigmac)/(1-UF.sigmac) +
    UF.psi*(1-labor).^(1-UF.sigmal)/(1-UF.sigmal)


function test1(sigmac, sigmal, psi)
    for i = 1:100
        u4(sigmac, sigmal, psi, 1.0 + 1/i, 0.5)
    end
end
function test2(UF::UtilityFunction)
    for i = 1:100
        u(UF, 1.0 + 1/i, 0.5)
    end
end

UF = MyCustomUtilityFunction(4, 2, 1)
@time test1(4.0, 2.0, 1.0)
@time test2(UF)


On my machine that returns:

elapsed time: 0.090409383 seconds (80 bytes allocated)
elapsed time: 0.091065473 seconds (80 bytes allocated)

i.e., no significant performance difference.

On 24 June 2015 at 00:56, Andrew  wrote:

> Thanks, this is all very useful. I think I am going to back away from
> using the @anon functions at the moment, so I'll postpone my idea to
> encapsulate the functions into a type. Instead, I will just pass a
> parameter type to an externally defined(not nested) function. I had thought
> this would be slow (see my question here
> https://groups.google.com/forum/#!topic/julia-users/6U-otLSx7B0 ), but I
> did a little testing.
>
> immutable UtilityFunction
> sigmac::Float64
> sigmal::Float64
> psi::Float64
> end
> function u(UF::UtilityFunction,consump,labor)
> sigmac = UF.sigmac
> sigmal = UF.sigmal
> psi = UF.psi
> consump.^(1-sigmac)/(1-sigmac) + psi*(1-labor).^(1-sigmal)/(1-sigmal)
> end
> function u4(consump,labor)
> consump.^(1-4)/(1-4) + 1*(1-labor).^(1-2)/(1-2)
> end
>
> function test1(UF)
> for i = 1:100
> u4(1. + 1/i, .5)
> end
> end
> function test2(UF)
> for i = 1:100
> u(UF,1. + 1/i ,.5)
> end
> end
> UF = UtilityFunction(4,2,1)
>
> @time test1(UF)
> @time test2(UF)
>
> elapsed time: 0.068562617 seconds (80 bytes allocated)
> elapsed time: 0.139422608 seconds (80 bytes allocated)
>
>
>  So, even versus the extreme case where I built the constants into the
> function, the slowdown is not huge. I assume @anon would have similar
> performance to the constants-built-in case, which is nice. However, I want
> to be able to share my Julia code with others who aren't very experienced
> with the language, so I'd be uncomfortable asking them to understand the
> workings of FastAnonymous. It's useful to know about in case I need the
> speedup in my own personal code though.
>
>
> On Tuesday, June 23, 2015 at 8:51:25 AM UTC-4, colint...@gmail.com wrote:
>>
>> Yes, this proves to be an issue for me sometimes too. I asked a
>> StackOverflow question on this topic a few months ago and got a very
>> interesting response, as well as some interesting links. See here:
>>
>>
>> http://stackoverflow.com/questions/28356437/julia-compiler-does-not-appear-to-optimize-when-a-function-is-passed-a-function
>>
>> As a general rule, if the function you are passing round is very simple
>> and gets called a lot, then you will really notice the performance
>> overhead. In other cases where the function is more complicated, or is not
>> called that often, the overhead will be barely measurable.
>>
>> If the number of functions that you want to pass around is not that
>> large, one way around this is to use types and multiple dispatch instead of
>> functions, eg
>>
>> abstract UtilityFunctions
>> type QuadraticUtility <: UtilityFunctions
>> a::Float64
>> b::Float64
>> c::Float64
>> end
>> evaluate(x::Number, f::QuadraticUtility) = f.a*x^2 + f.b*x + f.c
>>
>> Now your function would be something like:
>>
>> function solveModel(f::UtilityFunctions, ...)
>>
>> and you would call evaluate at the appropriate place in the function body
>> and multiple dispatch will take care of the rest. There is no performance
>> overhead with this approach.
>>
>> Of course, if you want to be able to just pass in any arbitrary function
>> that a user might think up, then this approach is not tenable.
>>
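(Editor's note for later readers: in Julia 1.x the abstract/type keywords in the quoted code become abstract type/struct. A runnable version of the same dispatch pattern:)

```julia
abstract type UtilityFunctions end

struct QuadraticUtility <: UtilityFunctions
    a::Float64
    b::Float64
    c::Float64
end

# Dispatch on the concrete utility type; each subtype gets its own method.
evaluate(x::Number, f::QuadraticUtility) = f.a*x^2 + f.b*x + f.c

q = QuadraticUtility(1.0, 0.0, -1.0)
evaluate(2.0, q)
```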
>> On Tuesday, 23 June 2015 01:07:25 UTC+10, Andrew wrote:
>>>
>>>
>>>
>>> I'm trying to write some abstract Julia code to solve a variety of
>>> economics models. Julia provides powerful abstraction tools which I think
>>> makes it very well-suited to this; however, I've read in several places
>>> that Julia doesn't yet know how to inline functions passed as arguments,
>>> hence code like
>>>
>>> function SolveModel(Utility::Function

[julia-users] Re: Any function to generate code in String from Expr?

2015-06-23 Thread Steven G. Johnson

On Sunday, June 21, 2015 at 1:31:22 AM UTC-4, Jiyin Yiyong wrote:
>
> As described in http://blog.leahhanson.us/julia-introspects.html Julia 
> parses code to AST.
> But is there function provided to generate code back?
>

If you just want a parseable string from an AST, i.e. the "inverse" of the 
ast=parse(string) function, you can just use string(ast).
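For example (in Julia 1.x, parsing code has moved to Meta.parse; string(ast) is unchanged):

```julia
# Round-trip: source string -> Expr -> source string.
ast = Meta.parse("1 + 1")
src = string(ast)        # back to a parseable string
```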


[julia-users] Re: Problems with NLopt

2015-06-23 Thread Steven G. Johnson


On Tuesday, June 23, 2015 at 2:06:29 PM UTC-4, Nils Gudat wrote:
>
> function objective(parameters::Array{Float64,1}, obs=observations, 
> wgt=weight)
>
 
As described in the NLopt documentation, this is the wrong form for your 
objective function.  NLopt expects a function like objective(params, grad), 
where it passes an empty array for grad when you use a derivative-free 
optimization algorithm (like ESCH) or in general when the gradient is not 
needed.

So, NLopt is calling your objective function and passing an empty array for 
your second argument (obs), and apparently your function always returns 0.0 
when the second argument is empty.

(You can also try just inserting a println statement into your objective 
function to see what arguments are being passed.)

Just add a second "grad" argument which is ignored (or throw an exception 
if it is nonempty) and you should be okay, I'm guessing.
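A sketch of the expected signature (the body below is a stand-in objective, not the poster's minimum-distance function):

```julia
# NLopt calls objective(x, grad); grad is empty for derivative-free
# algorithms such as ESCH and can simply be ignored.
function objective(x::Vector, grad::Vector)
    if !isempty(grad)
        error("gradient requested but not implemented")
    end
    return sum(abs2, x .- 1.0)   # stand-in: squared distance from all-ones
end

objective([1.0, 2.0], Float64[])   # what NLopt effectively does each iteration
```

Pass this to min_objective!(opt, objective) as before; extra data such as obs and wgt are better captured in a closure than added as positional arguments.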


Re: [julia-users] Escher demos

2015-06-23 Thread Leonardo
Thanks,
but also with your code the error persists:

`consume` has no method matching consume(::Function, ::Input{Int32})
 in anonymous at C:\Users\Leonardo\.julia\v0.3\Escher\src\cli\serve.jl:134
 in anonymous at C:\Users\Leonardo\.julia\v0.3\Mux\src\Mux.jl:15
 in anonymous at C:\Users\Leonardo\.julia\v0.3\Mux\src\Mux.jl:8
 in splitquery at C:\Users\Leonardo\.julia\v0.3\Mux\src\basics.jl:28
 in anonymous at C:\Users\Leonardo\.julia\v0.3\Mux\src\Mux.jl:8
 in wcatch at C:\Users\Leonardo\.julia\v0.3\Mux\src\websockets.jl:12
 in anonymous at C:\Users\Leonardo\.julia\v0.3\Mux\src\Mux.jl:8
 in todict at C:\Users\Leonardo\.julia\v0.3\Mux\src\basics.jl:21
 in anonymous at C:\Users\Leonardo\.julia\v0.3\Mux\src\Mux.jl:12 (repeats 2 
times)
 in anonymous at C:\Users\Leonardo\.julia\v0.3\Mux\src\Mux.jl:8
 in anonymous at C:\Users\Leonardo\.julia\v0.3\Mux\src\server.jl:36
 in handle at C:\Users\Leonardo\.julia\v0.3\WebSockets\src\WebSockets.jl:287
 in on_message_complete at C:\Users\Leonardo\.julia\v0.3\HttpServer\src\
HttpServer.jl:359
 in on_message_complete at C:\Users\Leonardo\.julia\v0.3\HttpServer\src\
RequestParser.jl:99
 in pointer at pointer.jl:29 (repeats 2 times)
(form.jl also doesn't work, even though I downloaded the latest version from 
GitHub about a month ago)

Leonardo

 

Il giorno martedì 23 giugno 2015 15:26:29 UTC+2, Shashi Gowda ha scritto:
>
> consume is essentially an alias to lift. Escher now lets you embed 
> interactive UIs in static surroundings. So your example can be simplified 
> to:
>
> using Color
>
> with_hue(hue, tile=size(4em, 4em, empty)) =
> fillcolor(HSV(hue, 0.6, 0.6), tile)
>
> function main(window)
> push!(window.assets, "widgets")
>
> xt = Input(0)
> 
> vbox(
> subscribe(slider(0:360), xt),
> consume(with_hue, xt)
> )
> 
> end
>
>
>
>
> On Tue, Jun 23, 2015 at 12:29 PM, Leonardo  > wrote:
>
>> Hello,
>> I'm a newbie in Julia but I'm very interested to use it in scientific / 
>> numeric application with some minimal User Interface, and I think that 
>> Escher is great for this scope, useful also for remote computation.
>>
>> Unfortunately I've some problem running simple demos (I use win32 Julia 
>> in Win7 with latest Firefox)
>>
>> Trying to understand use of buttons, I've found example *form.jl *included 
>> in Escher, but it display at the bottom of page Dict{Any,Any}() and 
>> button doesn't work.
>>
>> Moreover I've extracted from *Reactive Programming Guide* (*reactive.jl* 
>> doc included in Escher) a simple hue example (attached), but it doesn't work
>> in its original version because Julia cannot find consume():
>> using Escher
>> using Color
>>
>>
>> xt = Input(0)
>>
>> with_hue(hue, tile=size(4em, 4em, empty)) =
>> fillcolor(HSV(hue, 0.6, 0.6), tile)
>>
>> function main(window)
>> # Load HTML dependencies related to widgets
>> push!(window.assets, "widgets")
>> 
>> lift(xt) do x
>> slider_and_huebox(x) = vbox(
>> subscribe(slider(0:360), xt),
>> "The current hue is: $x",
>> with_hue(x)
>> )
>> 
>> consume(slider_and_huebox,xt)
>> end
>> end
>>
>> (I've found a consume() in Julia in file *task.jl*, but it has a 
>> completely different signature and use)
>>
>>
>> Maybe I made a trivial mistake, but I cannot find the right solution.
>> Someone can help me?
>>
>> Many thanks in advance
>>
>> Leonardo
>>
>>
>

[julia-users] Problems with NLopt

2015-06-23 Thread Nils Gudat
After using the Optim package for quite a while, I'm now dabbling with 
NLopt.jl for the first time to solve a bounded multivariate optimization 
problem. For some reason, I can't seem to get NLopt to do anything though, 
and while the problem is hard to recreate as a small working example, I 
thought I'd try my luck here with a bit of pseudo-code to see whether 
anyone has got an idea for what could be going on. The problem I'm trying 
to solve is a minimum distance estimator that tries to find 64 parameters 
to minimize the distance between values of a function of those parameters 
and some observed values. The idea is:

observations = (observed values, Array{Float64,2})
weight = (weight for each observation, Array{Float64,2})

function objective(parameters::Array{Float64,1}, obs=observations,
                   wgt=weight)
    values = f(parameters) # construct a matrix of the same size as observations
    diff = [(obs - values).*wgt][:]
    return [diff'*diff][1]
end

x_0 = (initial values, Vector{Float64})
lb = (some values, Vector{Float64})
ub = (some values, Vector{Float64})

opt = Opt(:GN_ESCH, length(x_0))
min_objective!(opt, objective)
upper_bounds!(opt, ub)
lower_bounds!(opt, lb)
(optf, optx, flag) = optimize(opt, x_0)

The result of this calculation is, for whichever method I specify, just the 
vector of initial values, x_0 (or a vector with values halfway between ub 
and lb). The problem seems to be that the value of the objective function 
is not correctly calculated in the optimization routine, as the returned 
function value is always 0.0. I get close to reasonable results when I use 
the Optim package and just call Optim.optimize(objective, x_0, iterations = 
5000), but I'd like to put bounds on the parameters and so a working NLopt 
version would be preferable.

The full problem is available in this repo 
.


Re: [julia-users] Re: Julia Summer of Code

2015-06-23 Thread Shashi Gowda
An update:

We have 2 more accepted JSoC projects, thanks to further funding from the
Moore Foundation and MIT.

*Kyunghun Kim* will be working on HPGPU
Programming for Julia (mentored by Tim Holy).
*Brian Cohen* will be working on
implementing a test suite for Escher.jl (mentored by me).


On Tue, Jun 23, 2015 at 5:24 AM, David Anthoff  wrote:

> Congratulations, looks like a great list!
>
>
>
> *From:* julia-users@googlegroups.com [mailto:julia-users@googlegroups.com]
> *On Behalf Of *Jiahao Chen
> *Sent:* Tuesday, June 9, 2015 11:15 PM
> *To:* julia-users@googlegroups.com
> *Subject:* [julia-users] Re: Julia Summer of Code
>
>
>
> I am pleased to announce the list of accepted participants and projects
> for the 2015 Julia Summer of Code:
>
>    - *Ambuj Agrawal*, Improving debug information generation in Julia
>    (mentor: Keno Fischer @Keno)
>    - *David Gold (@davidagold)*, Nullable arrays (mentor: John Myles
>    White @johnmyleswhite)
>    - *Jacob Quinn (@quinnj)*, Pipelines.jl: composable streams for data
>    transfer and processing (mentor: Viral B. Shah @ViralBShah)
>    - *Jarrett Revels (@jrevels)*, Automatic differentiation (mentors:
>    Miles Lubin @mlubin and Theodore Papamarkou @scidom)
>    - *Kenta Sato (@bicycle1885)*, Efficient data structures and
>    algorithms for sequence analysis in BioJulia (mentor: Daniel C. Jones
>    @dcjones)
>    - *Rohit Varkey Thankachan (@rohitvarkey)*, Compose3D.jl: declarative
>    3D graphics (mentors: Shashi Gowda @shashi and Simon Danisch
>    @SimonDanisch)
>    - *Simon Danisch (@SimonDanisch)*, GLVisualize.jl: OpenGL
>    visualization in Julia (mentor: Keno Fischer @Keno)
>
> Congratulations to the selected participants and a big thank you to all
> the mentors who agreed to donate their time toward improving Julia.
>
>
>
> Thanks also to the other committee members Alan Edelman, Keno Fischer,
> Miles Lubin, Shashi Gowda, Stefan Karpinski, and Viral Shah for their
> efforts in evaluating the many proposals received.
>


[julia-users] Re: WebGL with Julia?

2015-06-23 Thread Jason Grout
You might have a look at pythreejs if you're also interested in python 
solutions.  Pythreejs exposes much of the three.js webgl library as 
interactive IPython notebook widgets.  So basically, you get interactive 
three.js, controllable from python in the IPython notebook.

https://github.com/jasongrout/pythreejs

Thanks,

Jason


On Monday, June 22, 2015 at 3:47:20 PM UTC-4, Uwe Fechner wrote:
>
> Hello,
>
> are there any packages that support to create animated, rendered
> WebGL graphics?
>
> We are working on airborne wind energy (see: http://awec2015.tudelft.nl/ 
> ) and are looking for options to replace our
> current visualization tools, that are based on Python (cgkit) and Java.
>
> We would like to write some software, that allows students to 
> interactively control the flight of a kite.
>
> For Python there is pythonocc (see: 
> http://www.pythonocc.org/features_overview/experimental-webgl-renderer-towards-cad-in-a-browser/
>  
> ,  
> online demo: http://webgl.pythonocc.org ), but it allows only to create 
> output, but does not yet support interactive input.
>
> For Julia, there is Escher.jl, but it does not (yet) support WebGL.
>
> Any suggestions?
>
> Uwe Fechner, TU Delft
>


[julia-users] Re: cycle detection and cycle basis

2015-06-23 Thread Tony Kelman
Looks like cycle detection is part of https://github.com/JuliaLang/Graphs.jl


On Monday, June 22, 2015 at 10:11:35 AM UTC-4, Michela Di Lullo wrote:
>
> Hi, 
>
> does anyone know if there is any algorithm for cycle detection or cycle 
> basis computation (for directed graphs) in julia? 
>
> Thank you in advance
>
> Michela Di Lullo
>
> ___
> INVESTI SUL FUTURO, FAI CRESCERE L’UNIVERSITÀ:
>
> *DONA IL 5 PER MILLE ALLA SAPIENZA*
>
> CODICE FISCALE *80209930587*
>


Re: [julia-users] Can I build constants into a function? Also, functions contained in a type.

2015-06-23 Thread Andrew
That is really nice. I see the 0.4 version also avoids the special syntax, 
so you can just treat fun() like an ordinary function. That's good for 
readability, plus it means I can share the same code with people who don't 
understand FastAnonymous by just dropping the @anon marker and leaving 
everything else unchanged.

On Monday, June 22, 2015 at 11:38:44 PM UTC-4, Tim Holy wrote:
>
> The implementation of FastAnonymous for julia 0.4 is radically different 
> from 
> the implementation for julia 0.3---they are truly two different packages 
> that 
> happen to have the same name. The version for 0.4 gives you the ability to 
> modify/own the parameters. That design is simply not possible with 0.3. 
>
> --Tim 
>
> On Monday, June 22, 2015 05:56:42 PM Andrew wrote: 
> > I think part of the reason I wanted to do this is because I used to code 
> in 
> > Java, and in Java it's common to encapsulate variables and methods 
> within 
> > an object, and then have the object's methods reference itself. I guess 
> > there's nothing stopping me from doing similar here, I could write 
> > something like 
> > 
> > function u(UF::UtilityFunction,consump,labor) 
> > sigmac = UF.sigmac 
> > sigmal = UF.sigmal 
> > psi = UF.psi 
> > consump.^(1-sigmac)/(1-sigmac) + 
> psi*(1-labor).^(1-sigmal)/(1-sigmal) 
> > end 
> > 
> >  but this is sort of ugly, so I'd prefer to avoid explicitly passing any 
> > parameters, I also keep thinking that explicitly passing around 
> parameters 
> > should slow things down, so I've been reluctant to do it, but I just 
> tested 
> > it a bit. I haven't observed any performance loss. Perhaps I am 
> conditioned 
> > by MATLAB to avoid these things since it uses pass-by-value, so I 
> believe 
> > passing parameters makes copies there. Since Julia doesn't do this, 
> there 
> > shouldn't be a performance hit. 
>
>
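(Editor's note: the pass-by-sharing point above is easy to check — Julia never copies an array at a call boundary, so mutation inside the callee is visible to the caller. A small sketch:)

```julia
# Arguments are passed by sharing: scale! mutates the caller's array.
function scale!(v::Vector{Float64}, a::Float64)
    for i in eachindex(v)
        v[i] *= a
    end
    return v
end

x = [1.0, 2.0]
scale!(x, 2.0)   # x itself is modified; no copy was made
```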

[julia-users] Re: When are function arguments going to be inlined?

2015-06-23 Thread Andrew
Thanks, this is all very useful. I think I am going to back away from using 
the @anon functions at the moment, so I'll postpone my idea to encapsulate 
the functions into a type. Instead, I will just pass a parameter type to an 
externally defined(not nested) function. I had thought this would be slow 
(see my question here 
https://groups.google.com/forum/#!topic/julia-users/6U-otLSx7B0 ), but I 
did a little testing.

immutable UtilityFunction
    sigmac::Float64
    sigmal::Float64
    psi::Float64
end
function u(UF::UtilityFunction, consump, labor)
    sigmac = UF.sigmac
    sigmal = UF.sigmal
    psi = UF.psi
    consump.^(1-sigmac)/(1-sigmac) + psi*(1-labor).^(1-sigmal)/(1-sigmal)
end
function u4(consump, labor)
    consump.^(1-4)/(1-4) + 1*(1-labor).^(1-2)/(1-2)
end

function test1(UF)
    for i = 1:100
        u4(1. + 1/i, .5)
    end
end
function test2(UF)
    for i = 1:100
        u(UF, 1. + 1/i, .5)
    end
end
UF = UtilityFunction(4,2,1)

@time test1(UF)
@time test2(UF)

elapsed time: 0.068562617 seconds (80 bytes allocated)
elapsed time: 0.139422608 seconds (80 bytes allocated)


 So, even versus the extreme case where I built the constants into the 
function, the slowdown is not huge. I assume @anon would have similar 
performance to the constants-built-in case, which is nice. However, I want 
to be able to share my Julia code with others who aren't very experienced 
with the language, so I'd be uncomfortable asking them to understand the 
workings of FastAnonymous. It's useful to know about in case I need the 
speedup in my own personal code though.

On Tuesday, June 23, 2015 at 8:51:25 AM UTC-4, colint...@gmail.com wrote:
>
> Yes, this proves to be an issue for me sometimes too. I asked a 
> StackOverflow question on this topic a few months ago and got a very 
> interesting response, as well as some interesting links. See here:
>
>
> http://stackoverflow.com/questions/28356437/julia-compiler-does-not-appear-to-optimize-when-a-function-is-passed-a-function
>
> As a general rule, if the function you are passing round is very simple 
> and gets called a lot, then you will really notice the performance 
> overhead. In other cases where the function is more complicated, or is not 
> called that often, the overhead will be barely measurable.
>
> If the number of functions that you want to pass around is not that large, 
> one way around this is to use types and multiple dispatch instead of 
> functions, eg
>
> abstract UtilityFunctions
> type QuadraticUtility <: UtilityFunctions
> a::Float64
> b::Float64
> c::Float64
> end
> evaluate(x::Number, f::QuadraticUtility) = f.a*x^2 + f.b*x + f.c
>
> Now your function would be something like:
>
> function solveModel(f::UtilityFunctions, ...)
>
> and you would call evaluate at the appropriate place in the function body 
> and multiple dispatch will take care of the rest. There is no performance 
> overhead with this approach.
>
> Of course, if you want to be able to just pass in any arbitrary function 
> that a user might think up, then this approach is not tenable.
>
> On Tuesday, 23 June 2015 01:07:25 UTC+10, Andrew wrote:
>>
>>
>>
>> I'm trying to write some abstract Julia code to solve a variety of 
>> economics models. Julia provides powerful abstraction tools which I think 
>> makes it very well-suited to this; however, I've read in several places 
>> that Julia doesn't yet know how to inline functions passed as arguments, 
>> hence code like
>>
>> function SolveModel(Utility::Function, ProductionTechnology::Function
>> ,...)
>> ...
>>
>> will be slow. I performed this very simple test.
>>
>> function ftest1()
>> u(x) = log(x)
>> function hello(fun::Function)
>> for i = 1:100
>> fun(i.^(1/2))
>> end
>> end
>> end
>> 
>> function ftest2()
>> function hello()
>> for i = 1:100
>> log(i.^(1/2))
>> end
>> end
>> end
>>
>> @time ftest1()
>> @time ftest2()
>>
>> elapsed time: 6.065e-6 seconds (496 bytes allocated)
>> elapsed time: 3.784e-6 seconds (264 bytes allocated)
>>
>>
>>  The inlined version is about twice as fast, which isn't all that bad, 
>> although I'm not sure if it would be worse in a more complicated example. 
>> Perhaps I shouldn't worry about this, and should code how I want. I was 
>> wondering though, if anybody knows when this is going to change. I've read 
>> about functors, which I don't really understand, but it sounds like people 
>> are working on this problem.
>>
>
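(Editor's note for later readers: since Julia 0.5 every function has its own concrete type, so higher-order functions specialize on their function arguments and this overhead is largely gone. A sketch:)

```julia
# `apply100` compiles a specialized version per `fun`, allowing inlining.
function apply100(fun)
    s = 0.0
    for i in 1:100
        s += fun(i^(1/2))
    end
    return s
end

apply100(log)       # specialized for log
apply100(identity)  # separately specialized for identity
```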

Re: [julia-users] Current Performance w Trunk Compared to 0.3

2015-06-23 Thread Kristoffer Carlsson
git bisect points to https://github.com/JuliaLang/julia/commit/5cb2835 as 
the first bad commit.

The results are apparently not as bad as I posted above but it is at least 
an overall 30% performance hit.

On Tuesday, June 23, 2015 at 12:11:09 PM UTC+2, Kristoffer Carlsson wrote:
>
> I have found a quite severe performance hit in my KNN searches in 
> https://github.com/KristofferC/KDTrees.jl
>
Earlier results: knn/sec: 730922; now: knn/sec: 271581.
>
> I will bisect it later today and see if there are major performance hits 
> somewhere.
>
>
> On Tuesday, June 23, 2015 at 1:54:09 AM UTC+2, David Anthoff wrote:
>>
>> I also saw a huge performance drop for a pretty involved piece of code 
>> when 
>> I tried a pre-release version of 0.4 a couple of weeks ago, compared to 
>> running things on 0.3. 
>>
>> My plan was to wait for a feature freeze of 0.4 and then investigate and 
>> report these things, I don't have the bandwidth to track these things 
>> ongoing. Maybe that would be a good rule of thumb, to ask people to look 
>> for 
>> performance regressions with existing code once there is a feature freeze 
>> on 
>> 0.4? 
>>
>> Also, are there ongoing performance tests? Especially tests that don't 
>> micro-benchmark, but test runtimes of relatively complex pieces of code? 
>>
>> > -Original Message- 
>> > From: julia...@googlegroups.com [mailto:julia- 
>> > us...@googlegroups.com] On Behalf Of Tim Holy 
>> > Sent: Sunday, June 14, 2015 4:36 AM 
>> > To: julia...@googlegroups.com 
>> > Subject: Re: [julia-users] Current Performance w Trunk Compared to 0.3 
>> > 
>> > git bisect? 
>> > 
>> > Perhaps the leading candidate is 
>> > https://github.com/JuliaLang/julia/issues/11681 
>> > which may be fixed by 
>> > https://github.com/JuliaLang/julia/pull/11683 
>> > 
>> > --Tim 
>> > 
>> > On Sunday, June 14, 2015 02:58:19 AM Viral Shah wrote: 
>> > > FWIW, I have seen a 25% regression from 0.3 to 0.4 on a reasonably 
>> > > complex codebase, but haven't been able to isolate the offending 
>> code. 
>> > > GC time in the 0.4 run is significantly smaller than 0.3, which means 
>> > > that if you discount GC, the difference is more like 40%. I wonder if 

Re: [julia-users] Using composite types with many fields

2015-06-23 Thread Isaiah Norton
See also: https://github.com/JuliaGeometry

On Mon, Jun 22, 2015 at 7:36 PM, Stef Kynaston 
wrote:

> Many thanks for this Mauro - the mesh library may very well be helpful in
> the near future. As it stands I am replicating exactly some existing MATLAB
> code that my group has been working with; the hope is to provoke a move
> over to Julia by a straight speed-comparison! However our mesh construction
> is not particularly refined (geddit? ;)), so I will definitely be looking
> to use something more sophisticated ultimately.
>
> On Sunday, June 21, 2015 at 2:43:53 PM UTC+1, Mauro wrote:
>
>> Also, I've got an (unregistered) mesh library, LMesh.jl [2]; maybe that
>> could be of use to you as well?
>>
>> [1] https://github.com/mauro3/Parameters.jl
>> [2] https://bitbucket.org/maurow/lmesh.jl
>>
>>
>
>


Re: [julia-users] Re: WebGL with Julia?

2015-06-23 Thread Isaiah Norton
>
>  analyze where julia intermediate LLVM code could be forwarded to the JS
> backend that Emscripten enables.
>

https://github.com/JuliaLang/julia/issues/9430

On Tue, Jun 23, 2015 at 7:14 AM, Andreas Lobinger 
wrote:

> Hello colleague,
>
> On Monday, June 22, 2015 at 9:47:20 PM UTC+2, Uwe Fechner wrote:
>
>> Any suggestions?
>>
>
> Not really -> Start own development.
>
> One thing that may be far-fetched at the moment: your desired output is
> HTML5/JS/WebGL. Emscripten enables this as a compiler from C/C++ source,
> including OpenGL, to JS + WebGL (at least that's what their web site says). And this
> is done via LLVM, the same basis Julia uses.
>
> So a project might be to analyze where Julia's intermediate LLVM code could
> be forwarded to the JS backend that Emscripten enables.
>
>
>


Re: [julia-users] Escher demos

2015-06-23 Thread Shashi Gowda
consume is essentially an alias for lift. Escher now lets you embed
interactive UIs in static surroundings. So your example can be simplified
to:

using Color

with_hue(hue, tile=size(4em, 4em, empty)) =
fillcolor(HSV(hue, 0.6, 0.6), tile)

function main(window)
push!(window.assets, "widgets")

xt = Input(0)

vbox(
subscribe(slider(0:360), xt),
consume(with_hue, xt)
)

end




On Tue, Jun 23, 2015 at 12:29 PM, Leonardo  wrote:

> Hello,
> I'm a newbie in Julia but I'm very interested to use it in scientific /
> numeric application with some minimal User Interface, and I think that
> Escher is great for this scope, useful also for remote computation.
>
> Unfortunately I've some problem running simple demos (I use win32 Julia
> in Win7 with latest Firefox)
>
> Trying to understand use of buttons, I've found example *form.jl *included
> in Escher, but it display at the bottom of page Dict{Any,Any}() and
> button doesn't work.
>
> Moreover I've extracted from *Reactive Programming Guide* (*reactive.jl*
> doc included in Escher) a simple hue example (attached), but doesn't work
> in original version because julia cannot found consume()
> using Escher
> using Color
>
>
> xt = Input(0)
>
> with_hue(hue, tile=size(4em, 4em, empty)) =
> fillcolor(HSV(hue, 0.6, 0.6), tile)
>
> function main(window)
> # Load HTML dependencies related to widgets
> push!(window.assets, "widgets")
>
> lift(xt) do x
> slider_and_huebox(x) = vbox(
> subscribe(slider(0:360), xt),
> "The current hue is: $x",
> with_hue(x)
> )
>
> consume(slider_and_huebox,xt)
> end
> end
>
> (I found a consume() in Julia in the file *task.jl*, but it has a
> completely different signature and use.)
>
>
> Maybe I made a trivial mistake, but I cannot find the right solution.
> Can someone help me?
>
> Many thanks in advance
>
> Leonardo
>
>


Re: [julia-users] IJulia: Swap shift-enter for enter?

2015-06-23 Thread NotSoRecentConvert
Finally got around to giving it a shot. After a bit of searching it worked!

The config file was located in 
~/.config/sublime-text-3/Packages/IJulia/Default (Linux).sublime-keymap.

This is the original snippet.
{ "keys": ["shift+enter"], "command": "i_julia_enter", "args": {},
"context":
[
{ "key": "setting.julia", "operator": "equal", "operand": true 
},
{ "key": "auto_complete_visible", "operator": "equal", "operand"
: false }
]
},

{ "keys": ["enter"], "command": "i_julia_shift_enter", "args": {},
"context":
[
{ "key": "setting.julia", "operator": "equal", "operand": true 
},
{ "key": "auto_complete_visible", "operator": "equal", "operand"
: false }
]
},

This is the final version.
{ "keys": ["enter"], "command": "i_julia_enter", "args": {},
"context":
[
{ "key": "setting.julia", "operator": "equal", "operand": true 
},
{ "key": "auto_complete_visible", "operator": "equal", "operand"
: false }
]
},

I changed "shift+enter" to just "enter" as you said and then removed the 
original "shift+enter" chunk. This combination fully swapped the default 
shift+enter and enter combinations.

*enter* now executes.
*shift + enter* now just adds a new line.

Thanks!




[julia-users] Escher demos

2015-06-23 Thread Leonardo
Hello,
I'm a newbie in Julia, but I'm very interested in using it for scientific/
numeric applications with a minimal user interface, and I think that
Escher is great for this purpose, and useful for remote computation as well.

Unfortunately I have some problems running the simple demos (I use win32 Julia
on Win7 with the latest Firefox).

Trying to understand the use of buttons, I found the example *form.jl* included
in Escher, but it displays Dict{Any,Any}() at the bottom of the page and the button
doesn't work.

Moreover, I extracted a simple hue example (attached) from the *Reactive
Programming Guide* (the *reactive.jl* doc included in Escher), but it doesn't
work in its original version because Julia cannot find consume():
using Escher
using Color


xt = Input(0)

with_hue(hue, tile=size(4em, 4em, empty)) =
fillcolor(HSV(hue, 0.6, 0.6), tile)

function main(window)
# Load HTML dependencies related to widgets
push!(window.assets, "widgets")

lift(xt) do x
slider_and_huebox(x) = vbox(
subscribe(slider(0:360), xt),
"The current hue is: $x",
with_hue(x)
)

consume(slider_and_huebox,xt)
end
end

(I found a consume() in Julia in the file *task.jl*, but it has a completely
different signature and use.)


Maybe I made a trivial mistake, but I cannot find the right solution.
Can someone help me?

Many thanks in advance

Leonardo



hue_wrong.jl
Description: Binary data


hue_work.jl
Description: Binary data


[julia-users] Re: When are function arguments going to be inlined?

2015-06-23 Thread colintbowers
Yes, this proves to be an issue for me sometimes too. I asked a 
StackOverflow question on this topic a few months ago and got a very 
interesting response, as well as some interesting links. See here:

http://stackoverflow.com/questions/28356437/julia-compiler-does-not-appear-to-optimize-when-a-function-is-passed-a-function

As a general rule, if the function you are passing around is very simple and
gets called a lot, then you will really notice the performance overhead. In
other cases, where the function is more complicated or is not called that
often, the overhead will be barely measurable.

If the number of functions that you want to pass around is not that large,
one way around this is to use types and multiple dispatch instead of
functions, e.g.:

abstract UtilityFunctions
type QuadraticUtility <: UtilityFunctions
a::Float64
b::Float64
c::Float64
end
evaluate(x::Number, f::QuadraticUtility) = f.a*x^2 + f.b*x + f.c

Now your function would be something like:

function solveModel(f::UtilityFunctions, ...)

and you would call evaluate at the appropriate place in the function body 
and multiple dispatch will take care of the rest. There is no performance 
overhead with this approach.

Of course, if you want to be able to just pass in any arbitrary function 
that a user might think up, then this approach is not tenable.
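
For what it's worth, here is a self-contained usage sketch of that approach, in the
0.3/0.4-era syntax of this thread (the coefficient values and the variable name `u`
are just my own illustration): each new utility only needs its own type and
`evaluate` method, and the solver code stays unchanged.

```julia
# Dispatch-based workaround: represent the "function" as a concrete type
# so each call site specializes on it instead of calling an opaque Function.
abstract UtilityFunctions

type QuadraticUtility <: UtilityFunctions
    a::Float64
    b::Float64
    c::Float64
end

# Each concrete utility type gets its own `evaluate` method, which the
# compiler can specialize and inline at the call site.
evaluate(x::Number, f::QuadraticUtility) = f.a*x^2 + f.b*x + f.c

u = QuadraticUtility(1.0, -2.0, 1.0)   # represents x^2 - 2x + 1 = (x - 1)^2
evaluate(3.0, u)                        # (3 - 1)^2 = 4.0
```

The same pattern extends to any other utility: define a new subtype of
UtilityFunctions with its own `evaluate` method and pass it in.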

On Tuesday, 23 June 2015 01:07:25 UTC+10, Andrew wrote:
>
>
>
> I'm trying to write some abstract Julia code to solve a variety of 
> economics models. Julia provides powerful abstraction tools which I think 
> makes it very well-suited to this; however, I've read in several places 
> that Julia doesn't yet know how to inline functions passed as arguments, 
> hence code like
>
> function SolveModel(Utility::Function, ProductionTechnology::Function,...)
> ...
>
> will be slow. I performed this very simple test.
>
> function ftest1()
> u(x) = log(x)
> function hello(fun::Function)
> for i = 1:100
> fun(i.^(1/2))
> end
> end
> end
> 
> function ftest2()
> function hello()
> for i = 1:100
> log(i.^(1/2))
> end
> end
> end
>
> @time ftest1()
> @time ftest2()
>
> elapsed time: 6.065e-6 seconds (496 bytes allocated)
> elapsed time: 3.784e-6 seconds (264 bytes allocated)
>
>
>  The inlined version is about twice as fast, which isn't all that bad, 
> although I'm not sure if it would be worse in a more complicated example. 
> Perhaps I shouldn't worry about this, and should code how I want. I was 
> wondering though, if anybody knows when this is going to change. I've read 
> about functors, which I don't really understand, but it sounds like people 
> are working on this problem.
>


[julia-users] Re: Using Winston in Julia 0.3.9

2015-06-23 Thread Patrick Kofod Mogensen
It works on v0.4 here. What version of Winston are you using?

On Tuesday, June 23, 2015 at 12:24:18 PM UTC+2, Ferran Mazzanti wrote:
>
> Hi folks,
> I'm experiencing problems trying to use Winston under Julia 0.3.9 on my 
> Linux machine. I grabbed an example from the internet that
> should work, something like the code I paste at the end of this post. It is 
> similar to something simpler I tried to use, to no avail. The problem comes
> with the last statement, add(p,s,a,b,l): the REPL keeps 
> complaining that add does not exist :(
> Any way to sort out this problem? How come it was supposed 
> to exist, and now it does not anymore? 
> BTW, I've found very little documentation on using Winston under Julia :(
> Thanks for your help.
>
> p = FramedPlot(
>  aspect_ratio=1,
>  xrange=(0,100),
>  yrange=(0,100))
>
>  n = 21
>  x = linspace(0, 100, n)
>  yA = 40 .+ 10randn(n)
>  yB = x .+ 5randn(n)
>
>  a = Points(x, yA, kind="circle")
>  setattr(a, label="a points")
>
>  b = Points(x, yB)
>  setattr(b, label="b points")
>  style(b, kind="filled circle")
>
>  s = Slope(1, (0,0), kind="dotted")
>  setattr(s, label="slope")
>
>  l = Legend(.1, .9, {a,b,s})
>
>  add(p, s, a, b, l)
>
>  
>


[julia-users] Re: WebGL with Julia?

2015-06-23 Thread Andreas Lobinger
Hello colleague,

On Monday, June 22, 2015 at 9:47:20 PM UTC+2, Uwe Fechner wrote:

> Any suggestions?
>

Not really -> Start own development.

One thing that may be far-fetched at the moment: your desired output is 
HTML5/JS/WebGL. Emscripten enables this as a compiler from C/C++ source, 
including OpenGL, to JS + WebGL (at least that's what their web site says). And this 
is done via LLVM, the same basis Julia uses. 

So a project might be to analyze where Julia's intermediate LLVM code could 
be forwarded to the JS backend that Emscripten enables.




[julia-users] Re: Using Winston in Julia 0.3.9

2015-06-23 Thread Andreas Lobinger
Hello colleague,

I'm not sure, but you could try prefixing add (or other Winston calls) 
with Winston, i.e. Winston.add. The 'using'/'import' of modules can interfere 
with each other's exports, and add might be lost due to other imports 
before you import Winston. 






Re: [julia-users] Re: How to deploy Julia

2015-06-23 Thread René Donner
Cross-linking this to the corresponding issue: 
https://github.com/JuliaLang/julia/issues/11816


Am 23.06.2015 um 12:17 schrieb Vladislav Falfushinsky 
:

> Recently I've tried to deploy a Julia application, but I received an error.
> 
> The code is rather simple:
> 
> function main()
> println("Hello world!!!")
> println(1 + 1)
> end
> main()
> 
> I use the command:
> 
> julia julia-master/contrib/build_executable.jl testexec ./test2.jl --force
> 
> However, I receive an error:
> 
> LoadError(at "sysimg.jl" line 327: LoadError(at 
> "/opt/emgs/projects/tmp/julia/julia-master/base/userimg.jl" line 1: 
> LoadError(at "/opt/emgs/projects/tmp/julia/test2.jl" line 7: 
> UndefVarError(var=:STDOUT
> rec_backtrace at /opt/emgs/projects/tmp/julia/julia-master/src/task.c:649
> jl_throw at /opt/emgs/projects/tmp/julia/julia-master/src/task.c:809
> jl_undefined_var_error at 
> /opt/emgs/projects/tmp/julia/julia-master/src/builtins.c:124
> main at /opt/emgs/projects/tmp/julia/test2.jl:3
> jlcall_main_21949 at  (unknown line)
> jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1650
> jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:66
> eval at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:212
> jl_toplevel_eval_flex at 
> /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:517
> jl_parse_eval_all at 
> /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:567
> jl_load at /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:610
> include at boot.jl:254
> jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1625
> include_from_node1 at loading.jl:133
> jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1625
> jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:66
> eval at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:212
> jl_toplevel_eval_flex at 
> /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:517
> jl_parse_eval_all at 
> /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:567
> jl_load at /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:610
> include at boot.jl:254
> jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1625
> include_from_node1 at loading.jl:133
> jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1625
> jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:66
> eval at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:212
> eval_body at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:598
> jl_toplevel_eval_body at 
> /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:529
> jl_toplevel_eval_flex at 
> /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:511
> jl_parse_eval_all at 
> /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:567
> jl_load at /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:610
> unknown function (ip: 0x40249c)
> unknown function (ip: 0x4029a5)
> unknown function (ip: 0x402a5f)
> __libc_start_main at /lib64/libc.so.6 (unknown line)
> unknown function (ip: 0x401589)
> 
> ERROR: LoadError: failed process: 
> Process(`/opt/emgs/projects/tmp/julia/julia-master/usr/bin/julia -C native 
> --build /opt/emgs/projects/tmp/julia/julia-master/usr/lib/libtestexec -J 
> /opt/emgs/projects/tmp/julia/julia-master/usr/lib/inference.ji -f sysimg.jl`, 
> ProcessExited(1)) [1]
>  in run at ./process.jl:490
>  in anonymous at 
> /opt/emgs/projects/tmp/julia/julia-master/contrib/build_sysimg.jl:66
>  in cd at ./file.jl:22
>  in build_sysimg at 
> /opt/emgs/projects/tmp/julia/julia-master/contrib/build_sysimg.jl:27
>  in include at ./boot.jl:254
>  in include_from_node1 at loading.jl:133
>  in process_options at ./client.jl:304
>  in _start at ./client.jl:404
> while loading 
> /opt/emgs/projects/tmp/julia/julia-master/contrib/build_sysimg.jl, in 
> expression starting on line 176
> 
> ERROR: LoadError: failed process: 
> Process(`/opt/emgs/projects/tmp/julia/julia-master/usr/bin/julia 
> /opt/emgs/projects/tmp/julia/julia-master/contrib/build_sysimg.jl 
> /opt/emgs/projects/tmp/julia/julia-master/usr/lib/libtestexec native 
> /tmp/tmpyZPKz2/userimg.jl --force`, ProcessExited(1)) [1]
>  in run at ./process.jl:490
>  in build_executable at 
> /opt/emgs/projects/tmp/julia/julia-master/contrib/build_executable.jl:109
>  in build_executable at 
> /opt/emgs/projects/tmp/julia/julia-master/contrib/build_executable.jl:58
>  in include at ./boot.jl:254
>  in include_from_node1 at loading.jl:133
>  in process_options at ./client.jl:304
>  in _start at ./client.jl:404
> while loading 
> /opt/emgs/projects/tmp/julia/julia-master/contrib/build_executable.jl, in 
> expression starting on line 277
> 
> 
> On Sunday, June 14, 2015 at 4:17:03 PM UTC+2, Daniel Carrera wrote:
> I never cease to be impressed by Julia's speed. I just wrote a test program 
> in Julia and Fortran 90 (it computes the gravitational force between some 
> planets). To my surprise, the Julia version was 7%

[julia-users] Using Winston in Julia 0.3.9

2015-06-23 Thread Ferran Mazzanti
Hi folks,
I'm experiencing problems trying to use Winston under Julia 0.3.9 on my 
Linux machine. I grabbed an example from the internet that
should work, something like the code I paste at the end of this post. It is 
similar to something simpler I tried to use, to no avail. The problem comes
with the last statement, add(p,s,a,b,l): the REPL keeps 
complaining that add does not exist :(
Any way to sort out this problem? How come it was supposed 
to exist, and now it does not anymore? 
BTW, I've found very little documentation on using Winston under Julia :(
Thanks for your help.

p = FramedPlot(
 aspect_ratio=1,
 xrange=(0,100),
 yrange=(0,100))

 n = 21
 x = linspace(0, 100, n)
 yA = 40 .+ 10randn(n)
 yB = x .+ 5randn(n)

 a = Points(x, yA, kind="circle")
 setattr(a, label="a points")

 b = Points(x, yB)
 setattr(b, label="b points")
 style(b, kind="filled circle")

 s = Slope(1, (0,0), kind="dotted")
 setattr(s, label="slope")

 l = Legend(.1, .9, {a,b,s})

 add(p, s, a, b, l)

 


[julia-users] Re: How to deploy Julia

2015-06-23 Thread Vladislav Falfushinsky
Recently I've tried to deploy a Julia application, but I received an error.

The code is rather simple:

function main()
println("Hello world!!!")
println(1 + 1)
end
main()

I use the command:

julia julia-master/contrib/build_executable.jl testexec ./test2.jl --force

However, I receive an error:

LoadError(at "sysimg.jl" line 327: LoadError(at 
"/opt/emgs/projects/tmp/julia/julia-master/base/userimg.jl" line 1: 
LoadError(at "/opt/emgs/projects/tmp/julia/test2.jl" line 7: 
UndefVarError(var=:STDOUT
rec_backtrace at /opt/emgs/projects/tmp/julia/julia-master/src/task.c:649
jl_throw at /opt/emgs/projects/tmp/julia/julia-master/src/task.c:809
jl_undefined_var_error at 
/opt/emgs/projects/tmp/julia/julia-master/src/builtins.c:124
main at /opt/emgs/projects/tmp/julia/test2.jl:3
jlcall_main_21949 at  (unknown line)
jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1650
jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:66
eval at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:212
jl_toplevel_eval_flex at 
/opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:517
jl_parse_eval_all at 
/opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:567
jl_load at /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:610
include at boot.jl:254
jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1625
include_from_node1 at loading.jl:133
jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1625
jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:66
eval at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:212
jl_toplevel_eval_flex at 
/opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:517
jl_parse_eval_all at 
/opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:567
jl_load at /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:610
include at boot.jl:254
jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1625
include_from_node1 at loading.jl:133
jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/gf.c:1625
jl_apply at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:66
eval at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:212
eval_body at /opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:598
jl_toplevel_eval_body at 
/opt/emgs/projects/tmp/julia/julia-master/src/interpreter.c:529
jl_toplevel_eval_flex at 
/opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:511
jl_parse_eval_all at 
/opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:567
jl_load at /opt/emgs/projects/tmp/julia/julia-master/src/toplevel.c:610
unknown function (ip: 0x40249c)
unknown function (ip: 0x4029a5)
unknown function (ip: 0x402a5f)
__libc_start_main at /lib64/libc.so.6 (unknown line)
unknown function (ip: 0x401589)

ERROR: LoadError: failed process: 
Process(`/opt/emgs/projects/tmp/julia/julia-master/usr/bin/julia -C native 
--build /opt/emgs/projects/tmp/julia/julia-master/usr/lib/libtestexec -J 
/opt/emgs/projects/tmp/julia/julia-master/usr/lib/inference.ji -f sysimg.jl`, 
ProcessExited(1)) [1]
 in run at ./process.jl:490
 in anonymous at 
/opt/emgs/projects/tmp/julia/julia-master/contrib/build_sysimg.jl:66
 in cd at ./file.jl:22
 in build_sysimg at 
/opt/emgs/projects/tmp/julia/julia-master/contrib/build_sysimg.jl:27
 in include at ./boot.jl:254
 in include_from_node1 at loading.jl:133
 in process_options at ./client.jl:304
 in _start at ./client.jl:404
while loading 
/opt/emgs/projects/tmp/julia/julia-master/contrib/build_sysimg.jl, in 
expression starting on line 176

ERROR: LoadError: failed process: 
Process(`/opt/emgs/projects/tmp/julia/julia-master/usr/bin/julia 
/opt/emgs/projects/tmp/julia/julia-master/contrib/build_sysimg.jl 
/opt/emgs/projects/tmp/julia/julia-master/usr/lib/libtestexec native 
/tmp/tmpyZPKz2/userimg.jl --force`, ProcessExited(1)) [1]
 in run at ./process.jl:490
 in build_executable at 
/opt/emgs/projects/tmp/julia/julia-master/contrib/build_executable.jl:109
 in build_executable at 
/opt/emgs/projects/tmp/julia/julia-master/contrib/build_executable.jl:58
 in include at ./boot.jl:254
 in include_from_node1 at loading.jl:133
 in process_options at ./client.jl:304
 in _start at ./client.jl:404
while loading 
/opt/emgs/projects/tmp/julia/julia-master/contrib/build_executable.jl, in 
expression starting on line 277


On Sunday, June 14, 2015 at 4:17:03 PM UTC+2, Daniel Carrera wrote:
>
> I never cease to be impressed by Julia's speed. I just wrote a test 
> program in Julia and Fortran 90 (it computes the gravitational force 
> between some planets). To my surprise, the Julia version was 7% faster than 
> Fortran.
>
> So... apparently one can write N-body codes with Julia.
>
> Suppose I wrote a simulation in Julia. What would be the best way to 
> deploy it to a distant computer cluster where I do not have root access? 
> The cluster runs Linux. Can I get a single stand-alone Julia binary that I 
> can just push to the cluster so I can run Julia

Re: [julia-users] Current Performance w Trunk Compared to 0.3

2015-06-23 Thread Kristoffer Carlsson
I have found a quite severe performance hit in my KNN searches 
in https://github.com/KristofferC/KDTrees.jl

Earlier results: knn/sec: 730922; now: knn/sec: 271581.

I will bisect it later today and see if there are major performance hits 
somewhere.


On Tuesday, June 23, 2015 at 1:54:09 AM UTC+2, David Anthoff wrote:
>
> I also saw a huge performance drop for a pretty involved piece of code 
> when 
> I tried a pre-release version of 0.4 a couple of weeks ago, compared to 
> running things on 0.3. 
>
> My plan was to wait for a feature freeze of 0.4 and then investigate and 
> report these things, I don't have the bandwidth to track these things 
> ongoing. Maybe that would be a good rule of thumb, to ask people to look 
> for 
> performance regressions with existing code once there is a feature freeze 
> on 
> 0.4? 
>
> Also, are there ongoing performance tests? Especially tests that don't 
> micro-benchmark, but test runtimes of relatively complex pieces of code? 
>
> > -Original Message- 
> > From: julia...@googlegroups.com  [mailto:julia- 
>  
> > us...@googlegroups.com ] On Behalf Of Tim Holy 
> > Sent: Sunday, June 14, 2015 4:36 AM 
> > To: julia...@googlegroups.com  
> > Subject: Re: [julia-users] Current Performance w Trunk Compared to 0.3 
> > 
> > git bisect? 
> > 
> > Perhaps the leading candidate is 
> > https://github.com/JuliaLang/julia/issues/11681 
> > which may be fixed by 
> > https://github.com/JuliaLang/julia/pull/11683 
> > 
> > --Tim 
> > 
> > On Sunday, June 14, 2015 02:58:19 AM Viral Shah wrote: 
> > > FWIW, I have seen a 25% regression from 0.3 to 0.4 on a reasonably 
> > > complex codebase, but haven't been able to isolate the offending code. 
> > > GC time in the 0.4 run is significantly smaller than 0.3, which means 
> > > that if you discount GC, the difference is more like 40%. I wonder if 
> > > this is some weird interaction with the caching in the new GC, or it 
> > > is the quality of generated code. 
> > > 
> > > I didn't report it yet, since it wouldn't be useful without narrowing 
> > > down 
> > > - but since this thread came up, I at least thought I'd register my 
> > > observations. 
> > > 
> > > -viral 
> > > 
> > > On Friday, June 12, 2015 at 12:21:46 PM UTC-4, Tim Holy wrote: 
> > > > Just a reminder: if anyone still sees this kind of performance 
> > > > regression, please do provide some more detail, as it's impossible 
> to 
> fix 
> > without it. 
> > > > It's 
> > > > really as simple as this: 
> > > > 
> > > > run_my_workload()  # run once to force compilation @profile 
> > > > run_my_workload() using ProfileView 
> > > > ProfileView.view() 
> > > > 
> > > > and then hover over any big (especially, red) boxes along the top 
> row. 
> > > > Right- 
> > > > clicking will put details into the REPL command line---if the 
> > > > problematic 
> > > > line(s) are indeed in base julia, and you can copy/paste them into 
> > > > an email or issue report. You can also paste the output of 
> > > > Profile.print(), if more detail about the full backtrace seems 
> > > > useful (and if that output isn't too long). 
> > > > 
> > > > --Tim 
> > > > 
> > > > On Thursday, June 11, 2015 11:09:13 AM Sebastian Good wrote: 
> > > > > I've seen the same. Looked away for a few weeks, and my code got 
> > > > > ~5x slower. There's a lot going on so it's hard to say without 
> > > > > detailed testing. However this code was always very sensitive to 
> > > > > optimization to 
> > > > 
> > > > be 
> > > > 
> > > > > able to specialize code which read data of different types. I got 
> > > > 
> > > > massive 
> > > > 
> > > > > increases in memory allocations. I'll try to narrow it down, but 
> > > > > it 
> > > > 
> > > > seems 
> > > > 
> > > > > like perhaps something was done with optimization passes or type 
> > > > 
> > > > inference? 
> > > > 
> > > > > On Wednesday, June 10, 2015 at 9:31:59 AM UTC-4, Kevin Squire 
> wrote: 
> > > > > > Short answer: no, poor performance across the board is not a 
> > > > > > known 
> > > > 
> > > > issue. 
> > > > 
> > > > > > Just curious, do you see these timing issues locally as well? 
> > > > > > In 
> > > > 
> > > > other 
> > > > 
> > > > > > words, is it a problem with Julia, or a problem with Travis (the 
> > > > > > continuous integration framework)? 
> > > > > > 
> > > > > > It might be the case that some changes in v0.4 have (possibly 
> > > > > > inadvertantly) slowed down certain workflows compared with v0.3, 
> > > > 
> > > > whereas 
> > > > 
> > > > > > others are unchanged or even faster. 
> > > > > > 
> > > > > > Could you run profiling and see what parts of the code are the 
> > > > 
> > > > slowest, 
> > > > 
> > > > > > and then file issues for any slowdowns, with (preferably 
> > > > > > minimal) examples? 
> > > > > > 
> > > > > > Cheers, 
> > > > > > 
> > > > > >Kevin 
> > > > > > 
> > > > > > On Wed, Jun 10, 2015 at 9:10 AM, andrew cooke  > > > > > 
> > > > > > > wrote: 
> > > > > >> Is it the current p

[julia-users] Re: Redirecting output julia (log to file)

2015-06-23 Thread Avik Sengupta
From a batch file, you can redirect the output using ">". See
http://blog.crankybit.com/redirecting-output-to-a-file-in-windows-batch-scripts/
for some details. 

Alternatively, if you control how the output is generated, you can use the 
Logging.jl package to write it to a file in a reasonably flexible way. 

Regards
-
Avik
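
On the flushing concern raised in the question: a minimal sketch of a small
logging helper that flushes after every message, so progress shows up in the
log file while the program is still running (the file name and the message
strings are just illustrations):

```julia
# Open a log file and wrap println in a helper that flushes immediately;
# this makes progress visible in the file during a long-running computation
# instead of only after the stream is closed.
logio = open("run.log", "w")

function logmsg(msg)
    println(logio, msg)
    flush(logio)   # push the line to disk right away
end

logmsg("starting computation")
# ... long-running work goes here ...
logmsg("done")

close(logio)
```

This avoids sprinkling flush calls throughout the code: only the helper needs it.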

On Tuesday, 23 June 2015 07:55:54 UTC+1, bernhard wrote:
>
> Hi all
>
> I am invoking a Julia process from an external program (say a Java 
> application, a Windows batch file, or any other program). 
> Julia starts, does some calculations, and closes again. While Julia runs I 
> can of course see the REPL, which shows me some logging and evaluations.
>
> I now want to set up a web service (REST or similar) where people can 
> upload data, start a model, and download results. For this I want to 
> have a log file which shows the same or similar output that the REPL 
> (= Julia session) displays. I understand that the REPL does much more 
> than just display stdout and stderr (this is why I said similar before).
>
> Leah Hanson is describing 
>   (Base.|> 
> ) which works perfectly for me. But this only works if I use Julia to 
> start another Julia session. In my case I want to start Julia from windows 
> (a *.bat file) or any other software. How can I achieve the same result?
>
> I am aware of redirect_stdout and redirect_stderr. This does not work 
> properly for me, as the program might run for several minutes, and it seems 
> to me that output is only written after I flush or close the stream. But I 
> want to see the progress in the log file (just as the REPL does) while my 
> code runs. It would be tedious to add "flush"  everywhere in my code where 
> I have a print command.
>
> Is there a way to achieve this? 
>
> As mentioned above I can invoke a julia session which starts another julia 
> session with the actual code:
> (st,pr) = open(`$(juliaExecutable) $(juliaProgram) $(arguments)` |> 
> logFilename)
> Is there a way to achieve the same result without the intermediate julia 
> session?
>
> Thanks
> Bernhard
>


[julia-users] Re: Getting started with documentation using Docile and Lexicon in 0.3.x

2015-06-23 Thread NotSoRecentConvert
A recent update by Michael Hatherly to Docile fixed the problem. It works 
now.