RE: [julia-users] Re: Julia Summer of Code

2015-05-30 Thread Rohit Kashyap
It's OK, Sir. I am drafting a proposal and will share it with you for 
consideration and discussion before submission.

-Original Message-
From: "Jiahao Chen" 
Sent: 30-05-2015 12:13
To: "julia-users@googlegroups.com" 
Subject: Re: [julia-users] Re: Julia Summer of Code

Sorry, that should have been June 1.



On Sat, May 30, 2015, 11:52 Jiahao Chen  wrote:

Hi Rohit,


Please read the CFP and submit a proposal by the end of 1 May. The CFP contains 
sample code projects.


http://julialang.org/blog/2015/05/jsoc-cfp/

[julia-users] Re: Julia on Android (and/or the web) - a scientific calculator on steroids.. good for tablets

2015-05-30 Thread Simon Danisch
>But looking at other tools, it seems that the bar is not set very high :D
Just realized that this sounds quite arrogant without some context ;)
This applies only to fast and extensible libraries. You could not really 
extend HTML itself, could you?
Qt seems to be one of the libraries that is good at everything... but I 
think wrapping it for Julia would negate some of its appeal. 

And your other questions should probably be turned into a Julia issue, to 
collect the milestones and issues that need to be resolved in order to get 
Julia onto Android && iOS (&& Windows Phone).

On Thursday, 28 May 2015 16:17:08 UTC+2, Páll Haraldsson wrote:
>
>
> I've noticed: "I guess we can announce alpha support for arm in 0.4 as 
> well." (and the other thread on Julia on ARM).
>
> Now, Android runs on x86 (already covered, so if you have that kind of 
> device there is no need to wait for ARM support), ARM, and MIPS (I actually 
> do not know of a single device that uses it..).
>
>
> I would like to know the most promising way to support Android and..
>
> A. For Firefox OS and the web in general, and hybrid apps, compiling to 
> JavaScript (or Dart and then to JavaScript) would be a possibility, with 
> asm.js/Emscripten.
>
> B. Just making native Android apps is probably easier, assuming ARM CPU 
> support is solved. And iOS would be very similar.. But it would 
> not work for Firefox OS - not a priority for now, though the web in general 
> would be nice..
>
>
> B. seems more promising except for the tiny/non-existent MIPS "problem".. 
> It is also better long term, for full Android framework support and full Julia 
> support (concurrency/BLAS etc. that JavaScript would not handle).
>
>
> 1. Just getting Julia to work on Android is the first step. Just the REPL - 
> it wouldn't have to be the Juno IDE or GUI stuff.
>
> 2. You could do a lot with just the REPL and a real keyboard, or just an 
> alternative programmer's virtual keyboard.. However, graphing would be nice, 
> and what would be needed for that? What are the most promising GUI libraries 
> already supported by Julia (or not..)? Say Qt, supported by Julia and Android - 
> would it just work?
>
> 3. Long term, making apps, even standalone ones (Julia "supports" that), with 
> Julia. If GUIs work for graphing, is then anything really possible? I know 
> Android/Java has a huge framework. Google is already supporting Android 
> with Go (without any Java) as of version 1.4 and with Dart (for hybrid 
> apps). For Go they have a "framework problem" and are going to support games 
> first. Some people are sceptical about Julia and games because of GC (I'm 
> not so much). I note Go also has GC..
>
> JavaCall.jl only works with the JVM, not Dalvik or ART. Would it be best to just 
> use the native C support on Android, or somehow go through Go? Has anyone 
> already tried to call Go from Julia? Rust is possible, but doesn't have GC. 
> Go should be possible, just like Java, but would have similar problems..
>
> Do/could macros somehow help with supporting the full Android framework? 
> Julia already has "no overhead" calling - could you generate bindings 
> automatically from some metadata and/or on the fly?
>
>
> This could be a cool pet project - anyone else working along these lines?
>
> Any reason plan B couldn't succeed relatively quickly? There are some ways 
> to make apps *on* Android already - all crappy, I think; Julia wouldn't be..?
>
>
> Thanks in advance,
> -- 
> Palli.
>
>

Re: [julia-users] Re: Does union() imply worse performance?

2015-05-30 Thread Jiahao Chen
For this use case of optionally present data, Nullable would seem
appropriate (although this is 0.4-only).

http://julia.readthedocs.org/en/latest/manual/types/#nullable-types-representing-missing-values
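
For reference, a minimal sketch of the pattern (assuming Julia 0.4; the names 
are illustrative, not from Seth's code):

```julia
# Optionally present edge distances, without a Bool flag or a sentinel array.
dists = Nullable{Matrix{Float64}}()      # the "missing" case
# dists = Nullable(rand(3, 3))           # the "present" case

if isnull(dists)
    println("no edge distances supplied")
else
    println(sum(get(dists)))             # get() unwraps the stored value
end
```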

Thanks,

Jiahao Chen
Research Scientist
MIT CSAIL

On Sat, May 30, 2015 at 2:08 PM, Tobias Knopp 
wrote:

> There is one exception though, which is keyword arguments
>
>
> On Saturday, 30 May 2015 03:49:45 UTC+2, Steven G. Johnson wrote:
>>
>> *No!*  This is one of the most common misconceptions about Julia
>> programming.
>>
>> The type declarations in function arguments have *no impact* on
>> performance.  Zero.  Nada.  Zip.  You *don't have to declare a type at
>> all* in the function argument, and it *still* won't matter for
>> performance.
>>
>> The argument types are just a filter for when the function is applicable.
>>
>> The first time a function is called, a specialized version is compiled
>> for the types of the arguments that you pass it.  Subsequently, when you
>> call it with arguments of the same type, the specialized version is called.
>>
>> Note also that a default argument foo(x, y=false) is exactly equivalent
>> to defining
>>
>> foo(x,y) = ...
>> foo(x) = foo(x, false)
>>
>> So, if you call foo(x, [1,2,3]), it calls a version of foo(x,y)
>> specialized for an Array{Int} in the second argument.  The existence of a
>> version of foo specialized for a boolean y is irrelevant.
>>
>


Re: [julia-users] Re: Does union() imply worse performance?

2015-05-30 Thread Tobias Knopp
Yes, that's true, but that's the future and currently not in stable Julia.

On Saturday, 30 May 2015 12:39:10 UTC+2, Jiahao Chen wrote:
>
> For this use case of optionally present data, Nullable would seem 
> appropriate (although this is 0.4-only).
>
>
> http://julia.readthedocs.org/en/latest/manual/types/#nullable-types-representing-missing-values
>
> Thanks,
>
> Jiahao Chen
> Research Scientist
> MIT CSAIL
>
> On Sat, May 30, 2015 at 2:08 PM, Tobias Knopp  > wrote:
>
>> There is one exception though, which is keyword arguments
>>
>>
>> On Saturday, 30 May 2015 03:49:45 UTC+2, Steven G. Johnson wrote:
>>>
>>> *No!*  This is one of the most common misconceptions about Julia 
>>> programming.
>>>
>>> The type declarations in function arguments have *no impact* on 
>>> performance.  Zero.  Nada.  Zip.  You *don't have to declare a type at 
>>> all* in the function argument, and it *still* won't matter for 
>>> performance.
>>>
>>> The argument types are just a filter for when the function is applicable.
>>>
>>> The first time a function is called, a specialized version is compiled 
>>> for the types of the arguments that you pass it.  Subsequently, when you 
>>> call it with arguments of the same type, the specialized version is called.
>>>
>>> Note also that a default argument foo(x, y=false) is exactly equivalent 
>>> to defining
>>>
>>> foo(x,y) = ...
>>> foo(x) = foo(x, false)
>>>
>>> So, if you call foo(x, [1,2,3]), it calls a version of foo(x,y) 
>>> specialized for an Array{Int} in the second argument.  The existence of a 
>>> version of foo specialized for a boolean y is irrelevant.
>>>
>>
>

[julia-users] Re: Does union() imply worse performance?

2015-05-30 Thread David Gold
@Steven,

Would you help me to understand the difference between this case here and 
the case of DataArray{T}s -- which, by my understanding, are basically 
AbstractArray{Union{T, NaN}, 1}'s? My first thought was that taking a 
Union{Bool, AbstractArray{Float, 2}} argument would potentially interfere 
with the compiler's ability to perform type inference, similar to how 
looping through a DataArray can experience a cost from the compiler having 
to deal with possible NaNs. 

But what you're saying is that this does not apply here, since presumably 
the argument, whether it is a Bool or an AbstractArray, would be 
type-stable throughout the function's operations -- unlike the values 
contained in a DataArray. Would it be fair to say that dealing with Union{} 
types tends to be dangerous to performance mostly when they are looped over 
in some sort of container, since in that case it's not a matter of simply 
dispatching a specially compiled method on one of the conjunct types or the 
other?
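
To make the contrast I have in mind concrete, here is a hedged sketch (0.4 
syntax; the names are illustrative):

```julia
# A Union in a method signature: each call still gets a specialization
# for the concrete type that was actually passed.
describe(y::Union{Bool, Matrix{Float64}}) = isa(y, Bool) ? "flag" : "matrix"

describe(true)         # dispatches/compiles the Bool version
describe(rand(2, 2))   # dispatches/compiles the Matrix version

# An abstractly typed container: the element type is unknown inside the loop,
# so every iteration pays for run-time dispatch.
function total(xs::Vector{Any})
    s = 0.0
    for x in xs
        s += x
    end
    s
end
```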

On Friday, May 29, 2015 at 9:49:45 PM UTC-4, Steven G. Johnson wrote:
>
> *No!*  This is one of the most common misconceptions about Julia 
> programming.
>
> The type declarations in function arguments have *no impact* on 
> performance.  Zero.  Nada.  Zip.  You *don't have to declare a type at 
> all* in the function argument, and it *still* won't matter for 
> performance.
>
> The argument types are just a filter for when the function is applicable.
>
> The first time a function is called, a specialized version is compiled for 
> the types of the arguments that you pass it.  Subsequently, when you call 
> it with arguments of the same type, the specialized version is called.
>
> Note also that a default argument foo(x, y=false) is exactly equivalent to 
> defining
>
> foo(x,y) = ...
> foo(x) = foo(x, false)
>
> So, if you call foo(x, [1,2,3]), it calls a version of foo(x,y) 
> specialized for an Array{Int} in the second argument.  The existence of a 
> version of foo specialized for a boolean y is irrelevant.
>


Re: [julia-users] Macros generating Functions

2015-05-30 Thread David Gold
Something to note about Tom's method is that the function name must be 
passed to gf as a symbol, unlike in the case of a macro. However, in most 
cases this slight difference probably will not warrant a macro.

On Friday, May 29, 2015 at 8:58:56 PM UTC-4, Tom Lee wrote:
>
> You don't need to use a macro, a function can do this:
>
> julia> function gf(n::Symbol = gensym())
>            @eval function $(n)()
>                1
>            end
>        end
>
> I've also made the n argument optional, with gensym creating a unique name 
> by default - the newly defined function is returned by gf, so you don't 
> necessarily need to know its name. And of course if you give gf additional 
> arguments you can programmatically construct expressions based on those and 
> easily $ them into the @eval block. It's all very awesome.
>
> But the point is a macro probably isn't appropriate for this type of 
> thing. My understanding is that you should never use a macro if you can 
> easily write an equivalent function.
>
> Cheers,
>
> Tom
>
> On Thursday, 28 May 2015 23:26:39 UTC+10, Mauro wrote:
>>
>> Like this: 
>>
>> julia> macro gf(n)
>>            quote
>>                function $(esc(n))()
>>                    1
>>                end
>>            end
>>        end
>>
>> julia> @gf foo 
>> foo (generic function with 1 method) 
>>
>> julia> foo() 
>> 1 
>>
>> On Thu, 2015-05-28 at 12:06, Vasudha Khandelwal  
>> wrote: 
>> > Can I use macros to generate functions with names passed as arguments to 
>> > the macro?
>>
>>

Re: [julia-users] Re: Does union() imply worse performance?

2015-05-30 Thread Tim Holy
https://github.com/JuliaLang/Compat.jl#new-types

--Tim

On Saturday, May 30, 2015 04:50:07 AM Tobias Knopp wrote:
> Yes, that's true, but that's the future and currently not in stable Julia.
> 
> > On Saturday, 30 May 2015 12:39:10 UTC+2, Jiahao Chen wrote:
> > For this use case of optionally present data, Nullable would seem
> > appropriate (although this is 0.4-only).
> > 
> > 
> > http://julia.readthedocs.org/en/latest/manual/types/#nullable-types-repres
> > enting-missing-values
> > 
> > Thanks,
> > 
> > Jiahao Chen
> > Research Scientist
> > MIT CSAIL
> > 
> > On Sat, May 30, 2015 at 2:08 PM, Tobias Knopp  > 
> > > wrote:
> >> There is one exception though, which is keyword arguments
> >> 
> >> On Saturday, 30 May 2015 03:49:45 UTC+2, Steven G. Johnson wrote:
> >>> *No!*  This is one of the most common misconceptions about Julia
> >>> programming.
> >>> 
> >>> The type declarations in function arguments have *no impact* on
> >>> performance.  Zero.  Nada.  Zip.  You *don't have to declare a type at
> >>> all* in the function argument, and it *still* won't matter for
> >>> performance.
> >>> 
> >>> The argument types are just a filter for when the function is
> >>> applicable.
> >>> 
> >>> The first time a function is called, a specialized version is compiled
> >>> for the types of the arguments that you pass it.  Subsequently, when you
> >>> call it with arguments of the same type, the specialized version is
> >>> called.
> >>> 
> >>> Note also that a default argument foo(x, y=false) is exactly equivalent
> >>> to defining
> >>> 
> >>> foo(x,y) = ...
> >>> foo(x) = foo(x, false)
> >>> 
> >>> So, if you call foo(x, [1,2,3]), it calls a version of foo(x,y)
> >>> specialized for an Array{Int} in the second argument.  The existence of
> >>> a
> >>> version of foo specialized for a boolean y is irrelevant.



Re: [julia-users] Multiple lines statement?

2015-05-30 Thread Christoph Ortner
I'm surprised so few people are bothered by this. Maybe it is just sloppy 
coders like myself who worry about it ;).

Christoph


On Friday, 29 May 2015 00:04:46 UTC+1, Yichao Yu wrote:
>
> Sorry. Somehow the gmail hotkey got messed up... 
>
> On Thu, May 28, 2015 at 6:08 PM, Christoph Ortner 
> > wrote: 
> > 
> > Is there any chance for a debate whether or not to introduce a symbol 
> for 
> > line-continuation? It could be optional. 
>
> I would +1 on this. 
>
> We can probably live without it but from time to time I find myself 
> looking for it. 
>
> > The reason I am asking is that I just wasted a day looking for a bug 
> that 
> > was caused by an equivalent situation to the example below. 
> > 
> > Christoph 
> > 
> > On Sunday, 30 November 2014 11:55:20 UTC, Christoph Ortner wrote: 
> >> 
> >> I think that the standard in mathematical typesetting is to write 
> >> 2 
> >>  + 3 
> >> rather than 
> >>2 + 
> >>   3 
> >> 
> >> so personally I find the Matlab syntax easier to read. One of the very 
> few 
> >> choices Julia made that  I am not so sure about. 
> >> 
> >> Christoph 
> >> 
> > 
>


Re: [julia-users] Multiple lines statement?

2015-05-30 Thread Tamas Papp
In general,

1. if my code stretches to multiple lines, that is usually a warning
sign that I may be doing something wrong. Expressions that are very long
are difficult to understand, and thus are a likely source of bugs.

2. if I review my code and find that I really need multiple lines,
ending with a binary operator or wrapping the whole thing (or the part
that would break) in ()'s usually helps.
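
For example (a minimal sketch; either form parses fine):

```julia
x = 1 + 2 +     # the line ends in an operator, so the parser keeps reading
    3

y = (1 + 2
     + 3)       # the parentheses keep the expression open until they close
```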

IMO further syntax for line continuations is not necessary, but of
course reasonable people can hold differing opinions on this.

Best,

Tamas

On Sat, May 30 2015, Christoph Ortner  wrote:

> I'm surprised so few people are bothered by this. Maybe it is just sloppy
> coders like myself who worry about it ;).
>
> Christoph
>
>
> On Friday, 29 May 2015 00:04:46 UTC+1, Yichao Yu wrote:
>>
>> Sorry. Somehow the gmail hotkey got messed up...
>>
>> On Thu, May 28, 2015 at 6:08 PM, Christoph Ortner
>> > wrote:
>> >
>> > Is there any chance for a debate whether or not to introduce a symbol
>> for
>> > line-continuation? It could be optional.
>>
>> I would +1 on this.
>>
>> We can probably live without it but from time to time I find myself
>> looking for it.
>>
>> > The reason I am asking is that I just wasted a day looking for a bug
>> that
>> > was caused by an equivalent situation to the example below.
>> >
>> > Christoph
>> >
>> > On Sunday, 30 November 2014 11:55:20 UTC, Christoph Ortner wrote:
>> >>
>> >> I think that the standard in mathematical typesetting is to write
>> >> 2
>> >>  + 3
>> >> rather than
>> >>2 +
>> >>   3
>> >>
>> >> so personally I find the Matlab syntax easier to read. One of the very
>> few
>> >> choices Julia made that  I am not so sure about.
>> >>
>> >> Christoph
>> >>
>> >
>>


Re: [julia-users] Multiple lines statement?

2015-05-30 Thread Tim Holy
I can't speak for anyone else, but my experience with Matlab's _obligatory_ 
line continuation characters makes me actively disinterested in them. Not sure 
how I would feel about an optional character.

--Tim

On Saturday, May 30, 2015 06:41:31 AM Christoph Ortner wrote:
> I'm surprised so few people are bothered by this. Maybe it is just sloppy
> coders like myself who worry about it ;).
> 
> Christoph
> 
> On Friday, 29 May 2015 00:04:46 UTC+1, Yichao Yu wrote:
> > Sorry. Somehow the gmail hotkey got messed up...
> > 
> > On Thu, May 28, 2015 at 6:08 PM, Christoph Ortner
> > 
> > > wrote:
> > > Is there any chance for a debate whether or not to introduce a symbol
> > 
> > for
> > 
> > > line-continuation? It could be optional.
> > 
> > I would +1 on this.
> > 
> > We can probably live without it but from time to time I find myself
> > looking for it.
> > 
> > > The reason I am asking is that I just wasted a day looking for a bug
> > 
> > that
> > 
> > > was caused by an equivalent situation to the example below.
> > > 
> > > Christoph
> > > 
> > > On Sunday, 30 November 2014 11:55:20 UTC, Christoph Ortner wrote:
> > >> I think that the standard in mathematical typesetting is to write
> > >> 
> > >> 2
> > >> 
> > >>  + 3
> > >> 
> > >> rather than
> > >> 
> > >>2 +
> > >>
> > >>   3
> > >> 
> > >> so personally I find the Matlab syntax easier to read. One of the very
> > 
> > few
> > 
> > >> choices Julia made that  I am not so sure about.
> > >> 
> > >> Christoph



Re: [julia-users] MATLAB MEX embedding signal handling segfault

2015-05-30 Thread Eric Davies
Well alright then! Today I learned something about C. Thanks :)

On Fri, May 29, 2015 at 6:25 PM, Jeff Bezanson 
wrote:

> This line will do it:
>
> jl_options.handle_signals = JL_OPTIONS_HANDLE_SIGNALS_OFF;
>
> jl_options is DLLEXPORTed, and both it and its type are in julia.h.
>
> I think this is a perfectly good API. True that in general functions
> are better than structs, but I don't see any benefit here.
>
> On Fri, May 29, 2015 at 5:29 PM, Isaiah Norton 
> wrote:
> > On Fri, May 29, 2015 at 3:49 PM, Eric Davies  wrote:
> >>
> >> It seems like it would but I can't figure out how to set any of those
> >> options from C. Does anyone here know how?
> >
> >
> > I don't think that is possible right now because the options struct is
> not
> > part of the external API (and, almost certainly should not be). The
> simplest
> > solution is to add an option to `jl_init_with_image` and modify
> jl_options
> > there. That plan doesn't scale very well, but I don't know how many more
> of
> > these options are both embedding-critical and imperative to set before
> Julia
> > is initialized. PR sent if you want to test:
> >
> > https://github.com/JuliaLang/julia/pull/11489
> >
> >>
> >>
> >>
> >> On Friday, 29 May 2015 12:31:20 UTC-5, Tim Holy wrote:
> >>>
> >>> Jeff just merged https://github.com/JuliaLang/julia/pull/11473, which
> >>> might
> >>> help?
> >>>
> >>> --Tim
> >>>
> >>> On Wednesday, March 04, 2015 02:51:19 PM Eric Davies wrote:
> >>> > It appears as if that did not solve the problem, though I verified
> that
> >>> > it
> >>> > did what was intended. This is unfortunate. I'll continue looking at
> >>> > this
> >>> > for a day or two but I'm not confident I can find a solution alone.
> >>> > Here is
> >>> > a minimal set of code to
> >>> > replicate: https://gist.github.com/iamed2/e883c6b0b8ff4220d946
> >>> >
> >>> > On Wednesday, 4 March 2015 06:06:19 UTC-6, Tim Holy wrote:
> >>> > > It seems possible we should add a second "boolean" argument to
> >>> > > _julia_init
> >>> > > (in
> >>> > > init.c) that controls whether the signal handlers get set. But as a
> >>> > > workaround:
> >>> > >
> >>> > >
> >>> > >
> http://stackoverflow.com/questions/9495113/how-to-get-the-handlers-name-ad
> >>> > > dress-for-some-signals-e-g-sigint-in-postgres
> >>> > >
> >>> > > You could at least test whether that solves the problem, and then
> if
> >>> > > so
> >>> > > perhaps it would be worth opening an issue in julia about whether
> we
> >>> > > need
> >>> > > a
> >>> > > better solution.
> >>> > >
> >>> > > BTW, I'm interested in this topic too; see also
> >>> > >
> https://groups.google.com/d/msg/julia-users/dP_J5KilsEs/4CIERQ14vdgJ.
> >>> > > We
> >>> > > should collaborate on a single solution (and it sounds like you're
> >>> > > farther
> >>> > > along and perhaps more invested). I'm happy to pitch in if code
> gets
> >>> > > posted
> >>> > > somewhere.
> >>> > >
> >>> > > --Tim
> >>> > >
> >>> > > On Tuesday, March 03, 2015 09:18:04 PM Eric Davies wrote:
> >>> > > > Hi all,
> >>> > > >
> >>> > > > I'm attempting to embed julia in a MATLAB MEX file. Everything is
> >>> > > > going
> >>> > > > great, but MATLAB will sometimes segfault after having run some
> >>> > > > code
> >>> > > > calling Julia.* I believe I have narrowed it down to this
> >>> > > > issue: http://ubuntuforums.org/showthread.php?t=2093057 .
> >>> > > > Basically, I
> >>> > > > believe Julia is registering a SIGSEGV (or maybe other signal?)
> >>> > > > handler
> >>> > > > that overwrites the default for the JVM set by MATLAB. after the
> >>> > > > Julia
> >>> > > > function is done, that memory is freed. Then a segfault (or maybe
> >>> > > > other
> >>> > > > signal?) happens in the JVM and it tries to call Julia's handler
> >>> > > > but
> >>> > > > segfaults (again) as it is no longer there.
> >>> > > >
> >>> > > > Can anyone help me find a workaround? Perhaps if there's a way to
> >>> > > > "deregister" the handler, or if someone knows a way to get the
> >>> > > > current
> >>> > > > handler (before calling into Julia) and then setting it back to
> >>> > > > that
> >>> > >
> >>> > > after
> >>> > >
> >>> > > > Julia is done.
> >>> > > >
> >>> > > > I've never dealt with signals in C before so I apologize if I'm
> >>> > >
> >>> > > describing
> >>> > >
> >>> > > > things incorrectly or missing something.
> >>> > > >
> >>> > > > Thanks,
> >>> > > > Eric
> >>> > > >
> >>> > > > *I am able to reliably reproduce this by running any MEX function
> >>> > >
> >>> > > linking
> >>> > >
> >>> > > > to Julia and calling jl_init, then calling `help clear` in
> MATLAB.
> >>> > > >
> >>> > > > P.S.: I'm on Mac OS X 10.10 with MATLAB R2012b and Julia
> >>> > >
> >>> > > v0.3.6/v0.4.0-dev
> >>>
> >
>


Re: [julia-users] Multiple lines statement?

2015-05-30 Thread Yichao Yu
On Sat, May 30, 2015 at 11:24 AM, Tim Holy  wrote:
> I can't speak for anyone else, but my experience with Matlab's _obligatory_
> line continuation characters makes me actively disinterested in them. Not sure
> how I would feel about an optional character.

The only reason I want line continuation instead of parentheses is
that emacs' julia-mode does not indent function/code blocks inside
parentheses.

OK. I guess that's a different bug and should be fixed in julia-mode
instead..

>
> --Tim
>
> On Saturday, May 30, 2015 06:41:31 AM Christoph Ortner wrote:
>> I'm surprised so few people are bothered by this. Maybe it is just sloppy
>> coders like myself who worry about it ;).
>>
>> Christoph
>>
>> On Friday, 29 May 2015 00:04:46 UTC+1, Yichao Yu wrote:
>> > Sorry. Somehow the gmail hotkey got messed up...
>> >
>> > On Thu, May 28, 2015 at 6:08 PM, Christoph Ortner
>> >
>> > > wrote:
>> > > Is there any chance for a debate whether or not to introduce a symbol
>> >
>> > for
>> >
>> > > line-continuation? It could be optional.
>> >
>> > I would +1 on this.
>> >
>> > We can probably live without it but from time to time I find myself
>> > looking for it.
>> >
>> > > The reason I am asking is that I just wasted a day looking for a bug
>> >
>> > that
>> >
>> > > was caused by an equivalent situation to the example below.
>> > >
>> > > Christoph
>> > >
>> > > On Sunday, 30 November 2014 11:55:20 UTC, Christoph Ortner wrote:
>> > >> I think that the standard in mathematical typesetting is to write
>> > >>
>> > >> 2
>> > >>
>> > >>  + 3
>> > >>
>> > >> rather than
>> > >>
>> > >>2 +
>> > >>
>> > >>   3
>> > >>
>> > >> so personally I find the Matlab syntax easier to read. One of the very
>> >
>> > few
>> >
>> > >> choices Julia made that  I am not so sure about.
>> > >>
>> > >> Christoph
>


Re: [julia-users] Multiple lines statement?

2015-05-30 Thread Gabriel Mihalache
Once you spend a few days tracking down a bug due to this, you never
forget. The idea would be to find a way to save people from this experience.

Some lines are naturally long because e.g. the equation is long or because
you prefer long, informative variable names. You can always use variables
for parts of the expression but then that just feels like working around
poor language features/design.


[julia-users] Re: Does union() imply worse performance?

2015-05-30 Thread John Myles White
David,

To clarify your understanding of what's wrong with DataArrays, check out 
the DataArray code for something like 
getindex(): 
https://github.com/JuliaStats/DataArrays.jl/blob/master/src/indexing.jl#L109

I don't have a full understanding of Julia's type inference system, but 
here's my best attempt to explain my current understanding of the system 
and how it affects Seth's original example.

Consider two simple functions, f and g, and their application inside a 
larger function, gf():

# Given pre-existing definitions such that:
#
# f(input::R) => output::S
# g(input::S) => output::T
#
# What can we infer about the following larger function?
function gf(x::Any)
return g(f(x))
end

The important questions to ask are about what we can infer at 
method-compile-time for gf(). Specifically, ask:

(1) Can we determine the type S given the type R, which is currently bound 
to the type of the specific value of x on which we called gf()? (Note that 
it was the act of calling gf(x) on a specific value that triggered the 
entire method-compilation process.)

(2) Can we determine that the type S is a specific concrete type? 
Concreteness matters, because we're going to have to think about how the 
output of f() affects the input of g(). In particular, we need to know 
whether we need to perform run-time dispatch inside of gf() or whether all 
dispatch inside of gf() can be determined statically given the type of 
gf()'s argument x.

(3) Assuming that we successfully determined a concrete type S given R, can 
we repeat the process for g() to yield a concrete type for T? If so, then 
we'll be able to infer, at least for one specific type R, the concrete 
output type of gf(x). If not, we'll have to give looser bounds on the 
concrete types that come out of gf() given an input of a specific value 
like our current x. That would be important if we were going to call gf() 
inside another function.
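
To make the distinction concrete, here is a hedged sketch (the names are 
illustrative and not from Seth's code):

```julia
# f1 is type-stable: its output type is Float64 for any Int input.
f1(x::Int) = x / 2

# f2 is not: the output type depends on the run-time *value* of x,
# so inference can only conclude Union{Int, Float64}.
f2(x::Int) = x > 0 ? x : x / 2

g(y) = y + 1

gf1(x::Int) = g(f1(x))   # concrete S: the call to g can be resolved statically
gf2(x::Int) = g(f2(x))   # abstract S: run-time dispatch inside gf2

# On 0.4, @code_warntype gf2(1) highlights the Union-typed intermediate.
```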

Hope that helps.

 -- John

On Saturday, May 30, 2015 at 4:51:09 AM UTC-7, David Gold wrote:
>
> @Steven,
>
> Would you help me to understand the difference between this case here and 
> the case of DataArray{T}s -- which, by my understanding, are basically 
> AbstractArray{Union{T, NaN}, 1}'s? My first thought was that taking a 
> Union{Bool, AbstractArray{Float, 2}} argument would potentially interfere 
> with the compiler's ability to perform type inference, similar to how 
> looping through a DataArray can experience a cost from the compiler having 
> to deal with possible NaNs. 
>
> But what you're saying is that this does not apply here, since presumably 
> the argument, whether it is a Bool or an AbstractArray, would be 
> type-stable throughout the function's operations -- unlike the values 
> contained in a DataArray. Would it be fair to say that dealing with Union{} 
> types tends to be dangerous to performance mostly when they are looped over 
> in some sort of container, since in that case it's not a matter of simply 
> dispatching a specially compiled method on one of the conjunct types or the 
> other?
>
> On Friday, May 29, 2015 at 9:49:45 PM UTC-4, Steven G. Johnson wrote:
>>
>> *No!*  This is one of the most common misconceptions about Julia 
>> programming.
>>
>> The type declarations in function arguments have *no impact* on 
>> performance.  Zero.  Nada.  Zip.  You *don't have to declare a type at 
>> all* in the function argument, and it *still* won't matter for 
>> performance.
>>
>> The argument types are just a filter for when the function is applicable.
>>
>> The first time a function is called, a specialized version is compiled 
>> for the types of the arguments that you pass it.  Subsequently, when you 
>> call it with arguments of the same type, the specialized version is called.
>>
>> Note also that a default argument foo(x, y=false) is exactly equivalent 
>> to defining
>>
>> foo(x,y) = ...
>> foo(x) = foo(x, false)
>>
>> So, if you call foo(x, [1,2,3]), it calls a version of foo(x,y) 
>> specialized for an Array{Int} in the second argument.  The existence of a 
>> version of foo specialized for a boolean y is irrelevant.
>>
>

[julia-users] Re: Does union() imply worse performance?

2015-05-30 Thread Seth
Steven,

Thanks. Is it then the recommended/usual practice to have one "main" 
function with every possible argument you might want, and then several 
methods that provide specific dispatch and pass just the arguments relevant 
to that method? That is,

function dijkstra_shortest_paths(graph, use_dists, edge_dists)
    # ... this is where the algorithm is implemented
end

dijkstra_shortest_paths(graph, edge_dists::AbstractArray{Float64,2}) =
    dijkstra_shortest_paths(graph, true, edge_dists)
dijkstra_shortest_paths(graph) =
    dijkstra_shortest_paths(graph, false, Array(Float64, 0, 0))

?

On Friday, May 29, 2015 at 6:49:45 PM UTC-7, Steven G. Johnson wrote:
>
> *No!*  This is one of the most common misconceptions about Julia 
> programming.
>
> The type declarations in function arguments have *no impact* on 
> performance.  Zero.  Nada.  Zip.  You *don't have to declare a type at 
> all* in the function argument, and it *still* won't matter for 
> performance.
>
> The argument types are just a filter for when the function is applicable.
>
> The first time a function is called, a specialized version is compiled for 
> the types of the arguments that you pass it.  Subsequently, when you call 
> it with arguments of the same type, the specialized version is called.
>
> Note also that a default argument foo(x, y=false) is exactly equivalent to 
> defining
>
> foo(x,y) = ...
> foo(x) = foo(x, false)
>
> So, if you call foo(x, [1,2,3]), it calls a version of foo(x,y) 
> specialized for an Array{Int} in the second argument.  The existence of a 
> version of foo specialized for a boolean y is irrelevant.
>


[julia-users] Segmentation fault when building Cairo

2015-05-30 Thread Nathan Baum
When trying to Pkg.add("Cairo"), the build process reliably fails - and 
kills the runtime - with a segmentation fault, every time.


julia> Pkg.add("Cairo")
INFO: Installing BinDeps v0.3.12
INFO: Installing Cairo v0.2.27
INFO: Installing Color v0.4.5
INFO: Installing Compat v0.4.4
INFO: Installing FixedPointNumbers v0.0.7
INFO: Installing Graphics v0.1.0
INFO: Installing SHA v0.0.4
INFO: Installing URIParser v0.0.5
INFO: Building Cairo

signal (11): Segmentation fault
unknown function (ip: -61501312)
unknown function (ip: -201944468)
unknown function (ip: -202401246)
Segmentation fault (core dumped)

$


This leaves the packages in an inconsistent state - further attempts to 
install Cairo report there's nothing to do, but Cairo can't actually be 
used[1]. Pkg.rm("Cairo") seems to clean it up, though.

I've tried this on both 0.3.8 and git master. The same thing happens with 
julia-debug.

I don't know how to find out where the segfault is actually happening. Any 
advice would be gratefully received.

[1]

julia> using Cairo
ERROR: LoadError: could not open file 
/home/nathan/.julia/v0.4/Cairo/src/../deps/deps.jl
while loading /home/nathan/.julia/v0.4/Cairo/src/Cairo.jl, in expression 
starting on line 5




Re: [julia-users] Multiple lines statement?

2015-05-30 Thread Tim Holy
I don't understand why you can't just use parentheses for this. A key 
advantage is that if you edit it to put the break at a different place, or 
decide to add/eliminate a break, the expression is still valid. This is not 
true in, say, Matlab, where a statement like
  x = 5 + ...3;
is invalid. You end up having to do a lot of editing when you only want to 
add/subtract/move one character (a carriage return), and for me at least, 
changing the position of the line break is something I do a lot when working 
on code that has long computations. If you use parentheses, you have none of 
these problems, nor will you create bugs.

Python, for example, recommends against their backslash line continuation 
character in their style guide (https://www.python.org/dev/peps/pep-0008/), 
even though it's available.

Best,
--Tim

On Saturday, May 30, 2015 12:21:16 PM Gabriel Mihalache wrote:
> Once you spend a few days tracking down a bug due to this, you never
> forget. The idea would be to find a way to save people from this experience.
> 
> Some lines are naturally long because e.g. the equation is long or because
> you prefer long, informative variable names. You can always use variables
> for parts of the expression but then that just feels like working around
> poor language features/design.



Re: [julia-users] Segmentation fault when building Cairo

2015-05-30 Thread Tim Holy
Did this only start today? I just tagged a new Cairo release; you could try 
pinning an older one.

Otherwise, follow the usual issue-filing procedures. Useful info is here:
http://docs.julialang.org/en/latest/devdocs/backtraces/
no matter whether you decide to just report an issue or want to try to dig 
into this yourself.

Best,
--Tim

On Saturday, May 30, 2015 10:00:40 AM Nathan Baum wrote:
> When trying to Pkg.add("Cairo"), the build process reliably fails - and
> kills the runtime - with a segmentation fault, every time.
> 
> 
> julia> Pkg.add("Cairo")
> INFO: Installing BinDeps v0.3.12
> INFO: Installing Cairo v0.2.27
> INFO: Installing Color v0.4.5
> INFO: Installing Compat v0.4.4
> INFO: Installing FixedPointNumbers v0.0.7
> INFO: Installing Graphics v0.1.0
> INFO: Installing SHA v0.0.4
> INFO: Installing URIParser v0.0.5
> INFO: Building Cairo
> 
> signal (11): Segmentation fault
> unknown function (ip: -61501312)
> unknown function (ip: -201944468)
> unknown function (ip: -202401246)
> Segmentation fault (core dumped)
> 
> $
> 
> 
> This leaves the packages in an inconsistent state - further attempts to
> install Cairo report there's nothing to do, but Cairo can't actually be
> used[1]. Pkg.rm("Cairo") seems to clean it up, though.
> 
> I've tried this on both 0.3.8 and git master. The same thing happens with
> julia-debug.
> 
> I don't know how to find out where the segfault is actually happening. Any
> advice would be gratefully received.
> 
> [1]
> 
> julia> using Cairo
> ERROR: LoadError: could not open file
> /home/nathan/.julia/v0.4/Cairo/src/../deps/deps.jl
> while loading /home/nathan/.julia/v0.4/Cairo/src/Cairo.jl, in expression
> starting on line 5



Re: [julia-users] Re: julia on arm - some more progress

2015-05-30 Thread Viral Shah
Could you guys try with the latest master, with a fresh clone? I am no 
longer passing any flags to LLVM and also using LLVM 3.6.1.

-viral

On Saturday, May 30, 2015 at 7:18:41 AM UTC+5:30, Viral Shah wrote:
>
> We need to figure out the magic they use to build those binaries. What if 
> you remove all LLVM flags in the Julia build in the arm section of make.inc 
> and let LLVM do its thing?
>
> -viral
> On 29 May 2015 9:27 pm, "Daan Huybrechs"  wrote:
>
>> On the other hand, building with the LLVM binaries from llvm.org does 
>> still work on a Pi 2 to get a working REPL -  I just tested that with 
>> gcc-4.9 and LLVM 3.6.1 today.
>>
>> Daan
>>
>

Re: [julia-users] Re: julia on arm - some more progress

2015-05-30 Thread Seth
On it. make distcleanall; git pull; make?

On Saturday, May 30, 2015 at 10:23:05 AM UTC-7, Viral Shah wrote:
>
> Could you guys try with the latest master, with a fresh clone? I am no 
> longer passing any flags to LLVM and also using LLVM 3.6.1.
>
> -viral
>
> On Saturday, May 30, 2015 at 7:18:41 AM UTC+5:30, Viral Shah wrote:
>>
>> We need to figure out the magic they use to build those binaries. What if 
>> you remove all LLVM flags in the Julia build in the arm section of make.inc 
>> and let LLVM do its thing?
>>
>> -viral
>> On 29 May 2015 9:27 pm, "Daan Huybrechs" > > wrote:
>>
>>> On the other hand, building with the LLVM binaries from llvm.org does 
>>> still work on a Pi 2 to get a working REPL -  I just tested that with 
>>> gcc-4.9 and LLVM 3.6.1 today.
>>>
>>> Daan
>>>
>>

Re: [julia-users] Re: julia on arm - some more progress

2015-05-30 Thread Seth
Oh, also export JULIA_CPU_ARCH=arm1176jzf-s. Build running now - will 
report back in about 12 hours :)

On Saturday, May 30, 2015 at 10:24:15 AM UTC-7, Seth wrote:
>
> On it. make distcleanall; git pull; make?
>
> On Saturday, May 30, 2015 at 10:23:05 AM UTC-7, Viral Shah wrote:
>>
>> Could you guys try with the latest master, with a fresh clone? I am no 
>> longer passing any flags to LLVM and also using LLVM 3.6.1.
>>
>> -viral
>>
>> On Saturday, May 30, 2015 at 7:18:41 AM UTC+5:30, Viral Shah wrote:
>>>
>>> We need to figure out the magic they use to build those binaries. What 
>>> if you remove all LLVM flags in the Julia build in the arm section of 
>>> make.inc and let LLVM do its thing?
>>>
>>> -viral
>>> On 29 May 2015 9:27 pm, "Daan Huybrechs"  wrote:
>>>
 On the other hand, building with the LLVM binaries from llvm.org does 
 still work on a Pi 2 to get a working REPL -  I just tested that with 
 gcc-4.9 and LLVM 3.6.1 today.

 Daan

>>>

[julia-users] Re: matlab code to julia for image segmentation

2015-05-30 Thread fshussaini
Thanks for the reply. Got it working this way

 imv = shareproperties(imhsv, [HSV(imhsv[i,j].h > 280, imhsv[i,j].s > 0.5, 
imhsv[i,j].v > 0.14) for i = 1:size(imhsv,1),j = 1:size(imhsv,2)])

Regards

On Thursday, May 28, 2015 at 3:10:53 PM UTC+2, fshus...@gmail.com wrote:
>
> Hi,
>  
> I use Matlab for image segmentation. I am currently trying to do the same 
> in Julia. How can regionproperties be implemented in Julia to get the 
> bounding box?
>
> How can this be done in julia?
>
> BW = imgHSV(:,:,1) < 0.05 | imgHSV(:,:,1) > 0.15;
>
> Regards
>
>

Re: [julia-users] julia on arm - some more progress

2015-05-30 Thread Viral Shah
Yes, those steps are good. Could you leave out the JULIA_CPU_ARCH for now?

-viral



> On 30-May-2015, at 10:56 pm, Seth  wrote:
> 
> Oh, also export JULIA_CPU_ARCH=arm1176jzf-s. Build running now - will report 
> back in about 12 hours :)
> 
> On Saturday, May 30, 2015 at 10:24:15 AM UTC-7, Seth wrote:
> On it. make distcleanall; git pull; make?
> 
> On Saturday, May 30, 2015 at 10:23:05 AM UTC-7, Viral Shah wrote:
> Could you guys try with the latest master, with a fresh clone? I am no longer 
> passing any flags to LLVM and also using LLVM 3.6.1.
> 
> -viral
> 
> On Saturday, May 30, 2015 at 7:18:41 AM UTC+5:30, Viral Shah wrote:
> We need to figure out the magic they use to build those binaries. What if you 
> remove all LLVM flags in the Julia build in the arm section of make.inc and 
> let LLVM do its thing?
> 
> -viral
> 
> On 29 May 2015 9:27 pm, "Daan Huybrechs"  wrote:
> On the other hand, building with the LLVM binaries from llvm.org does still 
> work on a Pi 2 to get a working REPL -  I just tested that with gcc-4.9 and 
> LLVM 3.6.1 today.
> 
> Daan



Re: [julia-users] julia on arm - some more progress

2015-05-30 Thread Seth
Oh, ok - let me restart.


On Saturday, May 30, 2015 at 10:29:50 AM UTC-7, Viral Shah wrote:
>
> Yes, those steps are good. Could you leave out the JULIA_CPU_ARCH for now? 
>
> -viral 
>
>
>
> > On 30-May-2015, at 10:56 pm, Seth > 
> wrote: 
> > 
> > Oh, also export JULIA_CPU_ARCH=arm1176jzf-s. Build running now - will 
> report back in about 12 hours :) 
> > 
> > On Saturday, May 30, 2015 at 10:24:15 AM UTC-7, Seth wrote: 
> > On it. make distcleanall; git pull; make? 
> > 
> > On Saturday, May 30, 2015 at 10:23:05 AM UTC-7, Viral Shah wrote: 
> > Could you guys try with the latest master, with a fresh clone? I am no 
> longer passing any flags to LLVM and also using LLVM 3.6.1. 
> > 
> > -viral 
> > 
> > On Saturday, May 30, 2015 at 7:18:41 AM UTC+5:30, Viral Shah wrote: 
> > We need to figure out the magic they use to build those binaries. What 
> if you remove all LLVM flags in the Julia build in the arm section of 
> make.inc and let LLVM do its thing? 
> > 
> > -viral 
> > 
> > On 29 May 2015 9:27 pm, "Daan Huybrechs"  wrote: 
> > On the other hand, building with the LLVM binaries from llvm.org does 
> still work on a Pi 2 to get a working REPL -  I just tested that with 
> gcc-4.9 and LLVM 3.6.1 today. 
> > 
> > Daan 
>
>

[julia-users] Re: Does union() imply worse performance?

2015-05-30 Thread David Gold
Thank you for the link and the explanation, John -- it's definitely 
helpful. Is current work with Nullable and data structures available 
anywhere in JuliaStats, or is it being developed elsewhere?

On Saturday, May 30, 2015 at 12:23:09 PM UTC-4, John Myles White wrote:
>
> David,
>
> To clarify your understanding of what's wrong with DataArrays, check out 
> the DataArray code for something like getindex(): 
> https://github.com/JuliaStats/DataArrays.jl/blob/master/src/indexing.jl#L109 
> 
>
> I don't have a full understanding of Julia's type inference system, but 
> here's my best attempt to explain my current understanding of the system 
> and how it affects Seth's original example.
>
> Consider two simple functions, f and g, and their application inside a 
> larger function, gf():
>
> # Given pre-existing definitions such that:
> #
> # f(input::R) => output::S
> # g(input::S) => output::T
> #
> # What can we infer about the following larger function?
> function gf(x::Any)
> return g(f(x))
> end
>
> The important questions to ask are about what we can infer at 
> method-compile-time for gf(). Specifically, ask:
>
> (1) Can we determine the type S given the type R, which is currently bound 
> to the type of the specific value of x on which we called gf()? (Note that 
> it was the act of calling gf(x) on a specific value that triggered the 
> entire method-compilation process.)
>
> (2) Can we determine that the type S is a specific concrete type? 
> Concreteness matters, because we're going to have to think about how the 
> output of f() affects the input of g(). In particular, we need to know 
> whether we need to perform run-time dispatch inside of gf() or whether all 
> dispatch inside of gf() can be determined statically given the type of 
> gf()'s argument x.
>
> (3) Assuming that we successfully determined a concrete type S given R, 
> can we repeat the process for g() to yield a concrete type for T? If so, 
> then we'll be able to infer, at least for one specific type R, the concrete 
> output type of gf(x). If not, we'll have to give looser bounds on the 
> concrete types that come out of gf() given an input of a specific value 
> like our current x. That would be important if we were going to call gf() 
> inside another function.
>
> Hope that helps.
>
>  -- John
>
> On Saturday, May 30, 2015 at 4:51:09 AM UTC-7, David Gold wrote:
>>
>> @Steven,
>>
>> Would you help me to understand the difference between this case here and 
>> the case of DataArray{T}s -- which, by my understanding, are basically 
>> AbstractArray{Union{T, NaN}, 1}'s? My first thought was that taking a 
>> Union{Bool, AbstractArray{Float, 2}} argument would potentially interfere 
>> with the compiler's ability to perform type inference, similar to how 
>> looping through a DataArray can experience a cost from the compiler having 
>> to deal with possible NaNs. 
>>
>> But what you're saying is that this does not apply here, since presumably 
>> the argument, whether it is a Bool or an AbstractArray, would be 
>> type-stable throughout the functions operations -- unlike the values 
>> contained in a DataArray. Would it be fair to say that dealing with Union{} 
>> types tends to be dangerous to performance mostly when they are looped over 
>> in some sort of container, since in that case it's not a matter of simply 
>> dispatching a specially compiled method on one of the conjunct types or the 
>> other?
>>
>> On Friday, May 29, 2015 at 9:49:45 PM UTC-4, Steven G. Johnson wrote:
>>>
>>> *No!*  This is one of the most common misconceptions about Julia 
>>> programming.
>>>
>>> The type declarations in function arguments have *no impact* on 
>>> performance.  Zero.  Nada.  Zip.  You *don't have to declare a type at 
>>> all* in the function argument, and it *still* won't matter for 
>>> performance.
>>>
>>> The argument types are just a filter for when the function is applicable.
>>>
>>> The first time a function is called, a specialized version is compiled 
>>> for the types of the arguments that you pass it.  Subsequently, when you 
>>> call it with arguments of the same type, the specialized version is called.
>>>
>>> Note also that a default argument foo(x, y=false) is exactly equivalent 
>>> to defining
>>>
>>> foo(x,y) = ...
>>> foo(x) = foo(x, false)
>>>
>>> So, if you call foo(x, [1,2,3]), it calls a version of foo(x,y) 
>>> specialized for an Array{Int} in the second argument.  The existence of a 
>>> version of foo specialized for a boolean y is irrelevant.
>>>
>>

Re: [julia-users] Multiple lines statement?

2015-05-30 Thread Scott Jones
Ditto on what Tim said... I had to use C/C++ \ for macros for years... always 
accidentally getting deleted, and causing things to break.  In Julia you 
can use parentheses, or for macros quote ... end...
I happily haven't really seen a need for them in Julia (at least yet).

On Saturday, May 30, 2015 at 7:11:06 PM UTC+2, Tim Holy wrote:
>
> I don't understand why you can't just use parentheses for this. A key 
> advantage is that if you edit it to put the break at a different place, or 
> decide to add/eliminate a break, the expression is still valid. This is 
> not 
> true in, say, Matlab, where a statement like 
>   x = 5 + ...3; 
> is invalid. You end up having to do a lot of editing when you only want to 
> add/subtract/move one character (a carriage return), and for me at least, 
> changing the position of the line break is something I do a lot when 
> working 
> on code that has long computations. If you use parentheses, you have none 
> of 
> these problems, nor will you create bugs. 
>
> Python, for example, recommends against their backslash line continuation 
> character in their style guide (https://www.python.org/dev/peps/pep-0008/), 
>
> even though it's available. 
>
> Best, 
> --Tim 
>
> On Saturday, May 30, 2015 12:21:16 PM Gabriel Mihalache wrote: 
> > Once you spend a few days tracking down a bug due to this, you never 
> > forget. The idea would be to find a way to save people from this 
> experience. 
> > 
> > Some lines are naturally long because e.g. the equation is long or 
> because 
> > you prefer long, informative variable names. You can always use 
> variables 
> > for parts of the expression but then that just feels like working around 
> > poor language features/design. 
>
>

[julia-users] Please help debug this code segment - StatsBase / LoadError: BoundsError

2015-05-30 Thread Maco Anshu
Hi,

The following code segment seems to throw an error:

using StatsBase
# ... other lines
println(size(sqdists),m," ",sampleSize)
wv = WeightVec(sqdists)
println("done-3")
println(sample(1:m,wv,sampleSize))

(31670,)31670 50
ERROR: LoadError: BoundsError
 in getindex at range.jl:346
done-3
while loading E:\projs\HPCA\driver.jl, in expression starting on line 25

Also, it seems to work sometimes and sometimes not (it works on another, smaller 
array). Please let me know if you have encountered such an error before.

Thanks in advance



[julia-users] Re: example for ccall use and fortran

2015-05-30 Thread Andre Bieler
Ok, so I have a few simple examples working for ccalling Fortran functions 
and subroutines from Julia.
Maybe someone will find these examples useful when first looking into 
calling Fortran from Julia.

Compile the following Fortran module with

*gfortran simplemodule.f95 -o simplemodule.so -shared -fPIC*


```
!fileName = simplemodule.f95
module simpleModule

implicit none

contains
function foo(x)
  integer :: foo, x
  foo = x * 2
end function foo

subroutine bar(x, a, b)
  integer, intent(in) :: x
  integer, intent(out) :: a, b
  
  a = x + 3
  b = x * 3
end subroutine bar

subroutine keg(x, a, b)
  real*8, intent(in) :: x
  real*8, intent(out) :: a, b
  
  a = x + 3.0
  b = x * 3.0
end subroutine keg

subroutine ruf(x, y)
  real*8, dimension(3), intent(in) :: x
  real*8, dimension(3), intent(out) :: y
  integer :: i
  
  DO i = 1, 3
y(i) = 2*x(i)
  END DO
end subroutine ruf

end module simplemodule
```

then you can use the following julia script to call the functions from the 
shared library.

```
x1 = 7
a1 = [0]
b1 = [0]

r1 = ccall((:__simplemodule_MOD_foo, 
"/home/abieler/testPrograms/fortranShardLib/simplemodule.so"), Int64, 
(Ptr{Int64},), &x1)

println(r1)
println()

ccall((:__simplemodule_MOD_bar, 
"/home/abieler/testPrograms/fortranShardLib/simplemodule.so"), Void, 
  (Ptr{Int64}, Ptr{Int64}, Ptr{Int64}), &x1, a1, b1)
  
println(a1[1])
println(b1[1])
println()

x2 = 7.0
a2 = Cdouble[1.0]
b2 = Cdouble[1.0]

ccall((:__simplemodule_MOD_keg, 
"/home/abieler/testPrograms/fortranShardLib/simplemodule.so"), Void, 
  (Ptr{Float64}, Ptr{Float64}, Ptr{Float64}), &x2, a2, b2)
  
println(a2[1])
println(b2[1])
println()

x3 = [1.0, 2.0, 3.0]
y3 = [0.0, 0.0, 0.0]
ccall((:__simplemodule_MOD_ruf, 
"/home/abieler/testPrograms/fortranShardLib/simplemodule.so"), Void, 
  (Ptr{Float64}, Ptr{Float64}), x3, y3)
  
println(y3)
```




[julia-users] Re: Does union() imply worse performance?

2015-05-30 Thread John Myles White
The NullableArrays work is very far behind schedule. I developed RSI right 
after announcing the work on NullableArrays and am still recovering, which 
means that I can spend very little time working on Julia code these days.

I'll give you more details offline.

 -- John

On Saturday, May 30, 2015 at 10:48:10 AM UTC-7, David Gold wrote:
>
> Thank you for the link and the explanation, John -- it's definitely 
> helpful. Is current work with Nullable and data structures available 
> anywhere in JuliaStats, or is it being developed elsewhere?
>
> On Saturday, May 30, 2015 at 12:23:09 PM UTC-4, John Myles White wrote:
>>
>> David,
>>
>> To clarify your understanding of what's wrong with DataArrays, check out 
>> the DataArray code for something like getindex(): 
>> https://github.com/JuliaStats/DataArrays.jl/blob/master/src/indexing.jl#L109 
>> 
>>
>> I don't have a full understanding of Julia's type inference system, but 
>> here's my best attempt to explain my current understanding of the system 
>> and how it affects Seth's original example.
>>
>> Consider two simple functions, f and g, and their application inside a 
>> larger function, gf():
>>
>> # Given pre-existing definitions such that:
>> #
>> # f(input::R) => output::S
>> # g(input::S) => output::T
>> #
>> # What can we infer about the following larger function?
>> function gf(x::Any)
>> return g(f(x))
>> end
>>
>> The important questions to ask are about what we can infer at 
>> method-compile-time for gf(). Specifically, ask:
>>
>> (1) Can we determine the type S given the type R, which is currently 
>> bound to the type of the specific value of x on which we called gf()? (Note 
>> that it was the act of calling gf(x) on a specific value that triggered the 
>> entire method-compilation process.)
>>
>> (2) Can we determine that the type S is a specific concrete type? 
>> Concreteness matters, because we're going to have to think about how the 
>> output of f() affects the input of g(). In particular, we need to know 
>> whether we need to perform run-time dispatch inside of gf() or whether all 
>> dispatch inside of gf() can be determined statically given the type of 
>> gf()'s argument x.
>>
>> (3) Assuming that we successfully determined a concrete type S given R, 
>> can we repeat the process for g() to yield a concrete type for T? If so, 
>> then we'll be able to infer, at least for one specific type R, the concrete 
>> output type of gf(x). If not, we'll have to give looser bounds on the 
>> concrete types that come out of gf() given an input of a specific value 
>> like our current x. That would be important if we were going to call gf() 
>> inside another function.
>>
>> Hope that helps.
>>
>>  -- John
>>
>> On Saturday, May 30, 2015 at 4:51:09 AM UTC-7, David Gold wrote:
>>>
>>> @Steven,
>>>
>>> Would you help me to understand the difference between this case here 
>>> and the case of DataArray{T}s -- which, by my understanding, are basically 
>>> AbstractArray{Union{T, NaN}, 1}'s? My first thought was that taking a 
>>> Union{Bool, AbstractArray{Float, 2}} argument would potentially interfere 
>>> with the compiler's ability to perform type inference, similar to how 
>>> looping through a DataArray can experience a cost from the compiler having 
>>> to deal with possible NaNs. 
>>>
>>> But what you're saying is that this does not apply here, since 
>>> presumably the argument, whether it is a Bool or an AbstractArray, would be 
>>> type-stable throughout the function's operations -- unlike the values 
>>> contained in a DataArray. Would it be fair to say that dealing with Union{} 
>>> types tends to be dangerous to performance mostly when they are looped over 
>>> in some sort of container, since in that case it's not a matter of simply 
>>> dispatching a specially compiled method on one of the conjunct types or the 
>>> other?
>>>
>>> On Friday, May 29, 2015 at 9:49:45 PM UTC-4, Steven G. Johnson wrote:

 *No!*  This is one of the most common misconceptions about Julia 
 programming.

 The type declarations in function arguments have *no impact* on 
 performance.  Zero.  Nada.  Zip.  You *don't have to declare a type at 
 all* in the function argument, and it *still* won't matter for 
 performance.

 The argument types are just a filter for when the function is 
 applicable.

 The first time a function is called, a specialized version is compiled 
 for the types of the arguments that you pass it.  Subsequently, when you 
 call it with arguments of the same type, the specialized version is called.

 Note also that a default argument foo(x, y=false) is exactly equivalent 
 to defining

 foo(x,y) = ...
 foo(x) = foo(x, false)

 So, if you call foo(x, [1,2,3]), it calls a version of foo(x,y) 
 specialized for an Array{Int} in the second argument.

[julia-users] Julia v0.3.9

2015-05-30 Thread Tony Kelman
Hello all! The latest bugfix release of the 0.3.x Julia line has been 
released. Binaries are available from the usual place, and as is typical with 
such things, please report all issues to either the issue tracker, or email 
the julia-users list.

This is a bugfix release, though it includes a number of speedups for 
Pkg.update() and Pkg.publish() that have been working well on Julia master 
for a month or two. These are intended to be backwards-compatible (please 
let us know if not!), and strictly improve Pkg performance. To see all 
other bugs fixed since 0.3.8, see the commit log.

This is a recommended upgrade for anyone using any of the previous 0.3.x 
releases, and should act as a drop-in replacement for any of the 0.3.x 
line. We would like to get feedback if someone has a working program that 
breaks after this upgrade.

-Tony



Re: [julia-users] Segmentation fault when building Cairo

2015-05-30 Thread Nathan Baum
I was installing in a fresh home directory yesterday, so I would have been 
using v0.2.26.

To be sure, I've pinned that and retried, and it fails in the same way.

I've filed an issue.

Thanks for the documentation pointer. :)

I'll try to see where it's going wrong, although it looks like the stack 
has been mangled beyond usefulness by the time of the crash. :(
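For reference, a minimal sketch of pinning and freeing a package with the 0.3/0.4-era Pkg (the version number is the one mentioned above):

```julia
Pkg.pin("Cairo", v"0.2.26")   # hold Cairo at the older release
Pkg.build("Cairo")            # re-run the build against the pinned version
# ...and later, to go back to tracking the latest registered release:
Pkg.free("Cairo")
```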

On Saturday, 30 May 2015 18:14:42 UTC+1, Tim Holy wrote:
>
> Did this only start today? I just tagged a new Cairo release; you could 
> try 
> pinning an older one. 
>
> Otherwise, follow the usual issue-filing procedures. Useful info is here: 
> http://docs.julialang.org/en/latest/devdocs/backtraces/ 
> no matter whether you decide to just report an issue or want to try to dig 
> into this yourself. 
>
> Best, 
> --Tim 
>
> On Saturday, May 30, 2015 10:00:40 AM Nathan Baum wrote: 
> > When trying to Pkg.add("Cairo"), the build process reliably fails - and 
> > kills the runtime - with a segmentation fault, every time. 
> > 
> > 
> > julia> Pkg.add("Cairo") 
> > INFO: Installing BinDeps v0.3.12 
> > INFO: Installing Cairo v0.2.27 
> > INFO: Installing Color v0.4.5 
> > INFO: Installing Compat v0.4.4 
> > INFO: Installing FixedPointNumbers v0.0.7 
> > INFO: Installing Graphics v0.1.0 
> > INFO: Installing SHA v0.0.4 
> > INFO: Installing URIParser v0.0.5 
> > INFO: Building Cairo 
> > 
> > signal (11): Segmentation fault 
> > unknown function (ip: -61501312) 
> > unknown function (ip: -201944468) 
> > unknown function (ip: -202401246) 
> > Segmentation fault (core dumped) 
> > 
> > $ 
> > 
> > 
> > This leaves the packages in an inconsistent state - further attempts to 
> > install Cairo report there's nothing to do, but Cairo can't actually be 
> > used[1]. Pkg.rm("Cairo") seems to clean it up, though. 
> > 
> > I've tried this on both 0.3.8 and git master. The same thing happens 
> with 
> > julia-debug. 
> > 
> > I don't know how to find out where the segfault is actually happening. 
> Any 
> > advice would be gratefully received. 
> > 
> > [1] 
> > 
> > julia> using Cairo 
> > ERROR: LoadError: could not open file 
> > /home/nathan/.julia/v0.4/Cairo/src/../deps/deps.jl 
> > while loading /home/nathan/.julia/v0.4/Cairo/src/Cairo.jl, in expression 
> > starting on line 5 
>
>

[julia-users] Adding vectors in place using BLAS?

2015-05-30 Thread Gabriel Goh
Hey All,

I'm wondering if it's easy to do an in-place assignment, say

x[10:20] = x[10:20] + a*y

using the axpy! function. I want to avoid the use of any temporary variables 
if possible!

Thanks!
Gabe


Re: [julia-users] Adding vectors in place using BLAS?

2015-05-30 Thread Tim Holy
Not tested, but

xsub = sub(x, 10:20)
Base.LinAlg.axpy!(a, y, xsub)

should work just fine.

--Tim
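Spelled out as a runnable sketch with made-up values (sub is the 0.3/0.4-era API; later versions use view):

```julia
x = collect(1.0:30.0)
a = 2.0
y = ones(11)                    # length must match the range 10:20

xsub = sub(x, 10:20)            # a SubArray view into x, no copy made
Base.LinAlg.axpy!(a, y, xsub)   # computes xsub = a*y + xsub, so x[10:20] is updated in place
```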

On Saturday, May 30, 2015 02:35:01 PM Gabriel Goh wrote:
> Hey All,
> 
> I'm wondering if its easy to do an in place assignment, say
> 
> x[10:20] = x[10:20] + a*y
> 
> using the axpy! library. I want to avoid the use of any temporary variables
> if possible!
> 
> Thanks!
> Gabe



Re: [julia-users] Macros generating Functions

2015-05-30 Thread Jameson Nash
But "@eval" is still a macro, so it is even better to rewrite this without
that:
function getfn()
    return function(); 1; end
end
const n = getfn()

On Sat, May 30, 2015 at 2:30 PM David Gold  wrote:

> Something to note about Tom's method is that the function name must be
> passed to gf as a symbol, unlike in the case of a macro. However, in most
> cases this slight difference probably will not warrant a macro.
>
>
> On Friday, May 29, 2015 at 8:58:56 PM UTC-4, Tom Lee wrote:
>>
>> You don't need to use a macro, a function can do this:
>>
>> julia> function gf(n::Symbol = gensym())
>>@eval function $(n)()
>>1
>>end
>>end
>>
>> I've also made the n argument optional, with gensym creating a unique
>> name by default - the newly defined function is returned by gf, so you
>> don't necessarily need to know its name. And of course if you give gf
>> additional arguments you can programmatically construct expressions based
>> on those and easily $ them into the @eval block. It's all very awesome.
>>
>> But the point is a macro probably isn't appropriate for this type of
>> thing. My understanding is that you should never use a macro if you can
>> easily write an equivalent function.
>>
>> Cheers,
>>
>> Tom
>>
>> On Thursday, 28 May 2015 23:26:39 UTC+10, Mauro wrote:
>>>
>>> Like this:
>>>
>>> julia> macro gf(n)
>>>quote
>>>function $(esc(n))()
>>>1
>>>end
>>>end
>>>end
>>>
>>> julia> @gf foo
>>> foo (generic function with 1 method)
>>>
>>> julia> foo()
>>> 1
>>>
>>> On Thu, 2015-05-28 at 12:06, Vasudha Khandelwal 
>>> wrote:
>>> > Can I use macros to generate functions with names passed as argument
>>> to the
>>> > macro?
>>>
>>>
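A quick usage sketch of the gf function above (hypothetical name :baz; 0.3/0.4-era REPL):

```julia
julia> f = gf(:baz)    # @eval defines baz() at module scope and returns it
baz (generic function with 1 method)

julia> baz()
1

julia> f === baz       # gf also hands the new function back to the caller
true
```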


[julia-users] Re: Julia Summer of Code

2015-05-30 Thread Gurshabad Grover
I'd like to work on making an autoformat tool for Julia. It is a generic 
but experimental project, but I'm sure that if the tool is made it will be 
welcomed by the community. I'm familiar with the language and have a 
structured plan in mind; writing the proposal should not take much time if 
I am able to find a mentor soon. (I plan to use Go's autoformat tool 
as a blueprint for what the tool needs to take care of.)

Please let me know soon if anyone would like to mentor this project! 
Looking forward to working with you.

On Friday, May 15, 2015 at 11:27:24 PM UTC+5:30, Viral Shah wrote:
>
> Folks,
>
> The Moore Foundation is generously funding us to allow for 6-8 Julia 
> Summer of Code projects. Details will be published soon, but if you are 
> interested, please mark your calendars and plan your projects.
>
> -viral
>


Re: [julia-users] Macros generating Functions

2015-05-30 Thread David P. Sanders


On Sunday, May 31, 2015 at 0:37:45 (UTC+2), Jameson wrote:
>
> But "@eval" is still a macro, so it is even better to rewrite this without 
> that:
> function getfn()
> return function(); 1; end
> end
> const n = getfn()
>

This does not give quite the same answer, though, since the function does 
not have a name.
Is there a way to specify the name of a generated function like this?
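A hedged sketch of the difference (hypothetical names; 0.3/0.4-era syntax): binding an anonymous function to a name makes it callable under that name but it still prints as anonymous, whereas going through @eval with an interpolated symbol, as in the gf example above, defines a genuinely named generic function.

```julia
# Anonymous function bound to a const: callable via h, but still nameless.
const h = function(); 1; end
h()                       # => 1, yet h prints as "(anonymous function)"

# To attach a name chosen at run time, go through eval:
name = symbol("generated_fn")
@eval $(name)() = 1
generated_fn()            # => 1, and it prints as a named generic function
```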
 

>
> On Sat, May 30, 2015 at 2:30 PM David Gold  > wrote:
>
>> Something to note about Tom's method is that the name function must be 
>> passed to gf as a symbol, unlike in the case of a macro. However, in most 
>> cases this slight difference probably will not warrant a macro.
>>
>>
>> On Friday, May 29, 2015 at 8:58:56 PM UTC-4, Tom Lee wrote:
>>>
>>> You don't need to use a macro, a function can do this:
>>>
>>> julia> function gf(n::Symbol = gensym()) 
>>>@eval function $(n)() 
>>>1
>>>end 
>>>end
>>>
>>> I've also made the n argument optional, with gensym creating a unique 
>>> name by default - the newly defined function is returned by gf, so you 
>>> don't necessarily need to know its name. And of course if you give gf 
>>> additional arguments you can programatically construct expressions based 
>>> those and easily $ them into the @eval block. It's all very awesome.
>>>
>>> But the point is a macro probably isn't appropriate for this type of 
>>> thing. My understanding is that you should never use a macro if you can 
>>> easily write an equivalent function.
>>>
>>> Cheers,
>>>
>>> Tom
>>>
>>> On Thursday, 28 May 2015 23:26:39 UTC+10, Mauro wrote:

 Like this: 

 julia> macro gf(n) 
quote 
function $(esc(n))() 
1 
end 
end 
end 

 julia> @gf foo 
 foo (generic function with 1 method) 

 julia> foo() 
 1 

 On Thu, 2015-05-28 at 12:06, Vasudha Khandelwal  
 wrote: 
 > Can I use macros to generate functions with names passed as argument 
 to the 
 > macro? 



Re: [julia-users] Re: example for ccall use and fortran

2015-05-30 Thread Jiahao Chen
It would be great if you could clean up your example and add it to the
documentation.

Thanks,

Jiahao Chen
Research Scientist
MIT CSAIL

On Sun, May 31, 2015 at 3:17 AM, Andre Bieler 
wrote:

> Ok so I have a few simple examples working for ccalling fortran functions
> and subroutines from Julia.
> Maybe someone will find these examples useful when first looking into
> calling Fortran from Julia.
>
> compile the following fortran mod
> ```
> !fileName = simplemodule.f95
> module simpleModule
>
> implicit none
>
> contains
> function foo(x)
>   integer :: foo, x
>   foo = x * 2
> end function foo
>
> subroutine bar(x, a, b)
>   integer, intent(in) :: x
>   integer, intent(out) :: a, b
>
>   a = x + 3
>   b = x * 3
> end subroutine bar
>
> subroutine keg(x, a, b)
>   real*8, intent(in) :: x
>   real*8, intent(out) :: a, b
>
>   a = x + 3.0
>   b = x * 3.0
> end subroutine keg
>
> subroutine ruf(x, y)
>   real*8, dimension(3), intent(in) :: x
>   real*8, dimension(3), intent(out) :: y
>   integer :: i
>
>   DO i = 1, 3
> y(i) = 2*x(i)
>   END DO
> end subroutine ruf
>
> end module simplemodule
> ```
>
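To give a flavor of the calling side (this is a hedged sketch, not the original poster's code; it assumes the module is compiled with gfortran -shared -fPIC simplemodule.f95 -o simplemodule.so and gfortran's default name mangling, __simplemodule_MOD_<name>):

```julia
# Fortran passes arguments by reference, so scalars are wrapped in one-element
# arrays on the Julia side; the integer function result comes back by value.
x = Int32[7]
r = ccall((:__simplemodule_MOD_foo, "./simplemodule.so"),
          Int32, (Ptr{Int32},), x)                      # r == 14

xin = Float64[5.0]
a   = Float64[0.0]
b   = Float64[0.0]
# Void is the 0.3/0.4-era ccall return type for a Fortran subroutine (C void).
ccall((:__simplemodule_MOD_keg, "./simplemodule.so"),
      Void, (Ptr{Float64}, Ptr{Float64}, Ptr{Float64}), xin, a, b)
# now a[1] == 8.0 and b[1] == 15.0
```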


[julia-users] a simple visualization of all the described color strings in the Color package

2015-05-30 Thread Li Zhang
hi folks,

I've run into a lot of color usage lately, and always wanted to have a color 
palette to help me choose the right and convenient color strings already listed 
in the Color package, so I ended up with a simple palette which I thought someone 
may find useful.

cheers 

ipython notebook: 
https://www.dropbox.com/s/memvq0ew6143tdm/ColorPalette.ipynb?dl=0




Re: [julia-users] julia on arm - some more progress

2015-05-30 Thread Seth
Nope:

error during bootstrap:
LoadError(at "sysimg.jl" line 278: LoadError(at "constants.jl" line 94: 
Base.AssertionError(msg="Float64(π) == Float64(big(π))")))
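For context, the failing assertion only checks that the Float64 value of π agrees with the correctly rounded value of the arbitrary-precision π; on a working build it is simply:

```julia
julia> Float64(π) == Float64(big(π))
true
```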




On Saturday, May 30, 2015 at 10:30:49 AM UTC-7, Seth wrote:
>
> Oh, ok - let me restart.
>
>
> On Saturday, May 30, 2015 at 10:29:50 AM UTC-7, Viral Shah wrote:
>>
>> Yes, those steps are good. Could you leave out the JULIA_CPU_ARCH for 
>> now? 
>>
>> -viral 
>>
>>
>>
>> > On 30-May-2015, at 10:56 pm, Seth  wrote: 
>> > 
>> > Oh, also export JULIA_CPU_ARCH=arm1176jzf-s. Build running now - will 
>> report back in about 12 hours :) 
>> > 
>> > On Saturday, May 30, 2015 at 10:24:15 AM UTC-7, Seth wrote: 
>> > On it. make distcleanall; git pull; make? 
>> > 
>> > On Saturday, May 30, 2015 at 10:23:05 AM UTC-7, Viral Shah wrote: 
>> > Could you guys try with the latest master, with a fresh clone? I am no 
>> longer passing any flags to LLVM and also using LLVM 3.6.1. 
>> > 
>> > -viral 
>> > 
>> > On Saturday, May 30, 2015 at 7:18:41 AM UTC+5:30, Viral Shah wrote: 
>> > We need to figure out the magic they use to build those binaries. What 
>> if you remove all LLVM flags in the Julia build in the arm section of 
>> make.inc and let LLVM do it's thing? 
>> > 
>> > -viral 
>> > 
>> > On 29 May 2015 9:27 pm, "Daan Huybrechs"  wrote: 
>> > On the other hand, building with the LLVM binaries from llvm.org does 
>> still work on a Pi 2 to get a working REPL -  I just tested that with 
>> gcc-4.9 and LLVM 3.6.1 today. 
>> > 
>> > Daan 
>>
>>

Re: [julia-users] julia on arm - some more progress

2015-05-30 Thread Viral Shah
:-(

I wonder what’s the magic incantation those LLVM binaries use.

-viral



> On 31-May-2015, at 9:29 am, Seth  wrote:
> 
> Nope:
> 
> error during bootstrap:
> LoadError(at "sysimg.jl" line 278: LoadError(at "constants.jl" line 94: 
> Base.AssertionError(msg="Float64(π) == Float64(big(π))")))
> 
> 
> 
> 
> On Saturday, May 30, 2015 at 10:30:49 AM UTC-7, Seth wrote:
> Oh, ok - let me restart.
> 
> 
> On Saturday, May 30, 2015 at 10:29:50 AM UTC-7, Viral Shah wrote:
> Yes, those steps are good. Could you leave out the JULIA_CPU_ARCH for now? 
> 
> -viral 
> 
> 
> 
> > On 30-May-2015, at 10:56 pm, Seth  wrote: 
> > 
> > Oh, also export JULIA_CPU_ARCH=arm1176jzf-s. Build running now - will 
> > report back in about 12 hours :) 
> > 
> > On Saturday, May 30, 2015 at 10:24:15 AM UTC-7, Seth wrote: 
> > On it. make distcleanall; git pull; make? 
> > 
> > On Saturday, May 30, 2015 at 10:23:05 AM UTC-7, Viral Shah wrote: 
> > Could you guys try with the latest master, with a fresh clone? I am no 
> > longer passing any flags to LLVM and also using LLVM 3.6.1. 
> > 
> > -viral 
> > 
> > On Saturday, May 30, 2015 at 7:18:41 AM UTC+5:30, Viral Shah wrote: 
> > We need to figure out the magic they use to build those binaries. What if 
> > you remove all LLVM flags in the Julia build in the arm section of make.inc 
> > and let LLVM do it's thing? 
> > 
> > -viral 
> > 
> > On 29 May 2015 9:27 pm, "Daan Huybrechs"  wrote: 
> > On the other hand, building with the LLVM binaries from llvm.org does still 
> > work on a Pi 2 to get a working REPL -  I just tested that with gcc-4.9 and 
> > LLVM 3.6.1 today. 
> > 
> > Daan 
> 



Re: [julia-users] Multiple lines statement?

2015-05-30 Thread Alex Ames
I lost multiple days attempting to pin down this behavior. I had something 
along the lines of
x = a + b
 + c
It's not clear to me why the second line is a valid expression. At the very 
least, it would be nice for lint to catch these, or for the language to have 
an optional `...` line-continuation operator.
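For what it's worth, the second line parses because `x = a + b` is already a complete statement, so `+ c` is then read as a new expression (a unary plus applied to c). The usual workarounds are to end the line with the operator, or to parenthesize the whole right-hand side:

```julia
# ending the line with the operator keeps the expression open
x = a + b +
    c

# parentheses also keep the parser from treating the newline as a statement break
x = (a + b
     + c)
```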

On Saturday, May 30, 2015 at 11:21:18 AM UTC-5, Gabriel Mihalache wrote:
>
> Once you spend a few days tracking down a bug due to this, you never 
> forget. The idea would be to find a way to save people from this experience.
>
> Some lines are naturally long because e.g. the equation is long or because 
> you prefer long, informative variable names. You can always use variables 
> for parts of the expression but then that just feels like working around 
> poor language features/design.
>


[julia-users] Re: JuliaCon registrations open

2015-05-30 Thread Jiahao Chen
An important note about online hotel reservations:

Some attendees have reported problems using the Hyatt's reservation site. For 
enquiries that include dates other than June 24-28, the website may erroneously 
report that no rooms are available. In reality, we still have many rooms available 
at the contracted rate, but the website cannot accommodate bookings that extend 
to dates other than the precise dates of JuliaCon.

With apologies to our international guests, we ask that anyone experiencing a 
problem with the Hyatt's website enquire directly through the hotel's US toll-free 
number. Meanwhile, we hope that the Hyatt will rectify the issue with their 
website soon.

Re: [julia-users] Adding vectors in place using BLAS?

2015-05-30 Thread Gabriel Goh
sub! Just what I was looking for! It works like a charm.

On Saturday, May 30, 2015 at 3:35:50 PM UTC-7, Tim Holy wrote:
>
> Not tested, but 
>
> xsub = sub(x, 10:20) 
> Base.LinAlg.axpy!(a, y, xsub) 
>
> should work just fine. 
>
> --Tim 
>
> On Saturday, May 30, 2015 02:35:01 PM Gabriel Goh wrote: 
> > Hey All, 
> > 
> > I'm wondering if its easy to do an in place assignment, say 
> > 
> > x[10:20] = x[10:20] + a*y 
> > 
> > using the axpy! library. I want to avoid the use of any temporary 
> variables 
> > if possible! 
> > 
> > Thanks! 
> > Gabe 
>
>