Re: [julia-users] Re: The growth of Julia userbase

2015-11-04 Thread Patrick O'Leary
It's not in the repo; it's a GitHub feature. But it may only be visible if 
you have Collaborator or Owner status on the repository.

On Wednesday, November 4, 2015 at 10:51:34 AM UTC-6, Ben Ward wrote:
>
> I don't think the /graphs folder is part of the julia repo any-more :(
>
> On Wed, Nov 4, 2015 at 4:48 PM, Tony Kelman  wrote:
>
>> There are some interesting numbers at 
>> https://github.com/JuliaLang/julia/graphs/traffic
>>
>> Elliot Saba also did some scraping of the AWS download logs for binaries 
>> and shared the aggregate numbers (broken down by platform) privately with a 
>> few people, it may be worth sharing those publicly.
>>
>>
>> On Wednesday, November 4, 2015 at 8:26:19 AM UTC-8, Ben Ward wrote:
>>>
>>> Hi all,
>>>
>>> I was wondering are there any metrics or stats available that show how 
>>> the user-base of Julia has grown over the last few years, and what it's 
>>> size is now?
>>>
>>> Many Thanks,
>>> Ben W.
>>>
>>
>

Re: [julia-users] Re: Code runs 500 times slower under 0.4.0

2015-10-23 Thread Patrick O'Leary
On Friday, October 23, 2015 at 6:18:43 AM UTC-5, Kris De Meyer wrote:
>
> ...and then the only thing Julia will have going for it is that it's free. 
> But my cost to my employers is such that if I lose as little as 3 days a 
> year on compatibility issues, they would be better off paying for a Matlab 
> license... 
>

It's not quite that simple--if I have a problem with something in MATLAB, I 
file a bug report. The issue may (or may not) be fixed in 3-6 months, with 
the next release. If I'm lucky I catch problems in prereleases--but that's 
also time I have to spend to install the prerelease and test the fragile 
parts of the process. This isn't hypothetical; we've skipped entire 
releases which have variously broken code generation. (Full disclosure, we 
haven't seen such problems with MATLAB proper, but deployment with Coder is 
essential for what we do.)

But if something broke in Julia, there's a chance I can fix it. Free 
software matters.

Patrick


Re: [julia-users] Re: Code runs 500 times slower under 0.4.0

2015-10-23 Thread Patrick O'Leary
On Friday, October 23, 2015 at 10:11:16 AM UTC-5, Andreas Lobinger wrote:
>
>
> On Friday, October 23, 2015 at 3:06:13 PM UTC+2, Patrick O'Leary wrote:
>>
>> On Friday, October 23, 2015 at 6:18:43 AM UTC-5, Kris De Meyer wrote:
>>>
>>> ...and then the only thing Julia will have going for it is that it's 
>>> free. But my cost to my employers is such that if I lose as little as 3 
>>> days a year on compatibility issues, they would be better off paying for a 
>>> Matlab license... 
>>>
>>
>>
>> But if something broke in Julia, there's a chance I can fix it. Free 
>> software matters.
>>
>> Well, you can. But the average julia USER is not in the same position. 
> Fixing something in packages needs some background in development of the 
> package and fixing something in julia itself needs even more background.
>
> In any case, i prefer Open Source because there's even the possibility to 
> look inside and see how the magic is happening. 
> But the argument, 'you can help yourself' is not equal to 'you are able to 
> help yourself'.  
>

I didn't intend for the word "chance" to be read as anything other than 
"nonzero probability." 


Re: [julia-users] Re: Help reading structured binary data files

2015-10-03 Thread Patrick O'Leary
Going back to StrPack, there's syntax for that:

@struct type SomeThings
    anInt::Int32
    aVectorWithSixElements::Vector{Int32}(6)
    aStringOfEightBytes::ASCIIString(8)
end
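
Reading with it would then look something like this (a rough sketch, assuming 
StrPack's unpack() and a hypothetical file name; not tested against your data):

    using StrPack

    fh = open("somefile.dnp", "r")     # hypothetical data file
    things = unpack(fh, SomeThings)    # reads one packed record into the type
    close(fh)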

On Friday, October 2, 2015 at 9:01:29 AM UTC-5, David McInnis wrote:
>
> @Tom :  I couldn't figure out how to do arrays or strings of a specific 
> number of bytes.
>
>
> On Thursday, October 1, 2015 at 6:41:30 PM UTC-4, Tom Breloff wrote:
>>
>> This is exactly what my packedStruct macro is for... define the type 
>> (composed of bitstypes) and the macro will set up helper methods to read 
>> raw bytes into the fields.  I'm curious if it suits your needs, and what is 
>> missing.
>>
>> On Thu, Oct 1, 2015 at 3:39 PM, David McInnis  wrote:
>>
>>> Related follow-on question..
>>>
>>> Is there a nice way to get the data into a "type" format rather than a 
>>> "dict"?
>>>
>>> Here's the form I'm using now..
>>> function DNP(myfile::String)
>>>
>>> dnp = { "File" => myfile }
>>> fh = open(myfile, "r")
>>>
>>> dnp["Label"] = bytestring(readbytes(fh, 4))
>>> dnp["Version"] = read(fh, Uint32)
>>> dnp["Revision"] = read(fh, Uint32)
>>> dnp["Date"] = bytestring(readbytes(fh, 28))
>>> dnp["FileFormat"] = read(fh, Uint32)
>>> dnp["FileType"] = bytestring(readbytes(fh,4))
>>> dnp["OriginalFileName"] = bytestring(readbytes(fh,68))
>>> dnp["ReferenceFileName"] = bytestring(readbytes(fh,68))
>>> dnp["RelatedFileNameA"] = bytestring(readbytes(fh,68))
>>> dnp["RelatedFileNameB"] = bytestring(readbytes(fh,68))
>>> dnp["RelatedFileNameC"] = bytestring(readbytes(fh,68))
>>> dnp["Annotate"] = bytestring(readbytes(fh,84))
>>> dnp["InstrumentModel"] = bytestring(readbytes(fh,36))
>>> dnp["InstrumentSerialNumber"] = bytestring(readbytes(fh,36))
>>> dnp["SoftwareVersionNumber"] = bytestring(readbytes(fh,36))
>>> dnp["CrystalMaterial"] = bytestring(readbytes(fh,36))
>>> dnp["LaserWavelengthMicrons"] = read(fh, Float64)
>>> dnp["LaserNullDoubling"] = read(fh, Uint32)
>>> dnp["Padding"] = read(fh, Uint32)
>>> dnp["DispersionConstantXc"] = read(fh, Float64)
>>> dnp["DispersionConstantXm"] = read(fh, Float64)
>>> dnp["DispersionConstantXb"] = read(fh, Float64)
>>> dnp["NumChan"] = read(fh, Uint32)
>>> dnp["InterferogramSize"] = read(fh, Uint32)
>>> dnp["ScanDirection"] = read(fh, Uint32)
>>> dnp["ACQUIREMODE"] = read(fh, Uint32)
>>> dnp["EMISSIWITY"] = read(fh, Uint32)
>>> dnp["APODIZATION"] = read(fh, Uint32)
>>> dnp["ZEROFILL"] = read(fh, Uint32)
>>> dnp["RUNTIMEMATH"] = read(fh, Uint32)
>>> dnp["FFTSize"] = read(fh, Uint32)
>>> dnp["NumberOfCoAdds"] = read(fh, Uint32)
>>> dnp["SingleSided"] = read(fh, Uint32)
>>> dnp["ChanDisplay"] = read(fh, Uint32)
>>> dnp["AmbTemperature"] = read(fh, Float64)
>>> dnp["InstTemperature"] = read(fh, Float64)
>>> dnp["WBBTemperature"] = read(fh, Float64)
>>> dnp["CBBTemperature"] = read(fh, Float64)
>>> dnp["TEMPERATURE_DWR"] = read(fh, Float64)
>>> dnp["EMISSIVITY_DWR"] = read(fh, Float64)
>>> dnp["LaserTemperature"] = read(fh, Float64)
>>> dnp["SpareI"] = read(fh, Uint32,10)
>>> dnp["SpareF"] = read(fh, Float64,10)
>>> dnp["SpareNA"] = bytestring(readbytes(fh,68))
>>> dnp["SpareNB"] = bytestring(readbytes(fh,68))
>>> dnp["SpareNC"] = bytestring(readbytes(fh,68))
>>> dnp["SpareND"] = bytestring(readbytes(fh,68))
>>> dnp["SpareNE"] = bytestring(readbytes(fh,68))
>>> dnp["End"] = bytestring(readbytes(fh,4))
>>>
>>>
>>> dnp["Interferograms"] = read(fh, Int16, dnp["InterferogramSize"], 
>>> dnp["NumberOfCoAdds"])
>>> fft_size = dnp["FFTSize"] * dnp["ZEROFILL"] * 512
>>> dnp["Spectrum"] = read(fh, Float32, fft_size)
>>>
>>> close(fh)
>>>
>>> wavelength_range = 1.0 / dnp["LaserWavelengthMicrons"]
>>> spectral_range = wavelength_range / 2
>>> spectral_binsize = spectral_range / fft_size
>>> x_fft = [0:fft_size-1] * spectral_binsize
>>> m = dnp["DispersionConstantXm"]
>>> b = dnp["DispersionConstantXb"]
>>> c = dnp["DispersionConstantXc"]
>>> x_corrected = x_fft + c + exp10(m * x_fft + b)
>>> dnp["WL_cm"] = x_corrected
>>> dnp["WL_microns"] = 1.0 ./ x_corrected
>>>
>>>
>>> return dnp
>>> end
>>>
>>> Which works fine, but it leaves me with the data in a form that's ugly 
>>> (to me) in calculations:  dnp["InterferogramSize"] * dnp["NumberOfCoAdds"]
>>> Instead of:dnp.InterferogramSize * dnp.NumberOfCoAdds
>>>
>>> I can create an appropriate type like:
>>> type DNP
>>> Label
>>> Version
>>> Revision
>>> Date
>>>  etc etc
>>> end
>>>
>>> ..but I can't figure out a good way to get the data there.Well, 
>>> other than keeping my current function, defining the type, and then having 
>>> another function to copy the data into the type... ugly.
>>>
>>> I read all the docs I could find on types but never saw anything that 
>>> hinted at a solution..maybe a  function/type hybrid??
>>> I tried creating the type within the function but didn't get anywhere.
>>> Ideas?
>>>
>>
>>

Re: [julia-users] Re: Help reading structured binary data files

2015-10-03 Thread Patrick O'Leary
On Saturday, October 3, 2015 at 6:38:44 PM UTC-5, Tom Breloff wrote:
>
> Thanks Patrick.  Yes StrPack is the way to go then.  My only warning is 
> that StrPack has a bunch of logic for endianness, etc which slows it down a 
> little, but for most purposes it should work well.  (Also, it might have 
> changed since I last tried it ~8 months ago, so ymmv)
>

Trust me, it hasn't changed :D Those are totally valid points; I never 
really tried to wring out all the performance stuff, and Julia itself has 
changed a lot since I put it together--but it's not something I see myself 
getting to anytime soon.


Re: [julia-users] Help reading structured binary data files

2015-09-18 Thread Patrick O'Leary
On Thursday, September 17, 2015 at 8:23:01 PM UTC-5, Tom Breloff wrote:
>
> I have an alternative to StrPack.jl here: 
> https://github.com/tbreloff/CTechCommon.jl/blob/master/src/macros.jl.  If 
> you have a type that mimics a c-struct, you can create like:
>
> @packedStruct immutable MyStruct
>   field1::UInt8
>   field2::Byte
> end
>
> and it creates some methods: read, reinterpret, etc which can convert raw 
> bytes into the immutable type.
>
> I've only used it on very specific types of data, but it may work for you.
>

This looks like a nice simple alternative. I haven't touched StrPack in a 
long time, but I believe the analogous StrPack syntax is:

@struct type MyStruct
  field1::UInt8
  field2::Byte
end, align_packed

Though since these are both 1-byte fields, there wouldn't be any padding 
under the default strategy anyways.

I'm not sure if this works with immutables--someone may have contributed 
that? Maintainer neglect of StrPack is acknowledged :D Have you considered 
possibly spinning your simplified version into its own package, Tom?

StrPack is/was pretty ambitious; I wanted to be able to get down to 
bit-level unpacking, with the goal of being able to parse a Novatel 
RANGECMP log (http://www.novatel.com/assets/Documents/Bulletins/apn031.pdf) 
entirely with an annotated type declaration. I'd still like to do this, but 
it's been hard to find the motivation to work on it. (There's a branch up 
on the repository which pushes towards this, but the work is incomplete.)

The manual alternative Stefan proposes is definitely a good option, 
especially if this is a one-off structure.


Re: [julia-users] Segmentation fault during Julia compilation

2015-08-31 Thread Patrick O'Leary
Corresponding issue: https://github.com/JuliaLang/julia/issues/12840

On Monday, August 31, 2015 at 9:07:47 AM UTC-5, Mahesh Waidande wrote:
>
> Hi All,
>
>
> I am working on building/porting Julia on ppc64le architecture. I am using 
> Ubuntu 14.10 on top of ppc64le hardware, while compiling Julia code (master 
> branch) I was getting segmentation fault, I am able to resolve this 
> segmentation fault by turning on ‘MEMDEBUG’ flag from ‘src/options.h’ 
> file.  
>
>
> I decided to work more this issue and try to find out root cause of 
> segmentation fault, so I started studying/understanding memory management 
> of Julia. I have couple of questions in my mind regarding memory management 
> of Julia and I want to discuss those here. 
>
> 1. While defining ‘REGION_PG_COUNT’ macro, 4096 value is used, I want to 
> know what is the significance of 4096?
>   If 4096 is indicates page-size then this code is valid 
> or work fine on amd64/x86_64   architecture where page size 4k and it may 
> behave abnormally in case of PPC64 where page size is 64k, basically here I 
> want to discuss the impact of large page size on Julia code and what all 
> other things I need to take into consideration while porting Julia on 
> PPC64le.  
>
> 2. Past few days I was working on understanding memory management scheme 
> of Julia and I find it bit of difficult and time consuming process though I 
> have some success. I want to know is there any official / unofficial 
> document around which will help me understand it.
>
> Any suggestions/pointers on above mention points are much appreciated.
>
> -Mahesh  
>
> On Tue, Aug 18, 2015 at 8:11 PM, Jameson Nash  > wrote:
>
>> It is a considerable performance impact to run with MEMDEBUG, but 
>> otherwise has no side-effects. It is not necessary to run with this flag in 
>> production (and probably not helpful either, since you wouldn't have a 
>> debugger attached).
>>
>>
>> On Tue, Aug 18, 2015 at 9:53 AM Mahesh Waidande > > wrote:
>>
>>> Hi Jamseson,
>>>
>>> Thanks for explaining memory allocations on PPC and providing pointers 
>>> on resolving segmentation fault, pointers are really helpful and I am 
>>> working on those. I am able to compile Julia master branch after turning 
>>> ‘MEMDEBUG‘ flag on from options.h file, compilation went smooth and I am 
>>> able to see the Julia prompt. Although I will continue to work on finding 
>>> root cause of segmentation fault, occur at a time of Julia initialization. 
>>>
>>>
>>> I think when we turn on the ‘MEMDEBUG‘ flag it will reduce a performance 
>>> of Julia bit as with MEMDEBUG no memory pools are used and all allocation 
>>> is treated as big. 
>>>
>>>
>>> Apart from performance issue, I have few questions in my mind and I 
>>> would like to discuss those,
>>> 1. Apart from performance hit, is there any other functionality has 
>>> impacted due to turning on ‘MEMDEBUG’ flag OR what are side effects of 
>>> turning ‘MEMDEBUG’ flag on?
>>> 2. Should I use these settings (turning MEMDEBUG flag on) in production 
>>> environment or in release mode?
>>>
>>>
>>>
>>> -Mahesh 
>>>
>>> On Fri, Aug 14, 2015 at 10:03 PM, Jameson Nash >> > wrote:
>>>
 It's a JIT copy of a julia function named "new". The last time this 
 error popped up, it was due to an error in the free_page function logic to 
 compute whether it was safe to free the current page (since PPC using 
 large 
 pages). One place to check then is to ensure the invalid pointer hadn't 
 accidentally being deleted by an madvise(DONTNEED) for an unrelated page 
 free operations.

 Beyond that, I would suggest trying with the `MEMDEBUG` turned on in 
 options.h (which will also disable the `free_page` function).

 Also, when you have gdb running, there are many more useful things to 
 print than just the backtrace. For starters, I would suggest looking at 
 `disassembly` and `info registers`. Also, go `up` on the stack trace and 
 look at `jl_(f->linfo)`, `jl_(jl_uncompress_ast(f->linfo, 
 f->linfo->ast))`, 
 and `jl_(args[0])` / `jl_(args[1])`


 On Fri, Aug 14, 2015 at 9:07 AM Mahesh Waidande  wrote:

> Hi All, 
>
> I am working on building/porting Julia on ppc64le architecture. I am 
> using Ubuntu 14.10 on top of ppc64le hardware, while compiling Julia 
> code(master branch) I am getting segmentation fault. I tried to debug 
> segmentation fault with tools like gdb/vgdb , valgrind , electric-fence 
> etc. but I not able to find a root cause of it. I need some 
> help/pointers/suggestions on how I resolve it. 
>
> Here are some details which will help you to diagnose a problem,
>
> 1. Machine details : 
> $ uname -a
> Linux pts00433-vm1 3.16.0-30-generic #40-Ubuntu SMP Mon Jan 12 
> 22:07:11 UTC 2015 ppc64le ppc64le ppc64le 

[julia-users] Re: small bugging(probably very noob) error please help

2015-07-21 Thread Patrick O'Leary
The and operator is spelled &&.
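
In context, the corrected condition would look something like this (a sketch; 
I've written a(j) because the a(k) in the original snippet refers to a k that 
those loops never define):

    if a(20) == b(50) && c(j) == a(j)   # && is the short-circuiting "and"
        print("got it")
        break
    end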

On Tuesday, July 21, 2015 at 10:53:50 AM UTC-5, Elsha Robin wrote:


 well i thought i should give julia a whirl since it got so much chatter 
 around it , so i thought i could start of slow and simple and i run into a 
 tree sort of ... so the problem is that i compared the results of two 
 functions in an if statement with an and operator but it spiting some very 
 basic error which i cant figure out what to do with ... so here is the code 
 in question ..

 
 a(i)=20*i - i^2
 b(j)=50 *j - j^2
 c(k)=40 *k - k^2
 for i in 20:50
 for j in i:50
 if a(20)==b(50) and c(j)==a(k)
 print (got it)
 break
 end
 end
 end
 __

 the error spit out is this ...

 syntax: extra token c after end of expression
 while loading In[11], in expression starting on line 6
 i can figure that the roblem has to do something with comparing the outputs 
 of functions using looping variables variable { c(j)==a(k) } ... but i dont 
 understand why that should be a problem ... so either ways can some one give 
 me the correct syntax or direct me to a specific DOC with 
 the similar example ... thanks in advance .



[julia-users] Re: small bugging(probably very noob) error please help

2015-07-21 Thread Patrick O'Leary
The Google Groups view of this thread is very slow to update. All together 
now!

On Tuesday, July 21, 2015 at 11:02:30 AM UTC-5, Patrick O'Leary wrote:

 The and operator is spelled &&.

 On Tuesday, July 21, 2015 at 10:53:50 AM UTC-5, Elsha Robin wrote:


 well i thought i should give julia a whirl since it got so much chatter 
 around it , so i thought i could start of slow and simple and i run into a 
 tree sort of ... so the problem is that i compared the results of two 
 functions in an if statement with an and operator but it spiting some very 
 basic error which i cant figure out what to do with ... so here is the code 
 in question ..

 
 a(i)=20*i - i^2
 b(j)=50 *j - j^2
 c(k)=40 *k - k^2
 for i in 20:50
 for j in i:50
 if a(20)==b(50) and c(j)==a(k)
 print (got it)
 break
 end
 end
 end
 __

 the error spit out is this ...

 syntax: extra token c after end of expression
 while loading In[11], in expression starting on line 6
 i can figure that the roblem has to do something with comparing the outputs 
 of functions using looping variables variable { c(j)==a(k) } ... but i dont 
 understand why that should be a problem ... so either ways can some one give 
 me the correct syntax or direct me to a specific DOC with 
 the similar example ... thanks in advance .



[julia-users] Re: SOLVED: ERROR: `start` has no method matching start(::Nothing)

2015-07-20 Thread Patrick O'Leary
Julia is returning the value of `println("foo..")` from the function 
`foo()`; the value of the block expression is the value of the last 
expression in the block. It's turtles all the way down, so `println()` also 
returns a value in the same way, etc.

The error comes from the destructuring: to split the return value into `a` and 
`b`, Julia has to iterate it. The return value of `foo`, which is the same as 
the return value from `println` (namely `nothing`), is not iterable.

I agree that the error message isn't very helpful, but it's also hard to 
say how one would fix that.
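
For illustration, returning an explicit (iterable) value makes the 
destructuring work (a sketch in 0.3-era syntax):

    function foo(a)
        println("foo..")
        return (a, a + 1)   # an explicit tuple, so it can be iterated
    end

    a, b = foo(2)           # now a == 2 and b == 3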

On Monday, July 20, 2015 at 2:09:00 PM UTC-5, Kaj Wiik wrote:

 I started to get a strange error while debugging my code, here's a 
 simplified example:

 julia> function foo(a)
            println("foo..")
        end
 foo (generic function with 1 method)

 julia> a = foo(2)
 foo..

 julia> a,b = foo(2)
 foo..
 ERROR: `start` has no method matching start(::Nothing)


 So, the problem was a missing return value, it is strange that missing one 
 value did not give error but two values It took a quite long time to 
 track this down. Perhaps a bit more informative error message would be 
 possible...?

 Cheers,
 Kaj



[julia-users] Re: SOLVED: ERROR: `start` has no method matching start(::Nothing)

2015-07-20 Thread Patrick O'Leary
That addresses one class of problem, but the two problems I see from Kaj's 
original post wouldn't be addressed: one, that the incorrect return value 
from foo() is silently accepted in the no-destructuring case, and two, that 
when destructuring, you still get an error message about iteration, even 
though it's not clear what is failing to be iterated.

On Monday, July 20, 2015 at 2:36:06 PM UTC-5, Simon Danisch wrote:

 This could actually be solved by having something like this in base:

 start{T}(x::T) = error("$T is not iterable. Try implementing start(::$T), 
 done(::$T, state), next(::$T, state)")

 The basic problem here is, that Julia has weakly enforced interfaces, 
 which often ends in no method errors instead of something meaningfull for 
 people who don't know the implicit interfaces.

 Am Montag, 20. Juli 2015 21:09:00 UTC+2 schrieb Kaj Wiik:

 I started to get a strange error while debugging my code, here's a 
 simplified example:

 julia> function foo(a)
            println("foo..")
        end
 foo (generic function with 1 method)

 julia> a = foo(2)
 foo..

 julia> a,b = foo(2)
 foo..
 ERROR: `start` has no method matching start(::Nothing)


 So, the problem was a missing return value, it is strange that missing 
 one value did not give error but two values It took a quite long time 
 to track this down. Perhaps a bit more informative error message would be 
 possible...?

 Cheers,
 Kaj



Re: [julia-users] Multiple lines statement?

2015-06-18 Thread Patrick O'Leary
Busier I agree with, but it's marginal; grouping is lightweight as syntax 
goes. Parens (1) already work, (2) consistently mean "keep these things 
together" in a variety of computing environments, (3) have match 
highlighting support in many editors which make it easy, given one end of 
the parenthesized subexpression, to find the other end. So I'm not sure I 
agree with the latter, especially if indentation is used effectively.

On Thursday, June 18, 2015 at 6:53:54 AM UTC-5, Christoph Ortner wrote:


 I think parenthesis are ok, but only just. They make the code busier and 
 more difficult to read. 
 Christoph


 On Tuesday, 16 June 2015 01:21:45 UTC+1, David Gold wrote:

 @Ben: as has been noted elsewhere in this thread, you can use parens to 
 this end:

 julia function foo(a, b, c, d, e, f)
if (a  b 
   || c  d 
   || e  f) 
 println(Foo for you.) 
end 
end 
 foo (generic function with 1 method) 

 julia foo(1, 2, 3, 4, 6, 5) 
 Foo for you.


 Is there a reason this is significantly worse than what you described?

 On Monday, June 15, 2015 at 5:54:56 PM UTC-4, Ben Arthur wrote:

 in addition to adhering to mathematical typsetting conventions, 
 permitting binary operators to be on the following line would make it 
 easier to comment out sub-expressions.  for example,

 if ab
   || cd
   || ef
 end

 could become

 if ab
   # || cd
   || ef
 end

 i'm not advocating for a mandatory line continuation character.  that 
 would be terrible.  but changing julia to look at the preceding line if the 
 current line doesn't make sense by itself would be great.

 ben



 On Monday, June 1, 2015 at 3:35:50 PM UTC-4, Christoph Ortner wrote:


 Just to reiterate a comment I made above: the convention in 
 mathematical typesetting is 
b
 + c
 and not
b + 
  c

 this is the main reason I have (more than once) fallen into this trap. 
 Anyhow, I will try to use brackets for a while and see how I like it.

 Christoph



[julia-users] Re: X.2=X*0.2, easy to make mistake.

2015-06-17 Thread Patrick O'Leary
Changing this would be breaking--syntax that currently works (even if you 
don't expect it to) wouldn't work anymore. If someone is actually using 
this syntax, then we'd break their code on a release which is billed as a 
minor maintenance release. That's not going to work.

There may be a Lint.jl check for this, though? It sounds familiar.

On Wednesday, June 17, 2015 at 5:12:31 PM UTC-5, Art Kuo wrote:

 That's great that it's fixed in 0.4, but even in 0.3.X I would still label 
 it inconsistent behavior, or perhaps even a bug. Why should this happen:

 julia> x.2 == x*.2
 true

 julia> x0.2 == x*0.2
 ERROR: x0 not defined

 julia> x2 == x*2
 ERROR: x2 not defined

 It seems consistent that .2X == .2*X, 0.2X == 0.2*X, 2X == 2*X, so it is 
 fine if the number occurs before the variable. But not if the number occurs 
 after, so I agree with the proposal to ban X.2, meaning trigger an error. 
 Shouldn't this be the case for 0.3 versions as well?


 On Wednesday, June 17, 2015 at 9:14:24 AM UTC-4, Seth wrote:



 On Wednesday, June 17, 2015 at 8:04:11 AM UTC-5, Jerry Xiong wrote:

 Today I spend many time to find a bug in my code. It is turn out that I 
 mistakenly wrote sum(X,2) as sum(X.2). No any error information is reported 
 and Julia regarded X.2 as X*0.2. The comma , is quite close to dot . in 
 the keyboard and looks quite similar in some fonts. As there is no any 
 error occur, this bug will be dangerous. Also, it is not intuitive to 
 understand X.2 is X*0.2. I think maybe it is better to forbid syntax like 
 X.2 but only allowed .2X. 


 This appears to be fixed in 0.4:

 julia> x = 100
 100

 julia> x.2
 ERROR: syntax: extra token "0.2" after end of expression

 julia> sum(x.2)
 ERROR: syntax: missing comma or ) in argument list

 julia> f(x) = x.2
 ERROR: syntax: extra token "0.2" after end of expression

 julia> f(x) = sum(x.2)
 ERROR: syntax: missing comma or ) in argument list

  



Re: [julia-users] Re: issue defining a type's attribute..

2015-06-10 Thread Patrick O'Leary
immutable TUnit
    # other fields...
    pt_l::Matrix{Float64} # alias for Array{Float64, 2}
    # other fields...
end

Some confusion in your other comments; none of the fields of TUnit are 
defined as vectors (which is to say, Vector{T} where T is the element 
type), either? It sounds like perhaps they should be but my brain is 
currently not completely following your code.

On Wednesday, June 10, 2015 at 7:16:10 AM UTC-5, Michela Di Lullo wrote:

 I still get an error..

 Let's try again.. 

 num_PC=7

 immutable TUnit
 Node::Int16
 a::Float64
 b::Float64
 c::Float64
 inf::Float64
 sup::Float64
 ton::Int16
 toff::Int16
 storia0::Int16
 pt0::Float64
 rampa_up::Float64
 rampa_dwn::Float64
 rampa_up_str::Float64
 rampa_dwn_str::Float64
 SUC_C::Float64
 tau_max::Int16
 pt_l::Float64
 storia0UP::Float64
 storia0DW::Float64
 u0::Float64
 ton0::Float64
 toff0::Float64
 end

 generators = readdlm("$MyDataPath/gen" * "$NumBuses" * "_" * 
 "$GenInstance" * ".dat") # matrix of dimension 6*16
 nodeload   = readdlm("$MyDataPath/nodeload" * "$NumBuses" * "_" * 
 "$NodeloadInstance" * ".dat")

 gmax=int(size(generators,1))
 hmax=int(size(nodeload,2))
 inf=generators[1:size(generators,1),5]
 sup=generators[1:size(generators,1),6]
 storia0=generators[1:size(generators,1),9]
 ton=generators[1:size(generators,1),7]
 toff=generators[1:size(generators,1),8]

 #here below I'm computing some other quantities, I want to add as 
 attributes of the object TUnit, 
 #together with the ones in generators:

 pt_l=([inf[j]+(r-1)*(sup[j]-inf[j])/(num_PC-1) for j=1:gmax, 
 r=1:num_PC]) # matrix of dimensions (gmax=6)*(num_PC=7)
 storia0UP =  [max(storia0[j],0) for j=1:gmax] # vector of dimensions 
 (gmax=6)*1
 storia0DW = -[min(storia0[j],0) for j=1:gmax] # vector of 
 dimensions (gmax=6)*1
 u0=[(storia0[j] > 0 ? 1 : 0) for j=1:gmax] # vector of dimensions 
 (gmax=6)*1
 ton0 = int16([min(hmax, max(0,(ton[j]-storia0UP[j])*u0[j])) for 
 j=1:gmax]) # vector of dimensions (gmax=6)*1
 toff0= int16([min(hmax, max(0,(toff[j]-storia0DW[j])*(1-u0[j]))) for 
 j=1:gmax]) # vector of dimensions (gmax=6)*1

 generators = cat(2, generators, pt_l, storia0UP, storia0DW, u0, ton0, 
 toff0) #matrix of dimensions (gmax=6)*28

 TUnitS = Array(TUnit,size(generators,1))

 for j in 1:size(generators,1)
 TUnitS[j]   = TUnit(generators[j,1:end]...)
 end

 and I get:

 ERROR: `TUnit` has no method matching TUnit(::Float64, ::Float64, 
 ::Float64, ::Float64, ::Float64, ::Float64, ::Float64, ::Float64, 
 ::Float64, ::Float64, ::Float64, ::Float64, ::Float64, ::Float64, 
 ::Float64, ::Float64, ::Float64, ::Float64, ::Float64, ::Float64, 
 ::Float64, ::Float64, ::Float64, ::Float64, ::Float64, ::Int64, ::Int16, 
 ::Int16)

  in anonymous at no file:5

 How could I specify that pt_l, attribute of the object TUnit, is a Matrix 
 and not a vector? 

 Thanks in advance for any suggestion, 

 Michela 





 Il giorno martedì 9 giugno 2015 18:54:56 UTC+2, Tom Breloff ha scritto:

 Seems like this would be much easier in a loop, rather than creating lots 
 of temporaries only to extract them out later.
 tunits = TUnit[]
 for i in size(generators,1)
  pt_I = # create vector
  storia0UP = # create Float64
  # set other vars
  push!(tunits, TUnit(generators[j,:]..., pt_I, storia0UP, other vars))
 end


 On Tuesday, June 9, 2015 at 11:37:37 AM UTC-4, Michela Di Lullo wrote:



 num_PC=7
 generators = readdlm("$MyDataPath/gen" * "$NumBuses" * "_" * 
 "$GenInstance" * ".dat")
 
 inf=generators[1:size(generators,1),5]
 sup=generators[1:size(generators,1),6]
 pt_l=([inf[j]+(r-1)*(sup[j]-inf[j])/(num_PC-1) for j=1:gmax, 
 r=1:num_PC])
 storia0=generators[1:size(generators,1),9]
 ton=generators[1:size(generators,1),7]
 toff=generators[1:size(generators,1),8]
 storia0UP =  [max(storia0[j],0) for j=1:gmax]
 storia0DW = -[min(storia0[j],0) for j=1:gmax]
 u0=[(storia0[j] > 0 ? 1 : 0) for j=1:gmax]
 ton0 = int16([min(hmax, max(0,(ton[j]-storia0UP[j])*u0[j])) for 
 j=1:gmax])
 toff0= int16([min(hmax, max(0,(toff[j]-storia0DW[j])*(1-u0[j]))) for 
 j=1:gmax])

 generators = cat(2, generators, pt_l, storia0UP, storia0DW, u0, 
 ton0, toff0)

 immutable TUnit
 Node::Int16
 a::Float64
 b::Float64
 c::Float64
 inf::Float64
 sup::Float64
 ton::Int16
 toff::Int16
 storia0::Int16
 pt0::Float64
 rampa_up::Float64
 rampa_dwn::Float64
 rampa_up_str::Float64
 rampa_dwn_str::Float64
 SUC_C::Float64
 tau_max::Int16
 pt_l
 storia0UP::Float64
 storia0DW::Float64
 u0::Float64
 ton0::Float64
 toff0::Float64
 end

 TUnitS = Array(TUnit,size(generators,1))

 for j in 1:size(generators,1)
 TUnitS[j]   = TUnit(generators[j,1:end]...)
 end

 



[julia-users] Re: Debugging Julia code via gbd/llddb

2015-06-08 Thread Patrick O'Leary
This is being worked on (and will use lldb), but it's still under very 
heavy development. A bit of information: 
https://groups.google.com/forum/#!topic/julia-dev/gcZ5dZJni5o

On Monday, June 8, 2015 at 6:56:48 AM UTC-5, axsk wrote:

 I wonder whether it is possible to debug Julia code using gdb/lldb.
 So far I have only used different IDEs debuggers, and thus have no 
 experience with gdb/lldb. I tried searching for this, but only came across 
 debuging Julia itself / Julias C-code, but nothing specific to debugging 
 the Julia code (or I was not able to understand it)

 So my question is, given the code:

 x=1
 print(x)

 Can I debug this using gdb/lldb, inserting a breakpoint after the first 
 line, then dropping to the REPL and changing x to 2, resuming the program 
 and getting the updated 2 printed?

 If so, could you please describe along this simple example how to do it :)?



Re: [julia-users] Sublists in Julia's doc strings

2015-06-08 Thread Patrick O'Leary
*shrugs shoulders*

I pretty much am horrible at API design. For a non-performance-critical 
API, keyword arguments would be the recommended approach. The old-school 
Julia pre-kwarg dispatch pyramid (with n-argument forms dispatching to the 
n+1 argument form with a default n+1th argument) will perform better but is 
often pretty hard to use and a pain in the butt to write. An option type 
with kwarg constructor might be the best compromise (since you only pay the 
kwarg penalty once to construct the option structure) but I have no idea 
whether that's been done and/or has caught on.
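
Just to sketch what I mean by that last option (the type and field names here 
are made up for illustration, not Scott's actual API):

    immutable CheckOptions
        accept_long::Bool
        accept_surrogates::Bool
    end

    # pay the keyword-argument cost once, at construction time
    CheckOptions(; accept_long::Bool=false, accept_surrogates::Bool=false) =
        CheckOptions(accept_long, accept_surrogates)

    # the hot path then dispatches on a concrete options type
    check_string(s, opts::CheckOptions=CheckOptions()) =
        opts.accept_long ? length(s) : sizeof(s)   # dummy body, illustration only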

On Saturday, June 6, 2015 at 12:02:36 AM UTC-5, Scott Jones wrote:

 Well, not sure if it is a bug, just a difference in Markdown variants 
 (that’s the big problem with Markdown, IMO, there are a number of slightly 
 different
 versions), or something that just wasn’t implemented yet.

 Although, your comment, made clear something to me… although originally 
 the check_string function was a wrapper around C code, and so had a 
 very C-style interface, I was pushed by the Julian community to do a pure 
 Julia implementation, so there’s no reason to be bound by C API.

 What would you recommend, as a more Julia interface, where you have a set 
 of options (in this case, all boolean flags)?
 (I wonder if one of the more Julian ways might even be more efficient, if 
 it allowed the compiler to generate a more specific version of the function)

 Thanks,
 Scott


 Ah, that wasn't clear. Might just be another enhancement request for the 
 Markdown parser, then?

 On Friday, June 5, 2015 at 12:28:53 PM UTC-5, Scott Jones wrote:

 I was not asking about that, but rather that Julia's Markdown handling 
 doesn't seem to be handling sublists as the Markdown examples I've seen...
 which I showed in the doc string...

 Sent from my iPhone

 On Jun 5, 2015, at 5:02 PM, Patrick O'Leary patrick.ole...@gmail.com 
 wrote:

 Since this is a very C-style interface, perhaps `man 2 open` which uses 
 that style of option flag will get you somewhere?

 On Friday, June 5, 2015 at 7:47:49 AM UTC-5, Scott Jones wrote:

 I've been trying to write documentation for my Julia functions in a way 
 that others will find acceptable,
 but will still be useful to me.
 I have a keyword argument, options, that has several different possible 
 values or'ed in.
 I would like to describe that, but it doesn't come out nicely:

 @doc docSilly function

 long description
 ### Input Arguments:
 1. abc  Description of argument
 2. def  Description of argument

 ### Keyword Argument:
 * options

   **option1  Description of option 1
   **option2  Description of option 2

 ### Returns:
   * A UTF32String

 ### Throws:
   * ArgumentError
  - foo





Re: [julia-users] Re: Sublists in Julia's doc strings

2015-06-05 Thread Patrick O'Leary
Ah, that wasn't clear. Might just be another enhancement request for the 
Markdown parser, then?

On Friday, June 5, 2015 at 12:28:53 PM UTC-5, Scott Jones wrote:

 I was not asking about that, but rather that Julia's Markdown handling 
 doesn't seem to be handling sublists as the Markdown examples I've seen...
 which I showed in the doc string...

 Sent from my iPhone

 On Jun 5, 2015, at 5:02 PM, Patrick O'Leary patrick.ole...@gmail.com 
 wrote:

 Since this is a very C-style interface, perhaps `man 2 open` which uses 
 that style of option flag will get you somewhere?

 On Friday, June 5, 2015 at 7:47:49 AM UTC-5, Scott Jones wrote:

 I've been trying to write documentation for my Julia functions in a way 
 that others will find acceptable,
 but will still be useful to me.
 I have a keyword argument, options, that has several different possible 
 values or'ed in.
 I would like to describe that, but it doesn't come out nicely:

 @doc docSilly function

 long description
 ### Input Arguments:
 1. abc  Description of argument
 2. def  Description of argument

 ### Keyword Argument:
 * options

   **option1  Description of option 1
   **option2  Description of option 2

 ### Returns:
   * A UTF32String

 ### Throws:
   * ArgumentError
  - foo




[julia-users] Re: Sublists in Julia's doc strings

2015-06-05 Thread Patrick O'Leary
Since this is a very C-style interface, perhaps `man 2 open` which uses 
that style of option flag will get you somewhere?

On Friday, June 5, 2015 at 7:47:49 AM UTC-5, Scott Jones wrote:

 I've been trying to write documentation for my Julia functions in a way 
 that others will find acceptable,
 but will still be useful to me.
 I have a keyword argument, options, that has several different possible 
 values or'ed in.
 I would like to describe that, but it doesn't come out nicely:

 @doc docSilly function

 long description
 ### Input Arguments:
 1. abc  Description of argument
 2. def  Description of argument

 ### Keyword Argument:
 * options

   **option1  Description of option 1
   **option2  Description of option 2

 ### Returns:
   * A UTF32String

 ### Throws:
   * ArgumentError
  - foo




[julia-users] Re: Constructing a block diagonal matrix from an array of square matrices

2015-06-03 Thread Patrick O'Leary
For sparse arrays, you can use `blkdiag()`.

For v0.3, the method David posted is probably the best approach.

Starting in v0.4, you can call `cat([1,2], matrices...)` where the 
`matrices` variable is your array of arrays. Note that the splat is 
undesirable if this container is quite large.
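
For example (a sketch of the v0.4 form, with `blocks` standing in for your 
array of matrices):

    blocks = Matrix{Float64}[rand(2,2), rand(2,2), rand(3,3)]
    B = cat([1,2], blocks...)   # 7x7 block-diagonal matrix, zeros off the blocks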

On Wednesday, June 3, 2015 at 12:07:19 PM UTC-5, Marc Gallant wrote:

 If I have an array of square matrices of different sizes; e.g.,

 3-element Array{Array{Float64,2},1}:
  2x2 Array{Float64,2}:
  0.539932  0.429322
  0.623487  0.0397795
  2x2 Array{Float64,2}:
  0.35508   0.700551
  0.768214  0.954056
  3x3 Array{Float64,2}:
  0.953354  0.453831   0.991583
  0.159975  0.116518   0.355275
  0.791447  0.0104295  0.151609


 Where the number of elements may be much larger than 3 (e.g., 1000), how 
 do I construct a block diagonal matrix, where the blocks are the matrices 
 in the array? For the array above, this would be

 7x7 Array{Float64,2}:
  0.539932  0.429322   0.0   0.0   0.0   0.00.0
  0.623487  0.0397795  0.0   0.0   0.0   0.00.0
  0.0   0.00.35508   0.700551  0.0   0.00.0
  0.0   0.00.768214  0.954056  0.0   0.00.0
  0.0   0.00.0   0.0   0.953354  0.453831   0.991583
  0.0   0.00.0   0.0   0.159975  0.116518   0.355275
  0.0   0.00.0   0.0   0.791447  0.0104295  0.151609



Re: [julia-users] Re: Converting AbstractString to char with AWS julia v0.3 - 0.4

2015-05-29 Thread Patrick O'Leary
Can you find the relevant file in /etc/apt/sources.list.d/ and post its 
contents? (It'll be the one with julia in the name.)

On Thursday, May 28, 2015 at 4:14:57 PM UTC-5, Andrew B. Martin wrote:

 I'm using Ubuntu 14.04.1.

 As far as I can remember, I copied and pasted from the documentation I 
 just linked to for juliareleases.

 I just ran sudo apt-get remove julia, and then reinstalled, definitely 
 following the juliareleases instructions, and wound up with julia Version 
 0.4.0-dev+5043 (2015-05-27 15:36 UTC), again.



Re: [julia-users] Re: Converting AbstractString to char with AWS julia v0.3 - 0.4

2015-05-29 Thread Patrick O'Leary
On Friday, May 29, 2015 at 9:22:23 AM UTC-5, Andrew B. Martin wrote:

 When I would sudo apt-get install julia before, I guess apt-get would 
 default to julianightlies over juliareleases.


The nightly has a newer version number, so apt's resolver will go that way.

If you want to be able to switch between them, you can keep both repos in 
the list, then select manually which version to install, using something 
like aptitude. (Aptitude is pretty much the first thing I install on any 
Debian-based system.) 


[julia-users] Re: Teaching Julia to an 8 year old (and a 12 year old)

2015-05-07 Thread Patrick O'Leary
On Thursday, May 7, 2015 at 7:53:08 AM UTC-5, Alan Edelman wrote:

 My children have been watching me play with julia for years now.
 They are now 12 and 14.  No question that manipulate is the way to start.
 (and by the way that's true for adults too)

 You can use manipulate with anything, just a matter of imagination, I'd 
 think.


Scott, some context: manipulate is a part of Interact.jl 
(https://github.com/JuliaLang/Interact.jl), and designed to integrate with 
the IJulia Notebook.


[julia-users] Re: scope of the expression (made with Expr)

2015-05-06 Thread Patrick O'Leary
Your Exprs are sad because they're being passed to `eval`. Exprs dream only 
of being spliced into the AST by a macro and compiled in context!

(I'm assuming that your actual need is more sophisticated than this. If so, 
you probably want to use a proper macro for what you're trying to do--see 
the metaprogramming section of the user manual 
http://julia.readthedocs.org/en/latest/manual/metaprogramming/, and pay 
particular attention to the role of `esc()`. If you provide a little more 
detail at a higher level on what you're trying to solve, there might be an 
even simpler solution.)
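
To sketch the macro version of what's quoted below (untested, but it shows the 
role of `esc()`):

    # splice the expression into the caller's scope instead of eval'ing it
    macro funky(op)
        esc(Expr(:call, op, :x, :y))   # builds e.g. :(x + y) and escapes it
    end

    funfun(x, y) = @funky(+)   # expands to x + y using funfun's own arguments
    funfun(4, 5)               # returns 9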

Patrick

On Wednesday, May 6, 2015 at 7:58:10 PM UTC-5, Evan Pu wrote:

 Hi, just a quick question on the scope/environment of an expression.

 I am using the Expr construct inside a function to create an expression, 
 and I would then want to use this expression in various different places. 
 Here's a short example

 First we have a simple function funky that takes in a function ff, and 
 applies it to two symbolic arguments :x and :y
 julia> function funky(ff::Function)
  Expr(:call, ff, :x, :y)
end
 funky (generic function with 1 method)


 We define x and y to be 1 and 2 respectively
 julia> x,y = 1,2
 (1,2)


 No surprise here, when funky is passed with the function +, it adds x 
 and y together
 julia> eval(funky(+))
 3


 However, now I want to use this expression with different bindings for x 
 and y, I attempt the following
 julia> function funfun(x, y)
  eval(funky(+))
end
 funfun (generic function with 1 method)


 Which doesn't seem to work. Here the result is sitll 3, presumably it is 
 using the earlier bindings of x=1, y=2
 julia> funfun(4,5)
 3

 I would really liked the last result to be 9.


 So far it seems the expression uses static scoping and it's looking up the 
 binding for :x and :y in the frame generated by calling funky, and not 
 finding it there, it looks up :x and :y in the 
 global frame, which is 1 and 2.
 Is there a way to bind the symbols in an expression to different things? 
 is there like an environment dictionary where we can mess with, or is 
 there some clean way of doing it?

 Much much much appreciated!!!
 --evan



[julia-users] Re: ANN: LLLplus: lattice reduction, closest vector point, VBLAST

2015-05-05 Thread Patrick O'Leary
On Tuesday, May 5, 2015 at 9:06:50 PM UTC-5, Christian Peel wrote:

 I have a question for the BLAS gurus:  I can use the BLAS function 
B[:,lx] = axpy!(-mu, B[:,k], B[:,lx])
 to accelerate the code
   B[:,lx]   = B[:,lx]   - mu * B[:,k] 
 but what I wanted to do was simply
   axpy!(-mu, B[:,k], B[:,lx])
 I guess that there is some pass-by-value problem that makes the in-place 
 nature of axpy! not work on columns of a matrix.  Any suggestions for 
 improving this?  See line  47 of 
 https://github.com/christianpeel/LLLplus.jl/blob/master/src/lll.jl for 
 the original code.


The slice operation creates a temporary array, so the copy is mutated, then 
thrown away. I'm not sure if ArrayViews (from ArrayViews.jl) will work with 
axpy!(), but you might want to check. 
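
Something like this is what I'd try first (an untested sketch--whether BLAS 
dispatch accepts the view type is exactly the thing to verify):

    using ArrayViews
    import Base.LinAlg.BLAS: axpy!

    B = rand(4, 3); mu = 0.5; k = 1; lx = 2   # stand-ins for the variables in lll.jl

    # intended: update column lx of B in place, with no temporary column copies
    axpy!(-mu, view(B, :, k), view(B, :, lx))

If the views don't dispatch to the BLAS method, an explicit loop also avoids 
the temporaries:

    for i in 1:size(B, 1)
        B[i, lx] -= mu * B[i, k]
    end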


[julia-users] Re: Julia v0.3.8

2015-05-04 Thread Patrick O'Leary
On Monday, May 4, 2015 at 7:28:28 AM UTC-5, Sisyphuss wrote:

 Thanks for your work!

 But why after I build it, it shows 0.3.9-pre+6?


If you are building from a git checkout directly from the release-0.3 
branch, there are commits made immediately after the tag--for instance, to 
increment the version number. There have also been several new backports 
since then.

If you wish to build exactly 0.3.8, you will need to check out its git tag 
first (git checkout v0.3.8), or download the source tarball.

Patrick


[julia-users] Is there a way to get a printf() or sprintf() function in Julia?

2015-05-03 Thread Patrick O'Leary
There are a few issues that get a bit into why this is harder than it looks. 
I'll start you off at https://github.com/JuliaLang/julia/issues/5866, and you 
can see where things stand now over at 
https://github.com/JuliaLang/julia/issues/10610. If you search around (maybe 
here, maybe the dev list, don't recall exactly) I think there's at least one 
mailing list thread as well that has further discussion.

[julia-users] Space sensitive macro parsing in v0.4

2015-05-03 Thread Patrick O'Leary
https://github.com/JuliaLang/julia/issues/4233

[julia-users] Re: Comparisons Two Same Code in Julia But some difference results

2015-05-01 Thread Patrick O'Leary
That post is from 2013. Output from @time has gotten more detailed since 
then to include the note about GC when it occurs.

Not sure about the performance difference--comparing across both different 
machines and Julia versions is going to be tricky.

On Friday, May 1, 2015 at 2:10:37 PM UTC-5, Kenan KARAGÜL wrote:

 Hi everybody,

 I read an article Writing Type-Stable Code in Julia from John Myles 
 White in this website:

 http://www.johnmyleswhite.com/notebook/2013/12/06/writing-type-stable-code-in-julia/

 I have got two different results.

 1) GC time is very high, John's has zero.
 2) John's run time 50x faster, my run 5x faster.

 Code run screenshoot in here.
 Thank you very much your contributions.
 Kenan






Re: [julia-users] Re: Performance variability - can we expect Julia to be the fastest (best) language?

2015-05-01 Thread Patrick O'Leary
On Friday, May 1, 2015 at 3:25:50 AM UTC-5, Steven Sagaert wrote:

 I think the performance comparisons between Julia  Python are flawed. 
 They seem to be between standard Python  Julia but since Julia is all 
 about scientific programming it really should be between SciPi  Julia. 
 Since SciPi uses much of the same underlying libs in Fortran/C the 
 performance gap will be much smaller and to be really fair it should be 
 between numba compiled SciPi code  julia. I suspect the performance will 
 be very close then (and close to C performance).

 Similarly the standard benchmark (on the opening page of julia website) 
 between R  julia is also flawed because it takes the best case scenario 
 for julia (loops  mutable datastructures)  the worst case scenario for R. 
 When the same R program is rewritten in vectorised style it beat julia see 
 https://matloff.wordpress.com/2014/05/21/r-beats-python-r-beats-julia-anyone-else-wanna-challenge-r/
 .


All benchmarks are flawed in that sense--a single benchmark can't tell you 
everything. The Julia performance benchmarks test algorithms expressed in 
the languages themselves; they are not a test of foreign-function 
interfaces or BLAS implementations, so they don't measure that. 
This has been discussed at length--as one example, see 
https://github.com/JuliaLang/julia/issues/2412.


Re: [julia-users] Re: Newbie help... First implementation of 3D heat equation solver VERY slow in Julia

2015-04-30 Thread Patrick O'Leary
On Thursday, April 30, 2015 at 11:29:15 AM UTC-5, Ángel de Vicente wrote:

 Angel de Vicente angel.vicente.garr...@gmail.com writes: 

  Viral Shah vi...@mayin.org writes: 
  You may see some better performance with julia 0.4-dev. The other 
  thing to do that is easy is to start julia with the -O option that 
  enables some more optimizations in the JIT, that may or may not help. 
  
  thanks for the tip. the -O option works only in 0.4, right? 

 sorry about the repeated messages (was configuring access to julia-users 
 from a non-gmail account, thought it was not working and resent it, 
 while apparently I just had to wait a few hours for the message to get 
 through) 


Don't worry about it; I didn't realize that another version had been posted 
when I approved this message. But that email address should be cleared to 
post now. Sorry!


[julia-users] Re: Help to optimize a loop through a list

2015-04-28 Thread Patrick O'Leary
On Tuesday, April 28, 2015 at 11:17:55 AM UTC-5, Ronan Chagas wrote:

 Sorry, my mistake. Every problem is gone when I change

 nf::Integer

 to

 nf::Int64

 in type MGEOStructure.

 I didn't know that such thing would affect the performance this much...

 Sorry about that,
 Ronan


No problem. For future reference (or others coming to this thread in the 
future): 
http://julia.readthedocs.org/en/latest/manual/performance-tips/#avoid-containers-with-abstract-type-parameters

It is somewhat confusing that Integer is an abstract type. If you're not 
sure, you can check with the `isleaftype()` method.

help?> isleaftype
search: isleaftype

Base.isleaftype(T)

   Determine whether T is a concrete type that can have instances,
   meaning its only subtypes are itself and None (but T itself
   is not None).

julia> isleaftype(Integer)
false

julia> isleaftype(Int)
true


Re: [julia-users] Yet Another String Concatenation Thread (was: Re: Naming convention)

2015-04-28 Thread Patrick O'Leary
On Tuesday, April 28, 2015 at 11:56:29 AM UTC-5, Scott Jones wrote:

 People who might realize, after becoming acquainted with Julia, for 
 general computing, or maybe for some light usage of the math packages, 
 might much rather have understandable names available, so they don't always 
 have to run to the manual...
 With decent code completion in your editor, I don't think it takes any 
 more typing...


There's a tradeoff. Reading is more common than writing--which at first 
makes long names sound appealing. But long names can also obscure 
algorithms. So you want names to be long enough to be unambiguous, but 
short enough that code can look like the algorithm you're implementing. 
Support for Unicode in identifiers is huge in that respect, and it is nice 
to write a (non-CompSci, in my case) algorithm in Julia that looks 
remarkably like what's in the textbook. And someone else working in my 
domain--the people who are reviewing and modifying my code--can very 
quickly grok that.

If long names were unambiguously better, no one would pick on Java.
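
As a small illustration of the Unicode point (made-up names, not from my 
codebase):

    # a textbook-style update step, written with the textbook's own symbols
    σ(x) = 1 / (1 + exp(-x))
    α  = 0.1      # step size
    ∇J = 0.42     # pretend gradient value
    θ  = 1.0
    θ  = σ(θ - α * ∇J)   # reads much like the textbook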


Re: [julia-users] Re: Newbie help... First implementation of 3D heat equation solver VERY slow in Julia

2015-04-28 Thread Patrick O'Leary
On Tuesday, April 28, 2015 at 1:38:02 PM UTC-5, Ángel de Vicente wrote:

 Fortran code: http://pastebin.com/nHn44fBa
 Julia code:http://pastebin.com/Q8uc0maL


We typically share snippets of code using https://gist.github.com/. It 
provides syntax highlighting for Julia code, integrated comments, and a 
number of other helpful features.

Patrick


Re: [julia-users] Yet Another String Concatenation Thread (was: Re: Naming convention)

2015-04-27 Thread Patrick O'Leary
On Monday, April 27, 2015 at 2:36:40 PM UTC-5, François Fayard wrote:

 Ok thanks. I did not think about normal. And my background is mathematics 
 (and I don't want to know Matlab ;-) ). Imagine how puzzling it could be 
 for many people.

 It totally violates the Style Guide which claims: conciseness is 
 valued, but avoid abbreviation (indexin() rather than indxin()) as it 
 becomes difficult to remember whether and how particular words are 
 abbreviated.


The chronology goes more like this: we had a method named `sprandn`, and then 
someone wrote a style guide which contradicts it. As you correctly 
guessed/looked up, this name comes from the MATLAB branch of the Julia family 
tree.

I suspect sparse matrix users will grab pitchforks if you force them to 
prepend the full word sparse to every one of their method names. (I live 
in a dense matrix universe and have no dog in that fight.)


[julia-users] Re: Ode solver thought, numerical recipes

2015-04-27 Thread Patrick O'Leary
On Monday, April 27, 2015 at 2:52:26 PM UTC-5, Alex wrote:

 On Monday, 27 April 2015 21:05:33 UTC+2, François Fayard  wrote: 
  If one implements y = y + delta_t * dy_dt in the code with a Vector, you 
 create a lot of heap arrays that has a huge cost (It seems that ODE.jl is 
 doing that by the way). So you are forced to loop over the vector elements, 
 which does not look really good. How about implementing a 
 add_linear_combination!(y, delta_t, dy_dt) which would make things easier 
 to extend for any kind of Vector. This is exactly what people from odeint 
 have been doing and it looks like a nice idea. If you want to go crazy you 
 can event implement a MPI vector whose memory is scattered around different 
 nodes. Then you just implement the add_linear_combination for your array, 
 and you are done. 

 The problem is that `y = y + delta_t * dy_dt` currently makes two copies, 
 as you have observed. There is an open discussion about in-place assignment 
 operators [1], which would help. There is also Devectorize.jl [2]. However, 
 I agree that having something like add!(y, x, beta) would be nice and I 
 think this also came up in some discussion. Funnily enough we just recently 
 discussed this strategy for ODE.jl [3] ... 


There is 
http://julia.readthedocs.org/en/latest/stdlib/linalg/#Base.LinAlg.BLAS.axpy! 
if you're using BLAS-compatible array types.
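
For example, the update from the quoted code could be done in place like this 
(a sketch):

    import Base.LinAlg.BLAS: axpy!

    y       = zeros(1000)
    dy_dt   = rand(1000)
    delta_t = 0.01

    axpy!(delta_t, dy_dt, y)   # y <- y + delta_t * dy_dt, no temporary arrays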


[julia-users] Re: constant-type global

2015-04-27 Thread Patrick O'Leary
On Monday, April 27, 2015 at 7:32:59 AM UTC-5, Sisyphuss wrote:

 In the documentation:
 Currently, type declarations cannot be used in global scope, e.g. in the 
 REPL, since Julia does not yet have constant-type globals. 
 But we already have 
 ```
 const pi = 3.14
 ```
 Isn't it a constant-type global?


It is not just a constant-type global, though. It is also a constant-value 
global. A type-declared global variable would need to relax that 
restriction, which is not something that is currently supported. See also 
https://github.com/JuliaLang/julia/issues/964 (it's already linked in 
#8870).
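
To make the distinction concrete (a sketch):

    const pi2 = 6.28   # constant binding: the value (and therefore the type) is fixed
    # pi2 = 6.30       # reassigning a const is at best a warning; the binding is meant to stay put

    # A constant-type global--fixed type, changeable value--is the thing that
    # is not yet supported; that is what issue #964 is about.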


Re: [julia-users] Yet Another String Concatenation Thread (was: Re: Naming convention)

2015-04-27 Thread Patrick O'Leary
On Monday, April 27, 2015 at 5:21:27 PM UTC-5, Scott Jones wrote:

 Why can't we have our cake and eat it too?

 I'd suggest that all of these methods be given maximally understandable 
 names...
 such as sparse_random_normal for sprandn.
 Can't you then simply define sprandn as an alias for sparse_random_normal?
 Would there be maybe a to have only the long names exported normally, and 
 then setting up the aliases if you want them...


I'm not sure what that buys you if everyone uses the aliases (the long name 
goes unused). Or if anyone uses the aliases (you have to realize these are 
both the same method). Or if no one uses the aliases (then why have them?).


Re: [julia-users] Yet Another String Concatenation Thread (was: Re: Naming convention)

2015-04-27 Thread Patrick O'Leary
On Monday, April 27, 2015 at 11:19:31 AM UTC-5, François Fayard wrote:

 I just found that the project Coq is using ++ for string concatenation. It 
 has the advantage of not overloading + and still be similar to Python's +. 
 What do you think ?


(And Haskell, as discussed in innumerable prior conversations on this 
topic.)

Should we start a julia-stringconcat mailing list? This topic generates a 
lot of heat, clearly, and has been derailing a number of otherwise 
interesting threads, and I would like to read the !stringconcatenation (or 
maybe !string_concatenation, or !strcat...I'm trying to bring this back 
around to the original subject...) parts.

Not trying to pick on you specifically; I'd start a new thread but it would 
inevitably become another discussion on concatenation syntax. (This might 
be the Julia discussion corollary to the Second Law of Julia Type Dynamics 
I expressed way back in 
https://github.com/JuliaLang/julia/issues/524#issuecomment-4409760)

Alright, got that off my chest. Feeling better now.

Did Python have this much trouble when they introduced ''.join([...])?


Re: [julia-users] Yet Another String Concatenation Thread (was: Re: Naming convention)

2015-04-27 Thread Patrick O'Leary
See also https://github.com/JuliaLang/julia/issues/11030

(They came up with julia-infix-operator-debates for the alternative mailing 
list, which is a better suggestion.)

On Monday, April 27, 2015 at 11:37:34 AM UTC-5, Patrick O'Leary wrote:

 On Monday, April 27, 2015 at 11:19:31 AM UTC-5, François Fayard wrote:

 I just found that the project Coq is using ++ for string concatenation. 
 It has the advantage of not overloading + and still be similar to Python's 
 +. What do you think ?


 (And Haskell, as discussed in innumerable prior conversations on this 
 topic.)

 Should we start a julia-stringconcat mailing list? This topic generates a 
 lot of heat, clearly, and has been derailing a number of otherwise 
 interesting threads, and I would like to read the !stringconcatenation (or 
 maybe !string_concatenation, or !strcat...I'm trying to bring this back 
 around to the original subject...) parts.

 Not trying to pick on you specifically; I'd start a new thread but it 
 would inevitably become another discussion on concatenation syntax. (This 
 might be the Julia discussion corollary to the Second Law of Julia Type 
 Dynamics I expressed way back in 
 https://github.com/JuliaLang/julia/issues/524#issuecomment-4409760)

 Alright, got that off my chest. Feeling better now.

 Did Python have this much trouble when they introduced ''.join([...])?



Re: [julia-users] Yet Another String Concatenation Thread (was: Re: Naming convention)

2015-04-27 Thread Patrick O'Leary
On Monday, April 27, 2015 at 2:18:50 PM UTC-5, François Fayard wrote:

 Back to the original discussion, I just came across sprandn in another 
 thread. How the hell do you want someone to have a feeling of this 
 function? I still have no idea what the n is for.


Providing information only--random samplers with an n suffix mean the 
sample is from a normal distribution. Breaking it down:

sp prefix: this is a method for sparse matrices

rand: that produces random samples

n: from a normal distribution 
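
For example:

    A = sprandn(4, 4, 0.25)   # 4x4 sparse matrix, roughly 25% nonzeros, values drawn from N(0,1)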


[julia-users] Re: Custom Array type

2015-04-25 Thread Patrick O'Leary
On Saturday, April 25, 2015 at 12:05:04 PM UTC-5, Marcus Appelros wrote:

 Which is exactly what should be possible to avoid, if we anyhow have to 
 define all the functions what is the meaning in descending from 
 AbstractArray? The usefulness of having an abstract Component is to make 
 concrete instances that retain all the abstract functionality.


There is no generic way to implement all these methods, though. In a 
traditional object-oriented environment, these would be abstract 
methods--define these and then you can get more things for free.

There is a generic implementation of `length()` you can get if you define 
`size()`:

length(t::AbstractArray) = prod(size(t))::Int

I believe you get `eltype()` and `ndims()` for free, too, looking at the 
definitions in abstractarray.jl.
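
As a rough sketch (Squares is a made-up type, just to show which methods you 
define yourself and which you inherit from the generic fallbacks):

immutable Squares <: AbstractArray{Int,1}
    n::Int
end

Base.size(s::Squares) = (s.n,)
Base.getindex(s::Squares, i::Int) = i*i

s = Squares(5)
length(s)   # 5, via the generic prod(size(s)) fallback
eltype(s)   # Int, from the AbstractArray{Int,1} declaration
ndims(s)    # 1
s[3]        # 9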


Re: [julia-users] Float32() or float32() ?

2015-04-24 Thread Patrick O'Leary
The master branch of the git repository is currently version 0.4-dev, which 
is in an unstable development phase. The relevant downloads are at the 
bottom of http://julialang.org/downloads/ under Nightly Builds.

On Friday, April 24, 2015 at 5:47:37 AM UTC-5, Sisyphuss wrote:

 Thanks Tomas!

 By the way, I built Julia from the source, and I got the version 0.3. Do 
 you know how I can get the version 0.4?



 On Friday, April 24, 2015 at 12:21:17 PM UTC+2, Tomas Lycken wrote:

 To be concise: Yes, and yes :)

 Instead of `float32(x)` and the like, 0.4 uses constructor methods 
 (`Float32(x)` returns a `Float32`, just as `Foo(x)` returns a `Foo`...).

 // T

 On Friday, April 24, 2015 at 11:40:02 AM UTC+2, Sisyphuss wrote:

 I mean is there a syntax change from version 0.3 to version 0.4?

 Is float32()-like minuscule conversion going to be deprecated?



 On Friday, April 24, 2015 at 9:58:18 AM UTC+2, Tim Holy wrote:

 I'm not sure what your question is, but the documentation is correct in 
 both 
 cases. You can also use the Compat package, which allows you to write 
 x = @compat Float32(y) 
 even on julia 0.3. 

 --Tim 

 On Friday, April 24, 2015 12:35:01 AM Sisyphuss wrote: 
  To convert a number to the type Float32, 
  
  In 0.3 doc: float32() 
  In 0.4 doc: Float32() 



[julia-users] Re: Macro with varargs

2015-04-24 Thread Patrick O'Leary
On Friday, April 24, 2015 at 2:59:01 AM UTC-5, Kuba Roth wrote:

 Thanks, 
 So in order to create the quote/end block you use:
   newxpr = Expr(:block)
 And the newxpr.args is the array field of the QuoteNode which stores each 
 expression?


Yeah, I think there might be an easier way to do it, but I couldn't think 
of it at the time :D The main goal was to introduce the macroexpand() 
function, which I hope you find useful!

Patrick


[julia-users] Re: Help to optimize a loop through a list

2015-04-24 Thread Patrick O'Leary
It's helpful if you can post the full code; gist.github.com is a good place 
to drop snippets.

From what I can see here, there are two things:

(1) No idea if this is in global scope. If so, that's a problem.

(2) push!() will grow and allocate as needed. It does overallocate so you 
won't get a new allocation on every single push, but if you know how big 
the final result will be, you can use the sizehint() method to avoid the 
repeated reallocations.
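
For example, a minimal sketch (assuming you have a rough upper bound on the 
frontier size; 10000 and `candidates` here are just placeholders for your real 
numbers and loop):

paretoFrontier = ParetoPoint[]
sizehint(paretoFrontier, 10000)   # reserve capacity up front (spelled sizehint! on 0.4)
for candidatePoint in candidates
    push!(paretoFrontier, candidatePoint)   # no reallocation until the reserved capacity is exceeded
end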

Please read the Performance Tips page for a longer description of (1) and 
other information:
http://julia.readthedocs.org/en/latest/manual/performance-tips/


On Friday, April 24, 2015 at 7:47:38 AM UTC-5, Ronan Chagas wrote:

 Hi guys!

 I am coding MGEO algorithm into a Julia module.
 MGEO is a multiobjective evolutionary algorithm that was proposed by a 
 researcher at my institution.

 I have already a version of MGEO in C++ (
 https://github.com/ronisbr/mgeocpp).

 The problem is that Julia version is very slow compared to C++.
 When I tried a dummy example, it took 0.6s in C++ and 60s in Julia.

 After some analysis, I realized that the problem is the loop through a 
 list.
 The algorithm store a list of points (the Pareto frontier) and for every 
 iteration the algorithm must go through every point in this list comparing 
 each one with the new candidate to be at the frontier.
 The problem is that, for this dummy example, the process is repeated 
 128,000 times and the number of points in the frontier at the end is 6,000+.

 Each point in the list is an instance of this type:

 type ParetoPoint
 # Design variables.
 vars::Array{Float64,1}
 # Objective functions.
 f::Array{Float64, 1}
 end

 I am creating the list (the Pareto frontier) as follows

 paretoFrontier = ParetoPoint[]

 and pushing new points using

 push!(paretoFrontier, candidatePoint)

 in which candidatePoint is an instance of ParetoPoint.

 Can anyone help me to optimize this code?

 Thanks,
 Ronan



Re: [julia-users] Float32() or float32() ?

2015-04-24 Thread Patrick O'Leary
It's fine to play with, and there are definitely worthwhile improvements, 
just want to be clear that there's still a fair number of impactful changes 
being made, and bugs to be squished.

On Friday, April 24, 2015 at 10:49:38 AM UTC-5, Sisyphuss wrote:

 Thanks, Patrick !

 Frightened by the word "unstable", I do not dare to use it anymore. 
 Expecting the version 0.4 to be released soon! 

 On Fri, Apr 24, 2015 at 3:13 PM, Patrick O'Leary patrick.ole...@gmail.com
  wrote:

 The master branch of the git repository is currently version 0.4-dev, 
 which is in an unstable development phase. The relevant downloads are at 
 the bottom of http://julialang.org/downloads/ under Nightly Builds.


 On Friday, April 24, 2015 at 5:47:37 AM UTC-5, Sisyphuss wrote:

 Thanks Tomas!

 By the way, I built Julia from the source, and I got the version 0.3. Do 
 you know how I can get the version 0.4?



 On Friday, April 24, 2015 at 12:21:17 PM UTC+2, Tomas Lycken wrote:

 To be concise: Yes, and yes :)

 Instead of `float32(x)` and the like, 0.4 uses constructor methods 
 (`Float32(x)` returns a `Float32`, just as `Foo(x)` returns a `Foo`...).

 // T

 On Friday, April 24, 2015 at 11:40:02 AM UTC+2, Sisyphuss wrote:

 I mean is there a syntax change from version 0.3 to version 0.4?

 Is float32()-like minuscule conversion going to be deprecated?



 On Friday, April 24, 2015 at 9:58:18 AM UTC+2, Tim Holy wrote:

 I'm not sure what your question is, but the documentation is correct 
 in both 
 cases. You can also use the Compat package, which allows you to write 
 x = @compat Float32(y) 
 even on julia 0.3. 

 --Tim 

 On Friday, April 24, 2015 12:35:01 AM Sisyphuss wrote: 
  To convert a number to the type Float32, 
  
  In 0.3 doc: float32() 
  In 0.4 doc: Float32() 




Re: [julia-users] in-place fn in a higher order fn

2015-04-23 Thread Patrick O'Leary
It's part of #3440, the compiler optimization metabug: function-valued 
argument inlining

https://github.com/JuliaLang/julia/issues/3440

On Thursday, April 23, 2015 at 9:34:48 AM UTC-5, Mauro wrote:

 Thanks!  In that case, I'll file an issue then to get this noted.  Also, 
 I think there is no (general) issue on the bad performance of higher 
 order functions.  Should I file that too? 

 On Thu, 2015-04-23 at 15:52, Jameson Nash vtjn...@gmail.com wrote: 
  The short answer is that there is a certain set of optimizations that 
 have 
  been implemented in Julia, but still a considerable set that has not 
 been 
  implemented. This falls into the category of optimizations that have not 
  been implemented. Pull requests are always welcome (although I do not 
  recommend this one as a good beginner / up-for-grabs issue). 
  
  On Thu, Apr 23, 2015 at 9:18 AM Mauro mauro...@runbox.com wrote: 
  
  It is well know that Julia struggles with type inference in higher 
 order 
  functions.  This usually leads to slow code and memory allocations. 
  There are a few hacks to work around this.  Anyway, the question I have 
  is: Why can't Julia do better with in-place functions? 
  
  In short, a higher-order function like this: 
  
  function f(fn!,ar) 
  for i=1:n 
  fn!(ar, i) # fn! updates ar[i] somehow, returns nothing 
  nothing# to make sure output of f is discarded 
  end 
  end 
  
  has almost as bad a performance (runtime and allocation-wise) as 
  
  function g(fn,ar) 
  for i=1:n 
  ar[i] = fn(ar[i]) 
  end 
  end 
  
  A in-depth, ready to run example is here: 
  https://gist.github.com/mauro3/f17da10247b0bad96f1a 
  Including output of @code_warntype. 
  
  So, why is Julia allocating memory when running f?  Nothing of f gets 
  assigned to anything. 
  
  Would this be something which is fixable more easily than the whole of 
  the higher-order performance issues?  If so, is there an issue for 
 this? 
  
  Having good in-place higher order functions would go a long way with 
  numerical computations.  Thanks! 
  



[julia-users] Re: Macro with varargs

2015-04-23 Thread Patrick O'Leary
On Thursday, April 23, 2015 at 2:36:45 PM UTC-5, Kuba Roth wrote:

 This is my first  time writing macros in Julia. I've read related docs but 
 could not find an example which works with the arbitrary number of 
 arguments. 
 So in my example below the args... works correctly with string literals 
 but for the passed variables it returns their names and not the values.


Here's the result of the last thing you called (note that I don't even have 
testB and testC defined!)

julia> macroexpand(:(@echo testB testC))
:(for #6#x = (:testB,:testC) # line 3:
print(#6#x, " ")
end)

What ends up in `args` is the argument tuple to the macro. Typically, you 
wouldn't process that in the final output--otherwise you could just use a 
function! Instead, you'd splice each argument individually (`$(args[1])`, 
`$(args[2])`, etc.) using a loop in the macro body, with each element of 
the loop emitting more code, then gluing the pieces together at the end.

Style notes: Typically, no space between function/macro name and formal 
arguments list. Multiline expressions are easier to read in `quote`/`end` 
blocks.

Anyways, here's one way to do sort of what you want in a way that requires 
a macro (though I still wouldn't use one for this! Didactic purposes only!):

macro unrolled_echo(args...)
newxpr = Expr(:block) # empty block to hold multiple statements
append!(newxpr.args, [:(print($arg, " ")) for arg in args]) # the 
arguments to the :block node are a list of Exprs
newxpr # return the constructed expression
end


[julia-users] Re: susceptance matrix

2015-04-23 Thread Patrick O'Leary
On Thursday, April 23, 2015 at 6:51:05 AM UTC-5, Michela Di Lullo wrote:

 I'm trying to make it but it's not working because of the indexes. I don't 
 know how to declare the parameter branch_x indexed by (n,b_from,b_to).


I'm not sure what this indexing expression is supposed to do; branch_x is 
defined as a vector so providing three arguments doesn't make sense; each 
indexing argument indexes along a dimension of the array, and branch_x is 
one dimensional.

We might be missing better documentation of indexing expressions; all I can 
find right now is 
http://julia.readthedocs.org/en/release-0.3/stdlib/arrays/#indexing-assignment-and-concatenation;
 
getindex(X, ...) is the desugared form of X[...].

It looks like you might be trying to extract a range, in which case you can 
construct it with colon:

branch_x[b_from:b_to]

but that throws n away...if n is supposed to be a stride, then

branch_x[b_from:n:b_to]

...but without understanding the AMPL expression, I'm just stabbing 
possible answers at you.

(If someone who understands AMPL comes along, they might be able to help 
better/faster.) 


[julia-users] Re: Help understanding ASCIIString vs String

2015-04-22 Thread Patrick O'Leary
(The crusade continues)

Never fear though, this doesn't mean you have to write more code! Julia 
supports the use of type variables to express generics. So in your case, 
instead of:

function func(a::Params, b::String, c::Dict{String, Array{Int, 1}}, 
d::Dict{String, Array{Int, 1}})
  ...
end

which has the aforementioned invariance problem, you can write

function func{T<:String}(a::Params, b::T, c::Dict{T, Array{Int, 1}}, 
d::Dict{T, Array{Int, 1}})
  ...
end

which defines a family of methods for any subtype of String. These methods 
are called, in Julia terms, parametric methods, and are discussed in the 
manual here: 
http://docs.julialang.org/en/release-0.3/manual/methods/#parametric-methods
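
A quick sketch of how it then dispatches (using an empty dummy Params, since I 
don't have your actual type):

type Params end   # stand-in for your real Params type

function func{T<:String}(a::Params, b::T, c::Dict{T, Array{Int, 1}},
                         d::Dict{T, Array{Int, 1}})
    return b
end

d1 = Dict("x" => [1, 2])      # Dict{ASCIIString,Array{Int64,1}} (0.4 syntax; ["x" => [1, 2]] on 0.3)
func(Params(), "H", d1, d1)   # T binds to ASCIIString, so the call now matches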

Hope this helps,
Patrick


On Wednesday, April 22, 2015 at 8:54:12 AM UTC-5, Avik Sengupta wrote:

 So a one-line answer to this is: "julia container types are invariant". 

 Let's take this step by step:

 julia> f(x::String) = "I am $x"
 f (generic function with 2 methods)

 julia> f("abc")
 "I am abc"

 julia> g(x::Dict) = "I am a dict of type: $(typeof(x))"
 g (generic function with 1 method)

 julia> g(Dict("abc"=>1))
 "I am a dict of type: Dict{ASCIIString,Int64}"

 julia> h(x::Dict{String, Int}) = "I am a dict of {String=>Int}"
 h (generic function with 1 method)

 julia> h(Dict("abc"=>1))
 ERROR: MethodError: `h` has no method matching h(::Dict{ASCIIString,Int64})


 Basically, while an ASCIIString is a subtype of the abstract type 
 String , a Dict{ASCIIString, Int} is not a subtype of Dict{String, Int}

 See here for more: 
 http://docs.julialang.org/en/release-0.3/manual/types/?highlight=contravariant#man-parametric-types

 Regards
 -
 Avik

 On Wednesday, 22 April 2015 14:14:06 UTC+1, Test This wrote:


 I defined a function 

 function func(a::Params, b::String, c::Dict{String, Array{Int, 1}}, 
 d::Dict{String, Array{Int, 1}})
   ...
 end

 When I run the program, calling this function with func(paramvalue, "H", 
 d1, d2), I get an error saying func has no method matching
 (::Params, ::ASCIIString, ::Dict{ASCIIString,Array{Int64,1}}, 
 ::Dict{ASCIIString,Array{Int64,1}})

 The code works if I change String to ASCIIString in the function 
 definition. But I thought (and the REPL seems to agree) that 
 any value of type ASCIIString is also of type String. Then, why am I 
 getting this error. 

 Thanks in advance for your help.



[julia-users] Re: Argument Placeholder

2015-04-22 Thread Patrick O'Leary
Note that the underscore is only a convention--it is also a legal identifier.

Something I hadn't thought about before, and would be worth checking, is 
whether this could cause type stability issues. They're all dead stores, so 
they shouldn't, but I'm not sure if anyone has explicitly checked for that.
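
Back to the first point, for reference (nothing magic about the name; it's just 
an ordinary identifier that is never read):

f(_, x) = x + 1        # first argument accepted and ignored, by convention
f("anything", 3)       # 4

g(::Any, x) = x + 1    # the equivalent unnamed form: type annotation, no name
g("anything", 3)       # 4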

Re: [julia-users] Newbie question about function name(::Type syntax

2015-04-21 Thread Patrick O'Leary
I'd go with 
https://julia.readthedocs.org/en/latest/manual/types/#singleton-types ?

On Tuesday, April 21, 2015 at 12:13:03 PM UTC-5, Tamas Papp wrote:

 I think it is implied that you can do this: there are quite a few 
 examples in the manual, eg 
 https://julia.readthedocs.org/en/latest/manual/types/#value-types 

 Best, 

 Tamas 

 On Tue, Apr 21 2015, Scott Jones scott.paul.jo...@gmail.com wrote: 

  Ah, thanks for the *very* quick reply.  That’s quite useful. 
  Did I somehow miss the explanation in the documentation (of 0.4), or 
 does 
  that need to be added to the documentation of methods? 
  
  Thanks, 
  Scott 
  
  On Tuesday, April 21, 2015 at 11:41:45 AM UTC-4, Stefan Karpinski wrote: 
  
  It means that the argument doesn't get a local name but the method is 
 only 
  called if the argument in that position matches the type on the RHS of 
 the 
  :: 
  
  On Tue, Apr 21, 2015 at 11:37 AM, Scott Jones scott.pa...@gmail.com 
  javascript: wrote: 
  
  Just what does it mean, if there is a type but no formal parameter 
 name, 
  in a function definition? 
  I tried to find it in the documentation, but nothing came up... 
  
  Thanks, 
  Scott 
  
  
  



Re: [julia-users] Re: Unexpected behaviour when running parallelized code on single worker

2015-04-20 Thread Patrick O'Leary
If you didn't see, this is now fixed (by 
https://github.com/JuliaLang/julia/pull/10877). Thanks for the report!

On Wednesday, April 8, 2015 at 10:15:39 AM UTC-5, Nils Gudat wrote:

 Apologies again for being a little slow (mentally now, not in terms of 
 response time); by trying an Array you mean running the code in single-core 
 mode and using an Array instead of a SharedArray? Running it in parallel 
 with a regular Array (unsurprisingly) doesn't work. So I have:

 Single core: initializing result* as Arrays works, initializing as 
 SharedArrays gives unexpected result
 Multi-core: initializing result* as Arrays returns empty Arrays, 
 initializing as SharedArrays works

 I'm still not sure whether this is a bug or the intended behaviour of 
 SharedArrays, i.e. whether SharedArrays maybe just aren't meant to work in 
 this situation when not using multiple processes. I'm a bit reluctant to 
 file an issue simply because my understanding of the workings of 
 SharedArray is limited, so I thought I'd ask for help here first.



[julia-users] Re: Advice on vectorizing functions

2015-04-14 Thread Patrick O'Leary
On Tuesday, April 14, 2015 at 1:38:21 PM UTC-5, SixString wrote:

 Eliding the types completely results in warnings about method ambiguity.


Yes, of course--good catch. 


[julia-users] Re: Advice on vectorizing functions

2015-04-14 Thread Patrick O'Leary


On Monday, April 13, 2015 at 10:38:28 PM UTC-5, ele...@gmail.com wrote:

 Why does this version result in complaints about no matching method for 
 (::Array(Float64,1), ::Int64)?  super(Float64) is FloatingPoint, and 
 ldexp() has methods for all subtypes of FloatingPoint paired with Int.

 But Float64 <: FloatingPoint does not imply Array{Float64} <: 
 Array{FloatingPoint}, see 
 http://en.wikipedia.org/wiki/Covariance_and_contravariance_%28computer_science%29#Arrays
  
 for why.

 Cheers
 Lex


Reference in the Julia manual: 
http://julia.readthedocs.org/en/latest/manual/types/#parametric-composite-types

I'm starting on a quixotic quest to follow up on every mention of 
invariance in method signatures to show how to deal with it, because on its 
face it sounds limiting--you have to duplicate all that code? This isn't 
generic at all! I'm never going to use Julia again!

Never fear.

function ldexp!{T<:FloatingPoint}(x::Array{T}, e::Int)
...
end

This is a generic parametric method that handles the entire family of types 
which are Arrays with elements which are subtypes of FloatingPoint.

Note that you can probably elide the types completely, here--it probably 
suffices for your application to define:

function ldexp!(x, e)
...
end

And let Julia handle the type specialization for you.
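
For concreteness, a possible in-place body (just a sketch; your actual scaling 
may look different):

function ldexp!(x, e)
    for i = 1:length(x)
        x[i] = ldexp(x[i], e)   # Base.ldexp has methods for each FloatingPoint subtype
    end
    return x
end

ldexp!(rand(Float32, 4), 3)   # compiles a specialization for Vector{Float32}
ldexp!(rand(4), 3)            # and another one for Vector{Float64}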


[julia-users] Re: Unexpected behaviour when running parallelized code on single worker

2015-04-08 Thread Patrick O'Leary
I understand *why* you're using them; the question is whether they're 
broken or not. One way to figure that out is to not use them and see if you 
still get the weird behavior.

(No worries on the response speed, this is your issue after all!)

On Wednesday, April 8, 2015 at 7:57:54 AM UTC-5, Nils Gudat wrote:

 Hi Patrick, 

 Sorry for the late reply (Easter and all that)! The problem with using 
 Arrays in the code above is that they're not available for remote processes 
 to write on, so I'd have to come up with a more complicated structure of 
 passing computations and results around. The whole appeal of SharedArrays 
 to me is that they make this unnecessary and therefore lend itself 
 perfectly to facilitate easy parallelization of the kind of problems 
 economists are often facing (basically filling large Arrays with the 
 results of lots and lots of simple maximization problems). One way to get 
 around both problems would be to initialize the Array as

 nprocs() > 1 ? result1 = SharedArray(Float64, (3,3,3)) : result1 = 
 Array(Float64, (3,3,3))

 The main reason for my original post here was to see whether anyone could 
 explain why SharedArrays behave in this way when used on one process and 
 whether this was potentially a bug (or at least something that should be 
 mentioned in the docs).



Re: [julia-users] Re: Unexpected behaviour when running parallelized code on single worker

2015-04-08 Thread Patrick O'Leary
Thank you Tim for explaining that more clearly. This morning's reply was 
ENOCOFFEE :D

Please do file an issue, Nils, and thanks for investigating further.

On Wednesday, April 8, 2015 at 10:03:04 AM UTC-5, Tim Holy wrote:

 Sounds like a bug, but I think what Patrick was trying to say is that it 
 would 
 help to test it with Arrays first just to make sure there's not some 
 problem 
 with your code. Assuming that it works for Arrays but not for 
 SharedArrays, 
 then you should probably file an issue. 

 Best, 
 --Tim 

 On Wednesday, April 08, 2015 05:57:54 AM Nils Gudat wrote: 
  Hi Patrick, 
  
  Sorry for the late reply (Easter and all that)! The problem with using 
  Arrays in the code above is that they're not available for remote 
 processes 
  to write on, so I'd have to come up with a more complicated structure of 
  passing computations and results around. The whole appeal of 
 SharedArrays 
  to me is that they make this unnecessary and therefore lend itself 
  perfectly to facilitate easy parallelization of the kind of problems 
  economists are often facing (basically filling large Arrays with the 
  results of lots and lots of simple maximization problems). One way to 
 get 
  around both problems would be to initialize the Array as 
  
  nprocs() > 1 ? result1 = SharedArray(Float64, (3,3,3)) : result1 = 
  Array(Float64, (3,3,3)) 
  
  The main reason for my original post here was to see whether anyone 
 could 
  explain why SharedArrays behave in this way when used on one process and 
  whether this was potentially a bug (or at least something that should be 
  mentioned in the docs). 



[julia-users] Re: Julia on Vagrant

2015-04-06 Thread Patrick O'Leary
This hasn't been updated in a while, so it's probably just broken (I don't 
think it sees much use). `jlmake` is just an alias, defined in 
/home/vagrant/.bash_aliases during provisioning, which sets as many of the 
USE_SYSTEM_dep variables to true as it can. Dependencies have probably 
gotten shuffled around.

The provisioning script is defined as a heredoc at the beginning of the 
Vagrantfile--take a look at it. Either the installed packages 
(https://github.com/JuliaLang/julia/blob/cd653bb04ffe00cc12b6e3a732332f5d70a5873f/contrib/vagrant/Vagrantfile#L19)
 
or the list of USE_SYSTEM flags 
(https://github.com/JuliaLang/julia/blob/cd653bb04ffe00cc12b6e3a732332f5d70a5873f/contrib/vagrant/Vagrantfile#L8)
 
probably needs to be adjusted.

On Sunday, April 5, 2015 at 6:34:14 PM UTC-5, Christian Goldammer wrote:

 Hi,

 I'm currently trying to build Julia on Vagrant, with no background in 
 makefiles. I'm following the instructions on 
 https://github.com/JuliaLang/julia/tree/master/contrib/vagrant. All the 
 steps in `Vagrant up` seem to work successfully. How do I then build Julia? 
 From the documentation, it looks like I should just go to `~/julia` and 
 type `jlmake`. However, when I do this, I get

 ```
 vagrant@vagrant-ubuntu-trusty-64:~/julia$ jlmake
 : No such file or directory
 : No such file or directory
 /bin/sh: 2: ./configure: not found
 make[1]: *** [libuv/config.status] Error 127
 make: *** [julia-deps] Error 2
 ```

 Any feedback is appreciated. Thanks, Chris.



Re: [julia-users] Extreme garbage collection time

2015-04-04 Thread Patrick O'Leary
Silly me, ignoring all the commented out lines assuming they were 
comments...yes, this is almost certainly it.

On Saturday, April 4, 2015 at 3:24:50 AM UTC-5, Tim Holy wrote:

 Devectorization should never slow anything down. If it does, then you have 
 some other problem. Here, M is a global variable, and that's probably 
 what's 
 killing you. Performance tip #1: 
 http://docs.julialang.org/en/latest/manual/performance-tips/ 

 --Tim 

 On Friday, April 03, 2015 09:43:51 AM Adam Labadorf wrote: 
  Hi, 
  
  I am struggling with an issue related to garbage collection taking up 
 the 
  vast majority (99%) of compute time on a simple nested for loop. Code 
  excerpt below: 
  
  # julia version 0.3.7 
  # counts is an MxN matrix of Float64 
  # N=15000 
  # M=108 
  # h_cols and c_cols are indices \in {1:M} 
  using HypothesisTests, ArrayViews 
  ratios = Array(Float64,M) 
  function compute!(S,tot_i::Int64,i::Int64,j::Int64) 
  ratios = view(counts,:,i)./view(counts,:,j) 
  #for k=1:M 
  #  ratios[k] = counts[k,i]/counts[k,j] 
  #end 
  #ratios = counts[:,i]./counts[:,j] 
  t = UnequalVarianceTTest(ratios[h_cols],ratios[c_cols]) 
  S[tot_i] = (pvalue(t),i,j) 
  end 
  
  for i=1:N-1 
@time for j=(i+1):N 
  tot_i += 1 
  compute!(S,tot_i,i,j) 
end 
  end 
  
  The loop begins fast, output from time: 
  
  elapsed time: 1.023850054 seconds (62027220 bytes allocated) 
  elapsed time: 0.170916977 seconds (45785624 bytes allocated) 
  elapsed time: 0.171598156 seconds (45782760 bytes allocated) 
  elapsed time: 0.173866309 seconds (45779896 bytes allocated) 
  elapsed time: 0.170267172 seconds (45777032 bytes allocated) 
  elapsed time: 0.171754713 seconds (45774168 bytes allocated) 
  elapsed time: 0.170110142 seconds (45771304 bytes allocated) 
  elapsed time: 0.175199053 seconds (45768440 bytes allocated) 
  elapsed time: 0.179893161 seconds (45765576 bytes allocated) 
  elapsed time: 0.212172824 seconds (45762712 bytes allocated) 
  elapsed time: 0.252750549 seconds (45759848 bytes allocated) 
  elapsed time: 0.254874855 seconds (45756984 bytes allocated) 
  elapsed time: 0.231003319 seconds (45754120 bytes allocated) 
  elapsed time: 0.235060195 seconds (45751256 bytes allocated) 
  elapsed time: 0.235379355 seconds (45748392 bytes allocated) 
  elapsed time: 0.927622743 seconds (45746168 bytes allocated, 77.65% gc 
 time) 
  elapsed time: 0.9132931 seconds (45742664 bytes allocated, 78.35% gc 
 time) 
  
  But as soon as it starts doing gc the % time spent in gc increases almost 
  indefinitely, output from time much later: 
  
  elapsed time: 0.174122929 seconds (36239160 bytes allocated) 
  elapsed time: 18.535572658 seconds (36236168 bytes allocated, 99.22% gc 
  time) 
  elapsed time: 19.189478819 seconds (36233176 bytes allocated, 99.26% gc 
  time) 
  elapsed time: 21.812889439 seconds (36230184 bytes allocated, 99.35% gc 
  time) 
  elapsed time: 22.182467723 seconds (36227192 bytes allocated, 99.30% gc 
  time) 
  elapsed time: 0.16984 seconds (36224200 bytes allocated) 
  
  The inner loop, despite iterating over fewer and fewer indices has 
  massively increased the gc, and therefore overall, execution time. I 
 have 
  tried many things, including creating the compute function, 
 devectorizing 
  the ratios calculation (which really slowed things down), using view and 
  sub in various places, profiling with --trace-allocation=all but I can't 
  figure out why this happens or how to fix it. Doing gc for 22 seconds 
  obviously kills the performance, and since there are about 22M 
 iterations 
  this is prohibitive. Can anyone suggest something I can try to improve 
 the 
  performance here? 
  
  Thanks, 
  Adam 



Re: [julia-users] Re: macros generating macros in modules

2015-04-04 Thread Patrick O'Leary
Let me ask the question slightly differently. Without constraining yourself 
to legal Julia syntax, what would you like to go in, and what do you want 
to come out? I get the feeling there's a design here that doesn't have this 
complexity--my intuition is that a generic function can be used to drive 
this just fine (called within a macro to facilitate the final splice.) But 
that's still hard to concretely suggest. 
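
To make that a little more concrete anyway, here is a very rough sketch (all the 
names are made up, and it glosses over what your grammar rules actually look 
like). The point is that the extensible part is an ordinary generic function 
that users add methods to for their own rule types, and the macro only walks the 
block and splices in whatever the function returns:

# ordinary generic function: users extend this with their own methods
# instead of injecting functions into the macro itself
transform_rule(ex) = :(println("default handling of ", $(string(ex))))

macro grammar(name, block)           # `name` is unused in this sketch
    out = Expr(:block)
    for ex in block.args
        (isa(ex, Expr) && ex.head != :line) || continue  # skip line-number nodes
        push!(out.args, transform_rule(ex))              # the function does the real work
    end
    esc(out)
end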

On Saturday, April 4, 2015 at 9:39:57 PM UTC-5, Abe Schneider wrote:

 I don't understand how what I'm trying to do falls outside the domain of 
 transforming code. My main goal is to take a user-defined grammar and 
 create relevant code.

 More to the point, I think the general ability to generate expressions 
 based on functions/namespaces is also solidly in the domain of transforms. 
 I can imagine plenty of use-cases for language-bindings, communication, etc.

 Yes, there are many simple macros one can create, which are useful, but 
 there exists many useful things one can do with macros that can get more 
 complex.

 A

 On Saturday, April 4, 2015 at 10:11:25 PM UTC-4, Jameson wrote:

 I think the underlying issue may be that you are trying to have the macro 
 do too much, when you should instead be just doing source-code transforms 
 and preprocessing. One such example of a great use of a macro is simply:

 macro special_foo(syntax_tree)
 return quote
 special_foo($(QuoteNode(syntax_tree)))
 end
 end


 On Sat, Apr 4, 2015 at 8:05 PM Abe Schneider abe.sc...@gmail.com wrote:

 The issue I'm dealing with is that I have a macro that I want to pass a 
 list of functions that the macro can employ for parsing grammars. The macro 
 itself exists already, and it currently has its own static list of 
 functions. I'm trying to figure out a way to allow the user of the library 
 to customize it.

 For example:
 @grammar foo begin
   start = list(integer, r", ")
 end

 where `list` is a user supplied function. The issue that I keep running 
 into is that there doesn't seem to be any easy way of just passing in a 
 function to a macro. Turning an expression into a function is problematic, 
 because `eval`s are: (a) only work in the modules namespace, and (b) are 
 frowned on.

 I've thought about manually looking up the symbol name in the various 
 namespaces, but while I've found methods to list the functions, I haven't 
 found a way to do the reverse.

 Ideally, it would be nice to generate an expression with `call` that 
 took a symbol and a namespace, that way no lookups would have to be done.

 Having hit a dead end in this direction, it occurred to me that it might 
 be possible to do something like this:
 @set_parsers(list, etc)
 @grammar foo begin
 ...
 end


  where `@set_parsers` would generate the `@grammar` macro with the static 
 list it already has plus whatever else the user adds in.

 Which works, except for the whole module issue.


 Thanks,
 A


 On Saturday, April 4, 2015 at 12:41:27 PM UTC-4, Patrick O'Leary wrote:

 On Saturday, April 4, 2015 at 9:04:10 AM UTC-5, Abe Schneider wrote:

 I should start off, not entirely sure this is an okay thing to do with 
 Julia. Suppose I want to create a macro that generates another macro...


 I'm not sure whether this should work or not, but either way I'm not 
 sure how it's any more expressive than macros alone? Can you describe what 
 it is you want to do at a high level in a little more detail? There might 
 be another way. 



[julia-users] Re: macros generating macros in modules

2015-04-04 Thread Patrick O'Leary
On Saturday, April 4, 2015 at 9:04:10 AM UTC-5, Abe Schneider wrote:

 I should start off, not entirely sure this is an okay thing to do with 
 Julia. Suppose I want to create a macro that generates another macro...


I'm not sure whether this should work or not, but either way I'm not sure 
how it's any more expressive than macros alone? Can you describe what it is 
you want to do at a high level in a little more detail? There might be 
another way. 


[julia-users] Re: Unexpected behaviour when running parallelized code on single worker

2015-04-04 Thread Patrick O'Leary
Have you tried this with a normal Array instead of a SharedArray for the 
outputs? (There may or may not be other problems and/or bugs in Julia, but 
I do know that SharedArray is considered experimental.)

On Saturday, April 4, 2015 at 9:20:48 AM UTC-5, Nils Gudat wrote:

 There's something weird happening in my recently parallelized code. When 
 running it without adding some worker processes first, the results are 
 completely off and after some investigation I found that this was due to 
 assignment operations going wrong - results of computations were assigned 
 to different Arrays than the intended ones. A small working example 
 illustrating the point:

 x1 = linspace(1, 3, 3)
 x2 = linspace(1, 3, 3)
 x3 = linspace(1, 3, 3)

 function getresults(x1::Array, x2::Array, x3::Array)
   result1 = SharedArray(Float64, (3,3,3))
   result2 = similar(result1)
   result3 = similar(result1)
   
   @sync @parallel for a=1:3
 for b=1:3
   for c=1:3
 result1[a,b,c] = x1[a]*x2[b]*x3[c]
 result2[a,b,c] = sqrt(x1[a]*x2[b]*x3[c])
 result3[a,b,c] = (x1[a]*x2[b]*x3[c])^2
   end
 end
   end
   return sdata(result1), sdata(result2), sdata(result3)
 end

 (r1,r2,r3) = getresults(x1, x2, x3)

 nprocs()==CPU_CORES || addprocs(CPU_CORES-1)
 (r1_par,r2_par,r3_par) = getresults(x1, x2, x3)


 When I run this on my system  (v0.3.6), the parallelized version works as 
 intended, while running the code without adding workers first gives the 
 expected results for r1 and r3, but r2 holds the same results as r3. The 
 behaviour in my original problem was similar, the code returns three 
 Arrays, but running it without additional workers those Arrays all return 
 the same contents.

 Is there something in the @sync or @parallel macros that causes this? How 
 should a code be written to ensure that it works both with one and multiple 
 cores?



[julia-users] Re: Extreme garbage collection time

2015-04-03 Thread Patrick O'Leary
This might be an algorithm that benefits from the GC improvements in 
0.4--with the caveat that 0.4 is still a work in progress and has a variety 
of syntax changes (and possibly more to come) you might want to give that a 
try.

On Friday, April 3, 2015 at 8:22:49 PM UTC-5, Adam Labadorf wrote:

 Hi,

 I am struggling with an issue related to garbage collection taking up the 
 vast majority (99%) of compute time on a simple nested for loop. Code 
 excerpt below:

 # julia version 0.3.7
 # counts is an MxN matrix of Float64
 # N=15000
 # M=108
 # h_cols and c_cols are indices \in {1:M}
 using HypothesisTests, ArrayViews
 ratios = Array(Float64,M)
 function compute!(S,tot_i::Int64,i::Int64,j::Int64)
 ratios = view(counts,:,i)./view(counts,:,j)
 #for k=1:M
 #  ratios[k] = counts[k,i]/counts[k,j]
 #end
 #ratios = counts[:,i]./counts[:,j]
 t = UnequalVarianceTTest(ratios[h_cols],ratios[c_cols])
 S[tot_i] = (pvalue(t),i,j)
 end

 for i=1:N-1
   @time for j=(i+1):N
 tot_i += 1
 compute!(S,tot_i,i,j)
   end
 end

 The loop begins fast, output from time:

 elapsed time: 1.023850054 seconds (62027220 bytes allocated)
 elapsed time: 0.170916977 seconds (45785624 bytes allocated)
 elapsed time: 0.171598156 seconds (45782760 bytes allocated)
 elapsed time: 0.173866309 seconds (45779896 bytes allocated)
 elapsed time: 0.170267172 seconds (45777032 bytes allocated)
 elapsed time: 0.171754713 seconds (45774168 bytes allocated)
 elapsed time: 0.170110142 seconds (45771304 bytes allocated)
 elapsed time: 0.175199053 seconds (45768440 bytes allocated)
 elapsed time: 0.179893161 seconds (45765576 bytes allocated)
 elapsed time: 0.212172824 seconds (45762712 bytes allocated)
 elapsed time: 0.252750549 seconds (45759848 bytes allocated)
 elapsed time: 0.254874855 seconds (45756984 bytes allocated)
 elapsed time: 0.231003319 seconds (45754120 bytes allocated)
 elapsed time: 0.235060195 seconds (45751256 bytes allocated)
 elapsed time: 0.235379355 seconds (45748392 bytes allocated)
 elapsed time: 0.927622743 seconds (45746168 bytes allocated, 77.65% gc 
 time)
 elapsed time: 0.9132931 seconds (45742664 bytes allocated, 78.35% gc time)

 But as soon as it starts doing gc the % time spent in gc increases almost 
 indefinitely, output from time much later:

 elapsed time: 0.174122929 seconds (36239160 bytes allocated)
 elapsed time: 18.535572658 seconds (36236168 bytes allocated, 99.22% gc 
 time)
 elapsed time: 19.189478819 seconds (36233176 bytes allocated, 99.26% gc 
 time)
 elapsed time: 21.812889439 seconds (36230184 bytes allocated, 99.35% gc 
 time)
 elapsed time: 22.182467723 seconds (36227192 bytes allocated, 99.30% gc 
 time)
 elapsed time: 0.16984 seconds (36224200 bytes allocated)

 The inner loop, despite iterating over fewer and fewer indices has 
 massively increased the gc, and therefore overall, execution time. I have 
 tried many things, including creating the compute function, devectorizing 
 the ratios calculation (which really slowed things down), using view and 
 sub in various places, profiling with --trace-allocation=all but I can't 
 figure out why this happens or how to fix it. Doing gc for 22 seconds 
 obviously kills the performance, and since there are about 22M iterations 
 this is prohibitive. Can anyone suggest something I can try to improve the 
 performance here?

 Thanks,
 Adam



Re: [julia-users] Union types and contravariance, question

2015-04-02 Thread Patrick O'Leary
As a general rule, whenever you are tempted to insert an abstract type 
(including type unions) into the type parameter position of a formal method 
argument--which is to say, `x::SomeType{ThisTypeIsAbstract}`, you want to 
transform that into a type parameter of the method instead, so 
`f{T<:ThisTypeIsAbstract}(..., x::SomeType{T}, ...)`.
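
Concretely (using FloatingPoint as the abstract type):

# this only matches a Vector whose element type is *exactly* FloatingPoint,
# not Vector{Float64}, Vector{Float32}, ...
f(x::Vector{FloatingPoint}) = sum(x)

# this matches Vector{T} for every T<:FloatingPoint
f{T<:FloatingPoint}(x::Vector{T}) = sum(x)

f([1.0, 2.0, 3.0])   # dispatches to the parametric method with T = Float64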

On Thursday, April 2, 2015 at 3:37:13 PM UTC-5, Michael Francis wrote:

 Nice, that will go a long way to solving the issue I have. 

 Thanks!

 On Thursday, April 2, 2015 at 4:30:57 PM UTC-4, Patrick O'Leary wrote:

 However, you can define

 f{T<:Union(Int64, Float64)}(v::Vector{T}) = v

 to cover that case with a single method definition.

 On Thursday, April 2, 2015 at 3:28:34 PM UTC-5, Michael Francis wrote:

 Thanks, that is what I suspected. 

 On Thursday, April 2, 2015 at 4:13:31 PM UTC-4, Mauro wrote:

 Types in Julia are invariant: 

 If T1<:T2 is true 
 then A1{T1}<:A1{T2} is false (unless T1==T2). 

 Now setting T1=Float64 and T2=Union( Int64, Float64) means 
 Vector{Float64}<:Vector{Union( Int64, Float64)} is false.  Thus the no 
 method error. 

 On Thu, 2015-04-02 at 21:32, Michael Francis mdcfr...@gmail.com 
 wrote: 
  Is the following expected behavior for union types, I'm assuming yes 
 ? 
  
  v = Union( Int64,Float64 )[] 
  push!( v, 1.2) 
  push!( v, 1)   # works 
  
  f( v::Union(Int64,Float64)...) = v 
  f( [ 1.2 ]...) # Works 
  
  f( v::Vector{Union( Int64, Float64)}) = v 
  f( [ 1.2 ])# 'f' has no method matching f(::Array{Float64,1}) 
  
  At first glance I'd expect the last case to work as well. 



Re: [julia-users] Union types and contravariance, question

2015-04-02 Thread Patrick O'Leary
However, you can define

f{T<:Union(Int64, Float64)}(v::Vector{T}) = v

to cover that case with a single method definition.
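
With that definition in place, all of these should dispatch (quick sketch):

f([ 1.2 ])                          # Vector{Float64}, T = Float64
f([ 1, 2 ])                         # Vector{Int64},   T = Int64
f(Union(Int64, Float64)[ 1.2, 1 ])  # T = Union(Int64, Float64) itself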

On Thursday, April 2, 2015 at 3:28:34 PM UTC-5, Michael Francis wrote:

 Thanks, that is what I suspected. 

 On Thursday, April 2, 2015 at 4:13:31 PM UTC-4, Mauro wrote:

 Types in Julia are invariant: 

 If T1<:T2 is true 
 then A1{T1}<:A1{T2} is false (unless T1==T2). 

 Now setting T1=Float64 and T2=Union( Int64, Float64) means 
 Vector{Float64}<:Vector{Union( Int64, Float64)} is false.  Thus the no 
 method error. 

 On Thu, 2015-04-02 at 21:32, Michael Francis mdcfr...@gmail.com wrote: 
  Is the following expected behavior for union types, I'm assuming yes ? 
  
  v = Union( Int64,Float64 )[] 
  push!( v, 1.2) 
  push!( v, 1)   # works 
  
  f( v::Union(Int64,Float64)...) = v 
  f( [ 1.2 ]...) # Works 
  
  f( v::Vector{Union( Int64, Float64)}) = v 
  f( [ 1.2 ])# 'f' has no method matching f(::Array{Float64,1}) 
  
  At first glance I'd expect the last case to work as well. 



[julia-users] Re: Better way of assigning diagonal values to sparse matrices?

2015-04-02 Thread Patrick O'Leary
I suppose a diagonal matrix is the worst-case for compressed storage.

Convert to coordinate format, insert the diagonal elements, and then 
convert back to CSC?
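
Roughly along these lines, where e is your sparse matrix (a sketch, untested at 
that scale; it relies on sparse() summing duplicate entries by default, which is 
fine here because Inf plus any finite value is still Inf):

I, J, V = findnz(e)            # coordinate (COO) form of the existing entries
n = size(e, 1)
I2 = vcat(I, 1:n)              # append one entry per diagonal position
J2 = vcat(J, 1:n)
V2 = vcat(V, fill(Inf, n))
z = sparse(I2, J2, V2, n, n)   # back to CSC in one shot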

On Thursday, April 2, 2015 at 3:55:38 PM UTC-5, Seth wrote:

 The creation of the new sparse matrix (via spdiagm) doubles the amount of 
 memory, unfortunately. I had to kill it past 26 GB.


 On Thursday, April 2, 2015 at 1:36:48 PM UTC-7, Patrick O'Leary wrote:

 Perhaps try:

 z = e + spdiagm(fill(Inf, 5), 0, 5, 5)

 I can't try a matrix that large, I don't have the memory on this 
 system--I tested with 5000, though, and it is massively faster.

 On Thursday, April 2, 2015 at 3:21:37 PM UTC-5, Seth wrote:



 Consider the following:

 julia> @time e = sprandn(5,5,0.3);
 elapsed time: 95.837541848 seconds (17178 MB allocated, 0.08% gc time in 
 4 pauses with 1 full sweep)

 then:

 julia> function doinf!(e::SparseMatrixCSC)
for i = 1:size(e)[1]
e[i,i] = Inf
end
end
 doinf! (generic function with 1 method)

 julia> @time z = doinf!(e);


 ...still going after about 20 minutes. I know sparse matrix insertion is 
 expensive, but is there a better way of doing this? (I don't have the 
 storage to make this a dense matrix.) I'm thinking it might be faster to 
 write my own function to assign random number or Inf within two loops, but 
 I'm surprised that the doinf! is taking so long.



[julia-users] Re: Julia development for Arm

2015-04-02 Thread Patrick O'Leary
On Wednesday, April 1, 2015 at 12:32:16 PM UTC-5, Viral Shah wrote:

 There is no definite timeframe. Julia does start up on ARMv7 already, and 
 I can use it to do a number of computations. It passes about 90% of the 
 tests.

 What makes our ARM port tough is the lack of easy access to ARM machines, 
 that people can login to and develop. A server class ARM machine accessible 
 online would greatly help speed the port.

 -viral


Hacker News delivers for a modest fee: https://www.scaleway.com/
2 Eurocents an hour or 10 Euro/mo, 4 cores, 2 GB RAM...not sure what kind 
of cores, they do mention it's ARMv7 (so 32-bit), though. Appears to have a 
datacenter in France.


[julia-users] Re: Better way of assigning diagonal values to sparse matrices?

2015-04-02 Thread Patrick O'Leary
Perhaps try:

z = e + spdiagm(fill(Inf, 5), 0, 5, 5)

I can't try a matrix that large, I don't have the memory on this system--I 
tested with 5000, though, and it is massively faster.

On Thursday, April 2, 2015 at 3:21:37 PM UTC-5, Seth wrote:



 Consider the following:

 julia> @time e = sprandn(5,5,0.3);
 elapsed time: 95.837541848 seconds (17178 MB allocated, 0.08% gc time in 4 
 pauses with 1 full sweep)

 then:

 julia> function doinf!(e::SparseMatrixCSC)
for i = 1:size(e)[1]
e[i,i] = Inf
end
end
 doinf! (generic function with 1 method)

 julia> @time z = doinf!(e);


 ...still going after about 20 minutes. I know sparse matrix insertion is 
 expensive, but is there a better way of doing this? (I don't have the 
 storage to make this a dense matrix.) I'm thinking it might be faster to 
 write my own function to assign random number or Inf within two loops, but 
 I'm surprised that the doinf! is taking so long.



[julia-users] Re: Array construction types question

2015-04-01 Thread Patrick O'Leary
It's very helpful to note what your expected result is when asking a 
question like this--I'm not clear what isn't working as expected, here. As 
far as I can tell all the inferred types are correct, though the second one 
and the final one could be narrower.

On Wednesday, April 1, 2015 at 10:02:37 AM UTC-5, Michael Francis wrote:

 If I run the following, I get the results shown to the right (in comments); 
 it appears array construction fails to raise to the common 
 parent type under certain conditions. Is there a way round 
 this? Alternatively, where is this code implemented? 

 abstract Foo{K}
 type Wow{K,V} <: Foo{K} end 
 type Bar{K,V} <: Foo{K} end

 a = Wow{Int64, Int64}()
 b = Wow{Int64, Float64}()
 c = Bar{Int64, Int64}()
 d = Bar{Int64, String}()

 println( ** )
 println( typeof( [ a ]))  #Array{Wow{Int64,Int64},1}
 println( typeof( [ a, b ]))   #Array{Wow{K,V},1}
 println( typeof( [ a, c ]))   #Array{Foo{Int64},1}
 println( typeof( [ a, b, c ]))#Array{Foo{Int64},1}
 println( typeof( [ a, c, b ]))#Array{Foo{Int64},1}
 println( typeof( [ a, b, c, d ])) #Array{Foo{K},1}



[julia-users] Re: Help with parallelization

2015-03-20 Thread Patrick O'Leary
Try making the grids formal arguments to solveall():

function solveall(agrid, bgrid, cgrid, dgrid)
   ...
end

@time solveall(agrid, bgrid, cgrid, dgrid)

Then you should be able to switch the loop you're parallelizing over.

You probably also need a @sync somewhere to ensure all the workers are done 
before returning.
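
Putting those two together, something along these lines (sketch):

function solveall(agrid, bgrid, cgrid, dgrid)
    result = SharedArray(Float64, (length(agrid), length(bgrid),
                                   length(cgrid), length(dgrid)), pids=procs())
    @sync @parallel for a = 1:length(agrid)      # parallelize the outermost loop
        for b = 1:length(bgrid), c = 1:length(cgrid), d = 1:length(dgrid)
            result[a,b,c,d] = calculate(agrid[a], bgrid[b], cgrid[c], dgrid[d])
        end
    end
    return sdata(result)
end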

On Friday, March 20, 2015 at 11:07:00 AM UTC-5, Nils Gudat wrote:

 I'm still having problems understanding the basic concepts of 
 parallelization in Julia. It seems to me that the examples in the 
 documentation and those that I found elsewhere on the web don't really 
 reflect my usage case, so I'm wondering whether I'm approaching the problem 
 from the right angle. I've written a short piece of code that illustrates 
 what I'm trying to do; basically it's a large number of small calculations, 
 the results of which have to be stored in one large matrix.
 Here's the example:

 addprocs(3)

 agrid = linspace(1,4,4)
 bgrid = linspace(-1.05, 1.05, 30)
 cgrid = linspace(-0.1, 0.1, 40)
 dgrid = linspace(0.5, 1000, 40)

 result = SharedArray(Float64, (size(agrid,1), size(bgrid,1), 
 size(cgrid,1), size(dgrid,1)), pids=procs())

 @everywhere function calculate(a,b,c,d)
   quadgk(cos, -b*10π, c*10π)[1] + quadgk(sin, -b*10π, c*10π)[1]*d
 end

 function solveall()
   for a = 1:length(agrid)
 for b = 1:length(bgrid)
   for c = 1:length(cgrid)
 @parallel for d = 1:length(dgrid)
   result[a,b,c,d] = calculate(agrid[a], bgrid[b], cgrid[c], 
 dgrid[d])
 end
   end
 end
   end
   return result
 end

 @time solveall()

 Unfortunately, the speedup from parallelizing the inner loop isn't great 
 (going from ~9s to ~7.5s on my machine), so I'm wondering whether this is 
 actually the best way of implementing the parallelization. My original 
 idea was to somehow parallelize the outer loop, so that each processor 
 returns a 30x40x40 array, but I don't see how I can get the worker 
 processors to run the inner loops correctly.

 Any input would be greatly appreciated, as I've been trying to parallelize 
 this for a while and seem to be at a point where I'm just getting more 
 confused the harder I try.



[julia-users] Re: Help with parallelization

2015-03-20 Thread Patrick O'Leary
On Friday, March 20, 2015 at 11:48:40 AM UTC-5, Nils Gudat wrote:

 Thanks, that's a great suggestion! Writing:

 function solveall(agrid, bgrid, cgrid, dgrid)
   @sync @parallel for a = 1:length(agrid)
 ...
   end
   return result
 end

 @time solveall(agrid, bgrid, cgrid, dgrid)

 Reduces the time to ~4.3s, just about half the time of the single core 
 implementation!


It's possible that you might eke a little more out by using SharedArrays, 
but I'm not sure of all the tradeoffs. I do know in the problem I tried it 
for, I didn't derive any benefit from that approach.


[julia-users] Re: Error or supposed to happen? compile

2015-03-19 Thread Patrick O'Leary
Since string.jl is part of the bootstrap, there are things that won't be 
available yet when that part of the image is built. It looks like in this 
case STDOUT, which is the default target for println, hasn't been defined 
yet.

What are you trying to do with this print? There may be another solution.
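
If the goal is just a debug trace during bootstrap, one trick that may work is 
printing through the C runtime instead of println, since it doesn't need STDOUT 
to be set up. I believe this uses the same low-level `jl_` helper that's used 
for printing values from gdb, but I haven't re-checked it against current 
master, so treat this as a sketch:

function searchindex(s::ByteString, t::ByteString, i::Integer=1)
    # bootstrap-safe debug print; use Nothing instead of Void on 0.3
    ccall(:jl_, Void, (Any,), "searchindex: line 305")
    if length(t) == 1
        search(s, t[1], i)
    else
        searchindex(s.data, t.data, i)
    end
end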

On Thursday, March 19, 2015 at 1:53:24 PM UTC-5, Julia User wrote:

 I added a print statement to a function in Base.string.jl

 something like:

 function searchindex(s::ByteString, t::ByteString, i::Integer=1)
 println("Line 305")
 if length(t) == 1
 search(s, t[1], i)
 else
 searchindex(s.data, t.data, i)
 end
 end



 When I recompile julia I get an Error: Is that supposed to happen?

 CC ui/repl.o
 LINK usr/bin/julia
 JULIA usr/lib/julia/sys0.o
 exports.jl
 base.jl
 reflection.jl
 build_h.jl
 version_git.jl
 c.jl
 options.jl
 promotion.jl
 tuple.jl
 range.jl
 expr.jl
 error.jl
 bool.jl
 number.jl
 int.jl
 operators.jl
 pointer.jl
 refpointer.jl
 rounding.jl
 float.jl
 complex.jl
 rational.jl
 abstractarray.jl
 subarray.jl
 array.jl
 subarray2.jl
 functors.jl
 bitarray.jl
 intset.jl
 dict.jl
 set.jl
 hashing.jl
 iterator.jl
 simdloop.jl
 reduce.jl
 inference.jl
 osutils.jl
 char.jl
 ascii.jl
 utf8.jl
 utf16.jl
 utf32.jl
 iobuffer.jl
 string.jl
 utf8proc.jl
 regex.jl
 pcre.jl
 base64.jl
 io.jl
 iostream.jl
 libc.jl
 libdl.jl
 env.jl
 path.jl
 intfuncs.jl
 nullable.jl
 task.jl
 show.jl
 stream.jl
 uv_constants.jl
 socket.jl
 stat.jl
 fs.jl
 process.jl
 multimedia.jl
 grisu.jl
 file.jl
 methodshow.jl
 floatfuncs.jl
 math.jl
 float16.jl
 cartesian.jl
 multidimensional.jl
 primes.jl
 reducedim.jl
 ordering.jl
 collections.jl
 sort.jl
 version.jl
 gmp.jl
 mpfr.jl
 combinatorics.jl
 hashing2.jl
 dSFMT.jl
 random.jl
 printf.jl
 serialize.jl
 multi.jl
 managers.jl
 loading.jl
 poll.jl
 mmap.jl
 sharedarray.jl
 datafmt.jl
 deepcopy.jl
 interactiveutil.jl
 replutil.jl
 test.jl
 meta.jl
 i18n.jl
 help.jl
 Terminals.jl
 LineEdit.jl
 REPLCompletions.jl
 REPL.jl
 client.jl
 markdown/Markdown.jl
 docs.jl
 error during bootstrap:
 LoadError(at sysimg.jl line 233: LoadError(at docs.jl line 213: 
 UndefVarError(var=:STDOUT)))
 jl_throw at /julia_src/usr/bin/../lib/libjulia.so (unknown line)
 unknown function (ip: -296211804)
 unknown function (ip: -296145600)
 unknown function (ip: -296147390)
 unknown function (ip: -296144131)
 jl_load at /julia_src/usr/bin/../lib/libjulia.so (unknown line)
 include at boot.jl:250
 anonymous at sysimg.jl:151
 unknown function (ip: -296209515)
 unknown function (ip: -296215837)
 unknown function (ip: -296145600)
 unknown function (ip: -296147390)
 unknown function (ip: -296144131)
 jl_load at /julia_src/usr/bin/../lib/libjulia.so (unknown line)
 unknown function (ip: 4203722)
 unknown function (ip: 4203045)
 unknown function (ip: 4202790)
 __libc_start_main at /usr/lib/libc.so.6 (unknown line)
 unknown function (ip: 4199657)
 unknown function (ip: 0)

 Makefile:168: recipe for target '/julia_src/usr/lib/julia/sys0.o' failed
 make[1]: *** [/julia_src/usr/lib/julia/sys0.o] Error 1
 Makefile:82: recipe for target 'julia-sysimg-release' failed
 make: *** [julia-sysimg-release] Error 2
 ...



[julia-users] Re: Overloading Base functions or creating new ones?

2015-03-16 Thread Patrick O'Leary
On Monday, March 16, 2015 at 10:09:40 AM UTC-5, David van Leeuwen wrote:

 Related to this question: what if you want to use the name of a base 
 function for your type, where the meaning is _not_ related, but there is no 
 sensible function that would have that meaning for your type?  

 E.g., in GaussianMixtures, I used `map(::GMM, ...)` for 
 maximum-a-posteriori adaptation of the GMM.  I can't see ::GMM in the role 
 of a ::Function, so I personally do not mind this re-use of an existing 
 function name from Base.  Others may disagree wholeheartedly. 

 ---david


Don't `import Base.map`, and leave `map` off of your exports list. This 
means users have to use the fully-qualified `GaussianMixtures.map`, but 
there is no risk of confusion.
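
i.e. something like this sketch (module contents trimmed down to nothing, 
obviously):

module GaussianMixtures

export GMM          # but deliberately not map

type GMM end        # stand-in for the real type

# no `import Base.map`, so this creates a new, module-local generic function
map(gmm::GMM, x) = x   # MAP adaptation would go here

end

g = GaussianMixtures.GMM()
GaussianMixtures.map(g, 1.0)   # callers qualify it explicitly
map(+, [1], [2])               # Base.map is untouched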

(Though I am not sure that I would read map as a shorthand for maximum a 
posteriori--maxpost, maybe?)

Patrick


[julia-users] Re: Unable to find were it breaks IUP (0.4 after 0.4.0-dev+3703)

2015-03-16 Thread Patrick O'Leary
Also, if you could please check the test old build that Tony Kelman posted 
on that issue, and see if you can reproduce with that build.

On Monday, March 16, 2015 at 8:04:26 AM UTC-5, Kristoffer Carlsson wrote:

 This is likely related to https://github.com/JuliaLang/julia/issues/10249

 Can you try a simple gc() command on that machine and see if it crashes. 
 If it does, it is most likely the same.

 On Monday, March 16, 2015 at 4:06:14 AM UTC+1, J Luis wrote:

 The changes referred in 
 https://groups.google.com/forum/?fromgroups=#!topic/julia-users/qi6HpMrAS_A 
 broke IUP.jl in a way that I simply cannot even identify where the breakage 
 occurs.

 I managed to reproduce the crash with only one instruction, but even that 
 makes no sense to me. The next two lines crash julia:

 julia> using im_view_
 julia> a=IUP.IupCanvas()

 It can be seen in 
 https://github.com/joa-quim/IUP.jl/blob/master/src/libiup.jl#L480 that 
 the wrapper is OK and in fact if I comment the two lines (that live in 
 another module)

 https://github.com/joa-quim/IUP.jl/blob/master/src/IUP_CD.jl#L349
 https://github.com/joa-quim/IUP.jl/blob/master/src/IUP_CD.jl#L350

 that are included from the module im_view_ then I get an error about a missing 
 function (normal, since I commented out their inclusion) but otherwise all seems 
 to work.
 So I'm left with this back trace of which I'm unable to extract any 
 useful information.

 (On Win64 with a nightly from 2 days ago)

 Please submit a bug report with steps to reproduce this fault, and any 
 error messages that follow (in their entirety). Thanks.
 Exception: EXCEPTION_ACCESS_VIOLATION at 0x6bfb2b56 -- 
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_gc_collect at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 allocobj at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 jl_init_tasks at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 jl_alloc_cell_1d at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 jl_f_new_expr at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 inlineable at inference.jl:2372
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 inlining_pass at inference.jl:2914
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 inlining_pass at inference.jl:2809
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 typeinf_uncached at inference.jl:1745
 jlcall_typeinf_uncached_111 at  (unknown line)
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 typeinf at inference.jl:1386
 jlcall_typeinf_84 at  (unknown line)
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 typeinf at inference.jl:1345
 jl_apply_generic at V:\Julia-0.4.0-dev
 ...



[julia-users] Re: Unable to find were it breaks IUP (0.4 after 0.4.0-dev+3703)

2015-03-16 Thread Patrick O'Leary
Interesting--it certainly has the same symptoms as #10249, since you get 
the same backtrace I was getting, but is harder for you to trigger for some 
reason.

On Monday, March 16, 2015 at 12:37:52 PM UTC-5, J Luis wrote:

 I can't reproduce #10249. Tried with lots of gc() and nothing, all went 
 well. 


 segunda-feira, 16 de Março de 2015 às 13:04:26 UTC, Kristoffer Carlsson 
 escreveu:

 This is likely related to https://github.com/JuliaLang/julia/issues/10249

 Can you try a simple gc() command on that machine and see if it crashes. 
 If it does, it is most likely the same.

 On Monday, March 16, 2015 at 4:06:14 AM UTC+1, J Luis wrote:

 The changes referred in 
 https://groups.google.com/forum/?fromgroups=#!topic/julia-users/qi6HpMrAS_A 
 broke IUP.jl in a way that I simply cannot even identify where the breakage 
 occurs.

 I managed to reproduce the crash with only one instruction, but even 
 that makes no sense to me. The next two lines crash julia:

 julia> using im_view_
 julia> a=IUP.IupCanvas()

 It can be seen in 
 https://github.com/joa-quim/IUP.jl/blob/master/src/libiup.jl#L480 that 
 the wrapper is OK and in fact if I comment the two lines (that live in 
 another module)

 https://github.com/joa-quim/IUP.jl/blob/master/src/IUP_CD.jl#L349
 https://github.com/joa-quim/IUP.jl/blob/master/src/IUP_CD.jl#L350

 that are included from the module im_view_ then I get an error about a missing 
 function (normal, since I commented out their inclusion) but otherwise all seems 
 to work.
 So I'm left with this back trace of which I'm unable to extract any 
 useful information.

 (On Win64 with a nightly from 2 days ago)

 Please submit a bug report with steps to reproduce this fault, and any 
 error messages that follow (in their entirety). Thanks.
 Exception: EXCEPTION_ACCESS_VIOLATION at 0x6bfb2b56 -- 
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_profile_is_running at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown 
 line)
 jl_gc_collect at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 allocobj at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 jl_init_tasks at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 jl_alloc_cell_1d at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 jl_f_new_expr at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 inlineable at inference.jl:2372
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 inlining_pass at inference.jl:2914
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 inlining_pass at inference.jl:2809
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 typeinf_uncached at inference.jl:1745
 jlcall_typeinf_uncached_111 at  (unknown line)
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 typeinf at inference.jl:1386
 jlcall_typeinf_84 at  (unknown line)
 jl_apply_generic at V:\Julia-0.4.0-dev\bin\libjulia.dll (unknown line)
 typeinf at inference.jl:1345
 jl_apply_generic at V:\Julia-0.4.0-dev
 ...



Re: [julia-users] Parallel for-loops

2015-03-13 Thread Patrick O'Leary
On Friday, March 13, 2015 at 10:20:10 AM UTC-5, Pieter Barendrecht wrote:

 Overall, I'm a bit surprised that using more than 3 or 4 workers does not 
 decrease the running time. Any ideas? I'm using Julia 0.3.6 on a 64bit Arch 
 Linux system, Intel(R) Core(TM) i7-3630QM CPU @ 2.40GHz. 


At four workers, you now have a process occupying every physical core 
(assuming the scheduler is doing what we want), plus your main coordinating 
process which is also occupying one of those four cores but presumably not 
doing any simultaneous computation. Many workloads do not see significant 
acceleration from hyperthreading; if this is such a workload, adding more 
workers won't give you any more speedup, and as René mentions overhead can 
start to dominate.
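
For concreteness, the setup under discussion looks roughly like this (a 
minimal sketch; the worker count and loop body are placeholders, not the 
original poster's code):

# start Julia with `julia -p 3`, or add workers from a running session;
# 3 workers plus the master process gives one process per physical core
addprocs(3)

# @parallel splits the iteration range across the workers and combines
# the per-worker partial results with the reduction operator (+)
total = @parallel (+) for i = 1:10^7
    sin(i)
end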

Patrick


[julia-users] Re: how to paste png into ipython julia notebook?

2015-03-12 Thread Patrick O'Leary
You can do this with Images.jl. Example: 
http://htmlpreview.github.io/?https://github.com/timholy/Images.jl/blob/master/ImagesDemo.html
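
If I remember the Images.jl API of that era correctly, the minimal version 
is something like this (untested sketch; the filename is a placeholder):

using Images                  # PNG decoding may also need ImageMagick on the system
img = imread("image.png")
img                           # as the last value of an IJulia cell, the image renders inline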

On Wednesday, March 11, 2015 at 11:05:07 AM UTC-5, Edward Chen wrote:

 from IPython.display import Image
 Image(filename='image.png')

 doesn't seem to work

 Thanks!
 -Ed



[julia-users] Re: 1 - 0.8

2015-03-12 Thread Patrick O'Leary
Julia does not try to hide the complexities of floating-point 
representations, so this is expected. There's a brief section in the manual 
[1] which lists some references on this topic--I personally recommend 
reading "What Every Computer Scientist Should Know About Floating-Point 
Arithmetic" 
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.102.244&rep=rep1&type=pdf, 
but the other references are good, too.

[1] 
http://julia.readthedocs.org/en/latest/manual/integers-and-floating-point-numbers/#background-and-references
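
To make it concrete at the REPL (exact equality fails, but the result is 
correct to within rounding):

julia> 1 - 0.8
0.19999999999999996

julia> 1 - 0.8 == 0.2
false

julia> abs((1 - 0.8) - 0.2) < 1e-15
true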

On Thursday, March 12, 2015 at 3:40:45 PM UTC-5, Hanrong Chen wrote:

 julia> 1-0.8
 0.19999999999999996

 Is this a bug?



Re: [julia-users] I don't have BLAS functions.

2015-03-09 Thread Patrick O'Leary
BLAS.axpy! works fine (with no other import/using statements) in both 0.3 
(at least by 0.3.4) and 0.4-dev.
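
For example, the in-place update y = a*x + y (a quick sketch, written fully 
qualified so it resolves regardless of what happens to be exported):

julia> x = ones(3); y = fill(2.0, 3);

julia> Base.LinAlg.BLAS.axpy!(0.5, x, y)   # overwrites y with 0.5*x + y and returns it
3-element Array{Float64,1}:
 2.5
 2.5
 2.5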

On Monday, March 9, 2015 at 5:58:28 AM UTC-5, Tobias Knopp wrote:

 Maybe it would be better to export the BLAS module instead of putting the 
 individual function into Base? Like we have Pkg.init() one would have 
 BLAS.axpy!

 Am Montag, 9. März 2015 11:22:23 UTC+1 schrieb Mauro:

 On my system I need to qualify it with Base, so Base.axpy! works. 

 On Mon, 2015-03-09 at 11:20, Daniel Carrera dcar...@gmail.com wrote: 
  Hello, 
  
  The Julia documentation says that you can access some BLAS functions: 
  
  
 http://docs.julialang.org/en/release-0.3/stdlib/linalg/?module-Base.LinAlg.BLAS#module-Base.LinAlg.BLAS
  
  
  However, I don't seem to have any of those in my system: 
  
  julia> help(axpy!) 
  ERROR: axpy! not defined 
  
  julia> using BLAS 
  ERROR: BLAS not found 
   in require at loading.jl:47 
  
  julia> Pkg.add("BLAS") 
  ERROR: unknown package BLAS 
   in wait at task.jl:51 
   in sync_end at ./task.jl:311 
   in add at pkg/entry.jl:319 
   in add at pkg/entry.jl:71 
   in anonymous at pkg/dir.jl:28 
   in cd at ./file.jl:20 
   in __cd#228__ at ./pkg/dir.jl:28 
   in add at pkg.jl:20 
  
  julia> 
  
  
  Does anybody know what's going on? What do I need to do to get the BLAS 
  functions? 
  
  Cheers, 
  Daniel. 



Re: [julia-users] Re: Functions in degrees

2015-03-04 Thread Patrick O'Leary
On Tuesday, March 3, 2015 at 2:57:28 PM UTC-6, Stuart Brorson wrote:

 Since types should be used sparingly, I don't think it's a good idea to 
 replace convenience functions like sind and cosd with a type-driven 
 invocation mechanism -- ordinary users will be confused.


You also can't replace all of them, since the inverse functions can't be 
dispatched in this way. Anecdotally, I use atan2d more than any of the 
other *d functions, typically as a step on the way to making a plot. 
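
For reference, a minimal sketch of the degree-based functions that do exist 
in Base (results annotated approximately on purpose):

sind(30)      # approximately 0.5
cosd(60)      # approximately 0.5
atand(1.0)    # approximately 45.0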


[julia-users] Re: Using ModX: Can scope avoid collisions improve readability?

2015-03-03 Thread Patrick O'Leary
On Monday, March 2, 2015 at 7:45:02 PM UTC-6, MA Laforge wrote:

 Your comment sounds alot like what Stefan said:
 https://groups.google.com/d/msg/julia-users/UvBff9QVKaA/P10-LRLezCUJ

 I admit I don't fully appreciate why this is a *technical* problem.  Most 
 scoping rules would dictate that you should be referring to the *most 
 local* version of the value.  In your case bar would come from module Foo.


One counterargument to that is that it makes the most local thing 
nonlocal--it's an unexported part of Foo. The export list is in part a 
declaration of I'm okay with these things sharing namespace if you want, 
and if you did `export bar` from Foo, then `import Foo`, you'd get what you 
wanted.

(There's a side discussion about things happening inside functions; I'm not 
concerned about that here because that can only make things more 
complicated.)


[julia-users] Re: Using ModX: Can scope avoid collisions improve readability?

2015-03-02 Thread Patrick O'Leary
On Saturday, February 28, 2015 at 11:06:38 AM UTC-6, MA Laforge wrote:

 C++ provides using namespace X to make available the contents of X to 
 the current scope.  This even works on un-named scopes within a function:


(etc.)

I know this is something that's come up before, and I think rejected--I 
think because it causes conflicts with multiple dispatch? But I can't seem 
to find the thread(s).

I can't create a hard conflict in a quick mental search for an example, but 
I can create some level of confusion:

module Foo
bar::Int = 1
end

module Baz
bar::Float64 = 27.3
with module Foo # not current Julia syntax
bar #which bar is this?
end
end


[julia-users] Re: splice! function. Instead of removing a single item, remove a collection of items stored in an array.

2015-02-24 Thread Patrick O'Leary
On Monday, February 23, 2015 at 9:08:46 PM UTC-6, Aero_flux wrote:

 Restarting Julia, adding functions and running @time gives a longer 
 elapsed time for both. Rerunning @time returns tiny values.


The first time you run a Julia method with a given set of argument types, 
you incur the cost of compilation. On subsequent invocations, you get 
cached, already compiled code.
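
A minimal way to see the effect (the function and argument are arbitrary):

julia> g(n) = sum(rand(n))
g (generic function with 1 method)

julia> @time g(10000);   # first call with an Int argument: includes JIT compilation
julia> @time g(10000);   # later calls reuse the cached native code and report much less time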

Patrick


[julia-users] Re: linking Julia with C to improve performances

2015-02-24 Thread Patrick O'Leary
On Tuesday, February 24, 2015 at 7:00:34 AM UTC-6, Sean Marshallsay wrote:

 As Milan says, you shouldn't need C unless your program requires something 
 like LAPACK or BLAS.


Even then, many BLAS and LAPACK calls are already wrapped in Julia and can 
be used directly. 


[julia-users] Re: World of Julia

2015-02-23 Thread Patrick O'Leary
On Sunday, February 22, 2015 at 5:50:09 PM UTC-6, Jiahao Chen wrote:

 World of Julia is now updated in time for the 2015 Oscars.

 336 contributors to JuliaLang/julia
 545 packages
 695 devs total

 IJulia notebook: 
 https://github.com/jiahao/ijulia-notebooks/blob/master/2014-06-30-world-of-julia.ipynb

 Nbviewer link: 
 http://nbviewer.ipython.org/urls/raw.github.com/jiahao/ijulia-notebooks/master/2014-06-30-world-of-julia.ipynb


A small suggestion: it may be more insightful if you remove merge commits; 
I would say I'm definitely overcounted in the current results.
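
One rough way to get a merge-free count, assuming a local clone of 
JuliaLang/julia (sketch only):

# run from inside the clone; drop merge commits before counting distinct authors
authors = split(readall(`git log --no-merges --format=%an`), '\n')
length(unique(filter(a -> !isempty(a), authors)))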


Re: [julia-users] Re: -1 days old master

2015-02-19 Thread Patrick O'Leary
For what it's worth, I did see Oscar make a comment on an issue this 
morning which confirms that his clock is off.

On Thursday, February 19, 2015 at 12:53:31 AM UTC-6, Ivar Nesje wrote:

 Git stores committer time with timezone, and we ask git for a UNIX 
 timestamp and compare to the current system unix timestamp on startup, so 
 I'm pretty sure we do things correctly. 

 If the system clock is wrong, there is not much we can do.
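
For reference, the comparison boils down to roughly this (a sketch, not the 
actual startup code):

# committer timestamp of HEAD versus the system clock, both in epoch seconds
committer = int(readchomp(`git log -1 --format=%ct`))
skew = ifloor(time()) - committer   # negative if the system clock is behind the commit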



Re: [julia-users] isa should be user friendlier

2015-02-17 Thread Patrick O'Leary
On Tuesday, February 17, 2015 at 10:42:01 AM UTC-6, Abel Siqueira wrote:

  What Tim was trying to point out is that `isa` is a builtin, and many things
 depend on it being able to differentiate a scalar from an array.

 However, if you, in your code, want to ignore this, you can define
 a function that does what you want, and it won't break anything because
 only your code depends on it.


Before you get too into defining your own stuff--eltype() is already 
sufficiently generic:

julia> eltype([1.0]) == Float64
true

julia> eltype(1.0) == Float64
true


[julia-users] Re: using ccall

2015-02-15 Thread Patrick O'Leary

On Sunday, February 15, 2015 at 11:54:34 AM UTC-6, Seth wrote:

 Maybe I'm missing something, but isn't that result what you would expect 
 by calling foo with (1,2,3,4)? The sum is 10.


The response was not from Josh, who posted the original question. Dominique 
is showing that at least someone gets the expected answer. The question is 
what might be different in Josh's situation. 


Re: [julia-users] Re: Memcpy with multidimensional arrays?

2015-02-10 Thread Patrick O'Leary
I wouldn't either, but copy!() is probably clearer, and at extremely low 
risk of ever meaning anything else.

On Monday, February 9, 2015 at 8:14:09 PM UTC-6, Jameson wrote:

 I wouldn't be surprised if `x[:] = y` dispatched to the `copy!` function 
 and was actually exactly equivalent.

 On Mon Feb 09 2015 at 7:38:32 PM Kirill Ignatiev 
 kirill.ignat...@gmail.com wrote:

 On Monday, 9 February 2015 19:07:37 UTC-5, Patrick O'Leary wrote:

 I think you can go with 

 copy!(x, y)


 That's it, thank you.
  



[julia-users] Re: Juno+Julia installation

2015-02-09 Thread Patrick O'Leary
I don't believe it's the same issue, but no matter--it is *an* issue. What 
happens if you run



git --git-dir=/Users/ericshain/.julia/.cache/Stats merge-base 78f5810a78fa8bee684137d703d21eca3b1d8c78 8208e29af9f80ef633e50884ffb17cb25a9f5113

directly from a command line (outside of Julia)?

On Sunday, February 8, 2015 at 11:13:30 AM UTC-6, Eric S wrote:

 I think I have the same issue (in Yosemite). I can't process the 
 Pkg.update():

                _
    _       _ _(_)_     |  A fresh approach to technical computing
   (_)     | (_) (_)    |  Documentation: http://docs.julialang.org
    _ _   _| |_  __ _   |  Type "help()" for help.
   | | | | | | |/ _` |  |
   | | |_| | | | (_| |  |  Version 0.3.5 (2015-01-08 22:33 UTC)
  _/ |\__'_|_|_|\__'_|  |  Official http://julialang.org/ release
 |__/                   |  x86_64-apple-darwin13.4.0


 julia> Pkg.update()
 INFO: Updating METADATA...
 ERROR: failed process: Process(`git 
 --git-dir=/Users/ericshain/.julia/.cache/Stats merge-base 
 78f5810a78fa8bee684137d703d21eca3b1d8c78 
 8208e29af9f80ef633e50884ffb17cb25a9f5113`, ProcessExited(1)) [1]
  in readbytes at 
 /Applications/Julia-0.3.5.app/Contents/Resources/julia/lib/julia/sys.dylib
  in readchomp at pkg/git.jl:24
  in installed_version at 
 /Applications/Julia-0.3.5.app/Contents/Resources/julia/lib/julia/sys.dylib
  in installed at 
 /Applications/Julia-0.3.5.app/Contents/Resources/julia/lib/julia/sys.dylib
  in update at 
 /Applications/Julia-0.3.5.app/Contents/Resources/julia/lib/julia/sys.dylib
  in anonymous at pkg/dir.jl:28
  in cd at 
 /Applications/Julia-0.3.5.app/Contents/Resources/julia/lib/julia/sys.dylib
  in __cd#227__ at 
 /Applications/Julia-0.3.5.app/Contents/Resources/julia/lib/julia/sys.dylib
  in update at 
 /Applications/Julia-0.3.5.app/Contents/Resources/julia/lib/julia/sys.dylib


 Git seems to work fine on my Mac.

 Eric

 On Saturday, February 7, 2015 at 12:54:14 PM UTC-6, Jeremy Cavanagh wrote:


 Hi Joseph,

 I'm no expert, but I managed to install Juno on Mac OS X 10.9 with no 
 problem. Have you tried installing Julia first, then Juno? If you and your 
 students are working on Apple Macs I would recommend installing Julia via 
 Homebrew because it then becomes easier to maintain any available updates.

 An alternative approach is to install LightTable (after Julia) and then 
 add the Juno package from within LightTable. Also, don't forget to run 
 Pkg.update() from the Julia repl before running LightTable. This is the 
 method I use on linux and Mac because there was a problem with Juno for me 
 on linux. I think it was because they do not provide a fedora version or a 
 generic linux binary. The only difference between Juno and LightTable is at 
 the start up but essentially they are exactly the same. This approach is 
 also more flexible since it works on all flavours of linux, OS X and 
 Windows.

 I hope this is of some help, good luck.




[julia-users] Re: Memcpy with multidimensional arrays?

2015-02-09 Thread Patrick O'Leary
I think you can go with 

copy!(x, y)

Quoting the built-in help for reference:

help?> copy!
INFO: Loading help data...
Base.copy!(dest, src)

   Copy all elements from collection src to array dest.
   Returns dest.

Base.copy!(dest, do, src, so, N)

   Copy N elements from collection src starting at offset
   so, to array dest starting at offset do. Returns
   dest.

On Monday, February 9, 2015 at 5:54:26 PM UTC-6, Kirill Ignatiev wrote:

 Given two Float64 arrays x and y of the same shape, is x[:]=y the correct 
 way to *efficiently* copy all values from one array into the other?

 Basically, I would like the equivalent of memcpy, just copying memory, and 
 doing no extra work.



[julia-users] Re: Juno+Julia installation

2015-02-06 Thread Patrick O'Leary
On Friday, February 6, 2015 at 3:07:35 PM UTC-6, Joseph Cullen wrote:

 I have searched for solutions, but none of them seem to work.


It would help to know (roughly) what you've tried.

"Agreeing to the Xcode/iOS license requires admin privileges, please re-run 
as root via sudo."

This sounds suspicious.


[julia-users] Re: poly2mask in Julia?

2015-01-28 Thread Patrick O'Leary
On Wednesday, January 28, 2015 at 4:05:07 PM UTC-6, Andrew McLean wrote:

 [Apologies if you see this post twice, it's been a number of hours since 
 my original post and it hasn't appeared.]


Sorry about that--I have no idea how it sat in the queue for so long. I 
discarded the original post and kept only the repost. Further posts should 
appear immediately.

Patrick 


[julia-users] Re: Help grokking a certain unsupported or misplaced expression

2015-01-26 Thread Patrick O'Leary
Using JuliaParser.jl as a reference, it looks like & is in the class 
unary_and_binary_ops, as well as syntactic_binary_ops. The | operator is 
not in either of these classes. At least one reason for the difference in 
lowering to the AST is that & is also an addressof-like operator in the 
context of a ccall. I suspect, however, that the inconsistency in usage of 
the surface syntax can be regarded as a bug--you should go ahead and file 
an issue.

On Monday, January 26, 2015 at 6:28:49 AM UTC-6, Gabriel Mitchell wrote:

 Here are two statements, one written with chained binary operations the 
 other in prefix notation 

 [true,true] | [true,false] | [false,true]
 2-element Array{Bool,1}:
  true
  true
 |([true,true], [true,false],[false,true])
 2-element Array{Bool,1}:
  true
  true


 Now I can also do the first for & as in
 [true,true] & [true,false] & [false,true]
 2-element Array{Bool,1}:
  false
  false

 However if I try this in the prefix notation

 &([true,true], [true,false],[false,true])

 unsupported or misplaced expression 
 while loading In[32], in expression starting on line 1

 Looking a little closer, I find that the parser does this

 :(|([true,true], [true,false],[false,true])).head
 :call
 :(&([true,true], [true,false],[false,true])).head
 :&

 Can someone tell me, is this behavior for '&' on purpose? If so can 
 someone point to some documentation so I can read about said purpose? I was 
 interested in making calls in the function prefix notation so that I can 
 write something like '&(bunchofarrays...)'. This works for '|' but not for 
 '&' on my version (0.3.4). Or if there is a more idiomatic way to achieve 
 this effect I'd also like to hear it.



Re: [julia-users] Dan Luu's critique of Julia: needs commented code, better testing, error handling, and bugs fixed.

2014-12-29 Thread Patrick O'Leary
On Monday, December 29, 2014 4:13:35 PM UTC-6, Jeff Bezanson wrote:

 But one 
 feature we could stand to add is asserting properties that must be 
 true for all arguments, and running through lots of combinations of 
 instances.


Anyone who is interested in this is welcome to use 
https://github.com/pao/QuickCheck.jl as a starting point.


[julia-users] Re: Is everything visible within a module by default?

2014-12-06 Thread Patrick O'Leary
Yes. Exporting the name causes it to be available in the global namespace 
with using. All names are visible in the fully qualified namespace.
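
A minimal illustration (the module and function names here are made up, not 
the poster's FENodeSetModule):

module M
export visible
visible() = 1
hidden() = 2        # not exported
end

using M
visible()           # in scope because it was exported
M.hidden()          # unexported, but still reachable fully qualified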

On Saturday, December 6, 2014 5:34:32 PM UTC-6, Petr Krysl wrote:

  When I qualify the names of functions or types within a module that I am 
 using with the name of the module, as for instance 

 fens1=FENodeSetModule.FENodeSet(xyz=[X[:,1] X[:,2]]),

  I can access everything, even though the module does not export anything.

 Is that as expected?

 Thanks,

 Petr



[julia-users] Re: Is Dict{ASCIIString, Any} not a superset of Dict{ASCIIString, Float64}?

2014-11-26 Thread Patrick O'Leary
On Wednesday, November 26, 2014 6:01:56 PM UTC-6, Test This wrote:

 I have the following function defined to check whether a record exists in 
 a Mongodb database (thanks a million for PyCall, which makes it easy to use 
 pymongo 
 to interact with mongodb in julia).

 function doesrecexist(collectn::PyObject, rec::Dict{ASCIIString,Any})
 # checks if record exists.
 r = collectn[:find_one](rec)
 return r
 end 


 If I define 

 rec = ["p" => 0.3] 

 julia recognizes it as Dict{ASCIIString, Float64}. Then if I do, 

 doesrecexist(collectn, rec), I get an error saying ERROR: `doesrecexist` 
 has no method matching doesrecexist(::PyObject, 
 ::Dict{ASCIIString,Float64})

 If I remove the type declaration for rec in doesrecexist, things work 
 fine. I have no other method defined for doesrecexist. Does it make sense 
 to get this error, given 
 that I intend to allow any values in the rec Dict by declaring Any? Is 
 there a workaround where I can declare the type for rec when defining the 
 function, without having 
 to define doesrecexist for Float values, Int values, String values etc.

 Thank you.


You've hit type invariance. In Julia, parametric types are invariant--that 
is, Dict{ASCIIString,Float64} is not a subtype of Dict{ASCIIString,Any}. 
However, you can use generic types instead:

function doesrecexist{T}(collectn::PyObject, rec::Dict{ASCIIString, T})
...
end

Which will match for all T. You can also restrict T to subtypes of a given 
type in the curly-braced section. For instance, if you wanted to guarantee 
that the type T was a subtype of Real:

function doesrecexist{T<:Real}(...)
...
end
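
You can also see the invariance itself directly at the REPL:

julia> Float64 <: Any
true

julia> Dict{ASCIIString,Float64} <: Dict{ASCIIString,Any}   # invariance: this does not follow
false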

For more information on why we use invariant parametric types, search the 
list for "parametric invariant" or similar; there have been a few 
discussions on the topic.

Patrick


Re: [julia-users] Memory mapping composite type + fixed length strings

2014-11-23 Thread Patrick O'Leary
On Saturday, November 22, 2014 10:06:24 PM UTC-6, Joshua Adelman wrote:

 I just checked out StrPack and installed it. I think I have it working 
 properly in the case of a packed immutable type that just contains 
 numerical fields. I'm still not sure how to make things work to unpack 
 fixed length strings.


That's the least-tested codepath, I think.
 

 If my string in field e is a fixed-length of 4, then my composite type 
 looks like

 @struct immutable TestType2
 a::Float64
 b::Int32
 c::Float32
 d::Int16
 e::Array{Uint8,1}(4)   # also tried ASCIIString(4)
 end


That looks correct. Though I'm not sure where you would get pad bytes in 
here? Alignments all look native.

And then I know my data on file has 10 records, I can loop through as:

 data = Array(TestType2, (10,))

 f = open("test1.dat", "r")
 for k = 1:10
     data[k] = unpack(f, TestType2, {"a"=>8, "b"=>4, "c"=>4, "d"=>2, 
                      "e"=>4}, align_packed, :NativeEndian)
     println(data[k])
 end

 But Array{Uint8,1}(4) results in corrupted values after the first record 
 is read in and ASCIIString(4) gives an error:

 TestType2(-1.2538434657806456,0,0.5783242f0,0,"a")
 ERROR: invalid ASCII sequence


What is the binary data you're reading? Is there a sample file?

Where in the first record that appears to be read correctly (at least it 
 prints), the last field value should be "abcd" not "a". Am I missing 
 something obvious? Also, is there a strategy for combining all of this with 
 a memory mapped file?


Probably. If you can get a stream, you can unpack it. You might need some 
combination of mmap_array() and IOBuffer().
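
An untested sketch of that combination, reusing TestType2 and the field-size 
dictionary from above:

fio = open("test1.dat", "r")
bytes = mmap_array(Uint8, (filesize("test1.dat"),), fio)   # memory-map the raw bytes
buf = IOBuffer(bytes)                                      # wrap them in a readable stream
rec = unpack(buf, TestType2, {"a"=>8, "b"=>4, "c"=>4, "d"=>2, "e"=>4},
             align_packed, :NativeEndian)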


Re: [julia-users] impressions and questions from a Matlab user

2014-11-23 Thread Patrick O'Leary
On Sunday, November 23, 2014 7:55:33 PM UTC-6, Stefan Karpinski wrote:

 11 seconds seems like an awfully long time. In the days of the slow REPL 
 when Julia compiled itself upon starting up, that's about how long it took. 
 What's your versioninfo? 


Windows doesn't ship with sys.dll, for what it's worth.


Re: [julia-users] Memory mapping composite type + fixed length strings

2014-11-22 Thread Patrick O'Leary
On Saturday, November 22, 2014 9:57:51 AM UTC-6, Isaiah wrote:

 Have you looked at StrPack.jl? It may have a packed option. Julia uses the 
 platform ABI padding rules for easy interop with C.


Yes, you can use the align_packed strategy.


Re: [julia-users] Re: Julia on an ARM Processor?

2014-11-18 Thread Patrick O'Leary
On Tuesday, November 18, 2014 1:35:39 PM UTC-6, Isaiah wrote:

 I'm not sure if this is mentioned explicitly in the links from Ivar, but: 
 we have only tested on ARMv7 (Samsung Chromebooks with Exynos 5 
 processors). I don't know if the LLVM JIT even supports anything lower than 
 v7.

 As we get closer to generalized static compilation capability, 
 cross-targeting should become more feasible. Clang supports ARMv5, so that 
 is probably the practical limit (leaving aside memory requirements, 
 although embedded covers a wide spectrum these days).


Going the other direction, I've recently acquired a Nexus 9, and once some 
OS-level development has been done in the community I'll take a shot at 
AArch64.

