[julia-users] Re: FastGauss.jl: fast computation of Gauss quadrature rules

2014-09-02 Thread Jason Merrill
Really impressive. It's cool to see people implementing so many state of 
the art algorithms in Julia.

On Monday, September 1, 2014 2:33:31 PM UTC-7, Alex Townsend wrote:

 I have written a package FastGauss.jl available here: 

 https://github.com/ajt60gaibb/FastGauss.jl

 to compute Gauss quadrature rules to 16-digit precision (so far Legendre, 
 Jacobi, Lobatto, Radau),
 aiming to be the fastest implementation in Julia. For example, this is how 
 long 
 it takes to compute 1,000,000 Gauss-Legendre nodes and weights: 

 tic(), GaussLegendre( 1000000 ); toc() 
 elapsed time: 0.336489122 seconds


 In my comparisons, GaussLegendre() is faster than Base.gauss() for n > 60 
 (nothing in it for 
 small n), but my implementation right now does not allow for higher 
 precision. I couldn't find a
 Gauss-Jacobi code in Base. 

 I am a Julia beginner (only been learning for 2 weeks) so I am assuming 
 the code can be 
 improved in a million and one ways. Please tell me if I've done something 
 that Julia does 
 not like. I am not sure if it is appropriate to make this an official 
 package.



[julia-users] Re: FastGauss.jl: fast computation of Gauss quadrature rules

2014-09-02 Thread Jason Merrill
On Monday, September 1, 2014 2:33:31 PM UTC-7, Alex Townsend wrote:

 I have written a package FastGauss.jl available here: 

 https://github.com/ajt60gaibb/FastGauss.jl


 I am a Julia beginner (only been learning for 2 weeks) so I am assuming 
 the code can be 
 improved in a million and one ways. Please tell me if I've done something 
 that Julia does 
 not like. I am not sure if it is appropriate to make this an official 
 package.

 
One thing to look out for is making sure your functions have consistent 
return types. E.g. in 
https://github.com/ajt60gaibb/FastGauss.jl/blob/91e2ac656b856876563d5aacf7b5a405e068b3da/src/GaussLobatto.jl#L4
 
you have

if ( n == 1 ) 
error("Lobatto undefined for n = 1.") 
elseif ( n == 2 ) 
x = ([-1.0,1.0],[1.0,1.0]) 
elseif ( n == 3 ) 
x = ([-1, 0, 1], [1.0, 4.0, 1.0]/3)
# ...

In the n==2 case, you're returning a tuple of two float vectors, but in the 
n==3 case, you're returning a tuple with one Int vector and one float 
vector.

This issue crops up in a few other places, including sometimes returning a 
number and other times returning a vector.
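
For illustration, here is a minimal sketch of one way to make those two branches agree, with the nodes written as Float64 literals in both; this is not the package's actual code, just the pattern:

    elseif ( n == 2 )
        x = ([-1.0, 1.0], [1.0, 1.0])
    elseif ( n == 3 )
        x = ([-1.0, 0.0, 1.0], [1.0, 4.0, 1.0]/3)   # now also a (Vector{Float64}, Vector{Float64}) tuple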

Another thing you might want to consider is devectorizing compound 
operations on vectors to avoid allocating containers to store intermediate 
results (either using https://github.com/lindahua/Devectorize.jl or 
manually). To pick one random place where this might be 
relevant: 
https://github.com/ajt60gaibb/FastGauss.jl/blob/5e3a8a2f9a7e327622bdd43f69bb712afcb16743/src/GaussJacobi.jl#L36
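
For illustration, a generic sketch of the pattern (not the actual GaussJacobi.jl line): the vectorized form allocates a temporary array for every intermediate result, while the explicit loop allocates only the output:

    # vectorized: poly(x, a, b, c) = a .* x.^2 .+ b .* x .+ c
    # devectorized equivalent:
    function poly_devec(x::Vector{Float64}, a, b, c)
        y = similar(x)
        for i = 1:length(x)
            y[i] = a*x[i]^2 + b*x[i] + c   # no temporaries for x.^2 etc.
        end
        return y
    end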

I'm not 100% sure whether this will be relevant to your API, but many high 
performance Julia libs expose mutating versions of functions (marked with 
an ! at the end) that normally return arrays, allowing the caller to 
instead pass in a preallocated array to store the result in, which can 
then be reused from call to call.
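
A sketch of that pattern with invented names, using a trivial midpoint rule on [-1,1] rather than Gauss-Legendre, just to show the shape of a mutating API:

    function midpointrule!(x::Vector{Float64}, w::Vector{Float64})
        n = length(x)
        h = 2.0/n
        for i = 1:n
            x[i] = -1.0 + (i - 0.5)*h   # fill the preallocated nodes in place
            w[i] = h                    # fill the preallocated weights in place
        end
        return x, w
    end

    # the caller can reuse the same buffers from call to call:
    # x = Array(Float64, 64); w = Array(Float64, 64); midpointrule!(x, w)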

If a quadrature rule may in some cases be used only once, it might be nice 
if there were a way to apply the quadrature method to a function on the fly 
without ever generating the entire node and weight vector.
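
For what it's worth, a sketch of what such an interface could look like; applyquad is a hypothetical name, and it simply folds f over any iterator of (node, weight) pairs, so the rule never has to be materialized as two full vectors if it can be produced incrementally:

    function applyquad(f, rule)
        s = 0.0
        for (x, w) in rule
            s += w * f(x)   # accumulate the weighted sum as the rule is generated
        end
        return s
    end

    # with an already-computed rule:  x, w = GaussLegendre(100); applyquad(sin, zip(x, w))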




Re: [julia-users] Re: live plotting in PyPlot.jl?

2014-09-02 Thread Sheehan Olver

Also, there seems to be some weird behaviour at the edge of the plots: 
for example, even if my values all vanish at the edge, the plot didn’t unless I 
repeated the values.




On 2 Sep 2014, at 12:32 pm, Simon Danisch sdani...@gmail.com wrote:

 For the color, I guess you just want some mapping of the z value, right? I 
 should really implement that.
 I'm not sure what you mean by functions on a disc, you should probably 
 explain that to me!
 The black spots on the screen-shot definitely look like what I'm talking 
 about... I should finally fix this!
 
 On Tuesday, 19 August 2014 07:46:22 UTC+2, Sheehan Olver wrote:
 Hi,
 
 Is there a way to force plotting in PyPlot.jl, to simulate animation?  Right 
 now if I do a for loop over a sequence of plots, it only outputs the last 
 plot.
 
 This is in IJulia running on OS X with matplotlib version 1.3.1 installed, 
 and pygui(true)
 
 Sheehan



[julia-users] lazy package/module loading?

2014-09-02 Thread Sheehan Olver

I'm trying to lazily include PyPlot in a module.  I tried the following 
code:

function foo()
    require("PyPlot")
    PyPlot.plot([1:10])
end


This works when it's evaluated in the REPL, but not when included in a 
package.  Is there a methodology for doing this?


Re: [julia-users] Strange Slicing Behaviour

2014-09-02 Thread Tim Holy
Your example involves two tricky issues: slice behavior and the fact that, 
despite appearances, A += b is not in-place. See issues #3424, #3217, and 
precedents they link to.

I'd be interested in hearing more detail about how using slice gets nasty; as 
you say, from this example slice doesn't look so bad. In trying to fix this, we 
want to make sure we're aware of all the issues.
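
In the meantime, one workaround sketch (using the a and b from your example below) is to do the update explicitly in place through the slice, which avoids both the temporary and the copy back:

    s = slice(a, 1, 1, :, :)               # a view into a, not a copy
    for j = 1:size(b, 2), i = 1:size(b, 1)
        s[i, j] += b[i, j]                 # writes straight through to a
    end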

--Tim

On Monday, September 01, 2014 10:02:54 PM Christoph Ortner wrote:
 a = rand(3,3,3,3)
 b = rand(3,3)
 # this works:
 a[1,1,:,:] = slice(a,1,1,:,:)+b
 # this does not work:
 a[1,1,:,:] += b
 
 This example does not look so bad, but once you use expressive variable
 names and more dimensions it quickly gets very nasty. Because of it, I am
 doing less vectorisation than I would prefer.
 
 I know there is a lot of discussion on slicing on the Julia issues list, so
 I did not want to post another issue there.
 
 Is this likely to be resolved in future releases? Are there elegant
 alternatives?



Re: [julia-users] lazy package/module loading?

2014-09-02 Thread Tim Holy
require doesn't bring PyPlot into the namespace of your module. Try 
Main.PyPlot.plot([1:10]).
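
For reference, a minimal sketch of that pattern inside a package module; MyPkg is just a placeholder name, and this assumes the 0.3 string form of require:

    module MyPkg

    function foo()
        isdefined(Main, :PyPlot) || require("PyPlot")   # load lazily, only on first use
        Main.PyPlot.plot([1:10])
    end

    end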

--Tim

On Tuesday, September 02, 2014 03:04:39 AM Sheehan Olver wrote:
 I'm trying to lazily include PyPlot in a module.  I tried the following
 code:
 
 function foo()
 require("PyPlot")
 PyPlot.plot([1:10])
 end
 
 
 This works when its evaluated in the REPL, but not when included in a
 package.  Is there a methodology for doing this?



[julia-users] Re: live plotting in PyPlot.jl?

2014-09-02 Thread Simon Danisch
I'll look into these issues! 
I don't really know what you mean by vanish, but as the edges are special 
cases and I'm not treating them as such, I'm not surprised that there are 
things going wrong ;)
Non-uniform grids should be easy to implement. I'll give you an update 
when I've implemented them.

On Tuesday, 19 August 2014 07:46:22 UTC+2, Sheehan Olver wrote:

 Hi,

 Is there a way to force plotting in PyPlot.jl, to simulate animation? 
  Right now if I do a for loop over a sequence of plots, it only outputs the 
 last plot.

 This is in IJulia running on OS X with matplotlib version 1.3.1 installed, 
 and pygui(true)

 Sheehan



Re: [julia-users] lazy package/module loading?

2014-09-02 Thread Sheehan Olver

Works like a charm, thanks!

Sheehan


On 2 Sep 2014, at 8:32 pm, Tim Holy tim.h...@gmail.com wrote:

 require doesn't bring PyPlot into the namespace of your module. Try 
 Main.PyPlot.plot([1:10]).
 
 --Tim
 
 On Tuesday, September 02, 2014 03:04:39 AM Sheehan Olver wrote:
 I'm trying to lazily include PyPlot in a module.  I tried the following
 code:
 
 function foo()
require("PyPlot")
PyPlot.plot([1:10])
 end
 
 
 This works when its evaluated in the REPL, but not when included in a
 package.  Is there a methodology for doing this?
 



[julia-users] Re: starting up julia from mac terminal (the way you would for python by typing python)

2014-09-02 Thread Christoph Ortner
The file `~/.bash_profile'  should contain something like this:

export 
PATH=~/Dropbox/Admin/scripts:/Applications/Julia-0.3.0.app/Contents/MacOS:/Users/ortner/anaconda/bin:$PATH




On Tuesday, 2 September 2014 06:16:02 UTC+1, Anonymous wrote:

 I'm trying to figure out how to create an alias/shortcut whatever in the 
 mac terminal so that I can just type julia and julia will start up, just 
 the way python works when I type python.  It's something about a bash or 
 something.



[julia-users] Re: starting up julia from mac terminal (the way you would for python by typing python)

2014-09-02 Thread Christoph Ortner
The file `~/.bash_profile'  should contain something like this:

export 
PATH=~/Dropbox/Admin/scripts:/Applications/Julia-0.3.0.app/Contents/MacOS:/Users/ortner/anaconda/bin:$PATH

However, it will always open a new terminal, rather than opening julia in 
your current terminal. If anybody knows how to fix this, i would love to 
hear.

Christoph

On Tuesday, 2 September 2014 06:16:02 UTC+1, Anonymous wrote:

 I'm trying to figure out how to create an alias/shortcut whatever in the 
 mac terminal so that I can just type julia and julia will start up, just 
 the way python works when I type python.  It's something about a bash or 
 something.



[julia-users] Re: FastGauss.jl: fast computation of Gauss quadrature rules

2014-09-02 Thread Richard Dennis
Nice work.  You might also want to check out 
https://github.com/billmclean/GaussQuadrature.jl


On Monday, September 1, 2014 10:33:31 PM UTC+1, Alex Townsend wrote:

 I have written a package FastGauss.jl available here: 

 https://github.com/ajt60gaibb/FastGauss.jl

 to compute Gauss quadrature rules to 16-digit precision (so far Legendre, 
 Jacobi, Lobatto, Radau),
 aiming to be the fastest implementation in Julia. For example, this is how 
 long 
 it takes to compute 1,000,000 Gauss-Legendre nodes and weights: 

 tic(), GaussLegendre( 1000000 ); toc() 
 elapsed time: 0.336489122 seconds


 In my comparisons, GaussLegendre() is faster than Base.gauss() for n > 60 
 (nothing in it for 
 small n), but my implementation right now does not allow for higher 
 precision. I couldn't find a
 Gauss-Jacobi code in Base. 

 I am a Julia beginner (only been learning for 2 weeks) so I am assuming 
 the code can be 
 improved in a million and one ways. Please tell me if I've done something 
 that Julia does 
 not like. I am not sure if it is appropriate to make this an official 
 package.



[julia-users] CoordInterpGrid

2014-09-02 Thread Jude
Hi,

I want to use interpolation and have downloaded the Grid package but for 
some reason it won't allow me to use CoordInterpGrid. It works fine if I 
use InterpGrid, but when I try CoordInterpGrid it says "ERROR: 
CoordInterpGrid not defined". Does anyone know why this is and how I might 
get CoordInterpGrid to work? I have even tried it using the example 
provided by Tim Holy to make sure it was not a mistake of syntax but it 
still does not work. Any help would be really appreciated!

Cheers,
Jude


[julia-users] Holy freaking cow / Job well done

2014-09-02 Thread Carlo Kokoth
I remember checking Julia some time back, and thinking "This looks almost 
perfect."

Macros (and homoiconicity), multiple dispatch, the Types section of the 
manual is geek-porn, speedy more than enough, support for distributed 
computation, yada, yada (yeah, I'm a programming language fetishist ;-).

I wished basically only two things, threading support and some way to cache 
compiled code. 

And today I checked what is happening, and whadda-ya-know, 0.3.0 has been 
released, stating, among other things:

- System image caching for fast startup.
- Multi-process shared memory support. (multi-threading support is in 
progress and has been a major summer focus)

WoOOOoo. I hadn't expected such fast progress.

Next very pleasant surprise was trying out GTK. I feared (sadly running on 
windows at work) that the installation would fail because it would try to 
compile the dependencies from sources and I don't have mingw on PATH by 
default, but instead binaries were downloaded. And then, after writing a 
bit of test code, it worked without a glitch.

I had more trouble getting some packages to work with Python (no 
pre-compiled bins, and although I can manage, sometimes getting all the 
transitive dependencies is an afternoon of downloading yet another source 
archive, ./configure-ing, make-ing, digging into why it failed, patching, 
rinsing, repeating ...).

So right now, I'd give Julia about 6 points out of 5, (as 1.0 approaches, 
the score will most likely rise to 10 out of 5, don't skimp on the awesome 
;-).

In-fucking-credible work, thanks to everyone who helped to make it happen,
  C.K.


[julia-users] A bug?

2014-09-02 Thread xiongjieyi
julia> versioninfo()
Julia Version 0.4.0-dev+323
Commit ecd039c* (2014-08-24 16:57 UTC)
Platform Info:
  System: Linux (x86_64-redhat-linux)
  CPU: Intel(R) Xeon(R) CPU E7- 4830  @ 2.13GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT NO_AFFINITY NEHALEM)
  LAPACK: libopenblas
  LIBM: libopenlibm
  LLVM: libLLVM-3.3

julia> map((x,y,z)->x+y+z,[1,2,3],[2,3,4],[3,4,5])
3-element Array{Int64,1}:
  6
  9
 12

julia> map((x,y,z)->x+y+z,(1,2,3),(2,3,4),(3,4,5))
ERROR: BoundsError()
 in heads at tuple.jl:56
 in map at tuple.jl:59 (repeats 4 times)





Re: [julia-users] Strange Slicing Behaviour

2014-09-02 Thread Christoph Ortner
Tim - many thanks for the reply. To put this into context: I have 15y+ 
experience with matlab, and some limited experience with other languages 
(C,C++,Java,Fortran). 

Here is a code snippet that brought this up. It precomputes a lot of data 
that is then used in a variety of (non-standard) ways for Tight-Binding 
molecular dynamics. This is a quick and dirty first-attempt implementation 
to just get it to run.

 # the following arrays are generated elsewhere, d \in \{2,3\}, N is 
large, alpha real
 # R : d x N x N  array
 # E : N x N array

 # VERSION 1
 for a = 1:d
  hHamiltonian[a, a, :, :] = slice(hHamiltonian, a, a, :, :) - 
alpha * E
  for b = 1:d
   hHamiltonian[a,b,:,:] = slice(hHamiltonian, a, b, :, :) + 
alpha^2 * E .* slice(R, a, :, :) .* slice(R, b, :, :)
  end
 end

instead of what I would have liked to write:

 # VERSION 2
 for a = 1:d
  hHamiltonian[a, a, :, :] += -alpha * E
  for b = 1:d
   hHamiltonian[a, b, :, :] += alpha^2 * E .* R[a, :, :] .* 
R[b, :, :]
  end
 end


Granted, since writing the above post I read up on Comprehensions (first 
time I have used them, and quite like the result)

 # VERSION 3
 hHamiltonian = [ - alpha*E[m,n]*del[a,b] + alpha^2 * E[m,n] * R[a,m,n] 
* R[b,m,n]
 for a = 1:d, b=1:d, m = 1:N, n = 1:N]


I am quite happy with this last version, for the moment at least. Some 
points remain:
1. what is the performance of comprehensions compared with vectorisation or 
straight for-loops? (A plain-loop version is sketched after this list, for 
comparison.)
2. The current slicing behaviour of Julia is just unexpectedly clunky. 
Whether or not VERSION 2 is good code to write, there are many instances 
where I would have written like this without a second thought. Another 
example is vectorised finite element assembly which looks very similar, but 
more complex.
3. VERSION 2 is still the most natural way to write for many people who do 
quick and dirty numerical experiments and don't want to think too much 
about good coding practises. These are the kind of people who would prefer 
the code to run for 2 days rather than 2 hours, if it means they spend 1/10 
of their time coding.
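
Regarding point 1, here is a hedged plain-loop version of the same computation as VERSION 3, assuming del[a,b] is the Kronecker delta (written below as (a == b)) and using the d, N, alpha, E, R from the snippet above:

    # VERSION 4: straight loops, innermost index varying fastest
    hHamiltonian = Array(Float64, d, d, N, N)
    for n = 1:N, m = 1:N, b = 1:d, a = 1:d
        hHamiltonian[a,b,m,n] = -alpha*E[m,n]*(a == b) + alpha^2*E[m,n]*R[a,m,n]*R[b,m,n]
    end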

Any comments will be helpful. Thanks,
   - Christoph
 


On Tuesday, 2 September 2014 11:23:34 UTC+1, Tim Holy wrote:

 Your example involves two tricky issues: slice behavior and the fact that, 
 despite appearances, A += b is not in-place. See issues #3424, #3217, and 
 precedents they link to. 

 I'd be interested in hearing more detail about how using slice gets nasty; 
 as 
 you say, from this example slice doesn't look so bad. In trying to fix 
 this, we 
 want to make sure we're aware of all the issues. 

 --Tim 

 On Monday, September 01, 2014 10:02:54 PM Christoph Ortner wrote: 
  a = rand(3,3,3,3) 
  b = rand(3,3) 
  # this works: 
  a[1,1,:,:] = slice(a,1,1,:,:)+b 
  # this does not work: 
  a[1,1,:,:] += b 
  
  This example does not look so bad, but once you use expressive variable 
  names and more dimensions it quickly gets very nasty. Because of it, I 
 am 
  doing less vectorisation than I would prefer. 
  
  I know there is a lot of discussion on slicing on the Julia issues list, 
 so 
  I did not want to post another issue there. 
  
  Is this likely to be resolved in future releases? Are there elegant 
  alternatives? 



[julia-users] Re: FastGauss.jl: fast computation of Gauss quadrature rules

2014-09-02 Thread Alex Townsend


On Tuesday, 2 September 2014 03:00:10 UTC-4, Jason Merrill wrote:

 On Monday, September 1, 2014 2:33:31 PM UTC-7, Alex Townsend wrote:

 I have written a package FastGauss.jl available here: 

 https://github.com/ajt60gaibb/FastGauss.jl


 I am a Julia beginner (only been learning for 2 weeks) so I am assuming 
 the code can be 
 improved in a million and one ways. Please tell me if I've done something 
 that Julia does 
 not like. I am not sure if it is appropriate to make this an official 
 package.

  
 One thing to look out for is making sure your functions have consistent 
 return types. E.g. in 
 https://github.com/ajt60gaibb/FastGauss.jl/blob/91e2ac656b856876563d5aacf7b5a405e068b3da/src/GaussLobatto.jl#L4
  
 you have

Thanks! I tried to get the return types consistent, but obviously missed a 
few. I've been trying to use @code_typed to tell me this 
information, but reading the output is a little difficult (at the moment). 


 if ( n == 1 ) 
 error("Lobatto undefined for n = 1.") 
 elseif ( n == 2 ) 
 x = ([-1.0,1.0],[1.0,1.0]) 
 elseif ( n == 3 ) 
 x = ([-1, 0, 1], [1.0, 4.0, 1.0]/3)
 # ...

 In the n==2 case, you're returning a tuple of two float vectors, but in 
 the n==3 case, you're returning a tuple with one Int vector and one float 
 vector.

 This issue crops up in a few other places, including sometimes returning a 
 number and other times returning a vector.

 Another thing you might want to consider is devectorizing compound 
 operations on vectors to avoid allocating containers to store intermediate 
 results (either using https://github.com/lindahua/Devectorize.jl or 
 manually). To pick one random place where this might be relevant: 
 https://github.com/ajt60gaibb/FastGauss.jl/blob/5e3a8a2f9a7e327622bdd43f69bb712afcb16743/src/GaussJacobi.jl#L36

Good point. This makes a _huge_ difference in speed. Thank you. I had heard 
that devectorizing was a good thing in Julia, but I'm from MATLAB where it's 
the opposite. I'll go through the code and update. 


 I'm not 100% sure whether this will be relevant to your API, but many high 
 performance Julia libs expose mutating versions of functions (marked with 
 an ! at the end) that normally return arrays, allowing the caller to 
 instead pass in a preallocated array to store the result in, which can 
 then be reused from call to call.

OK. I'll have a closer look at some commands with the ! suffix. 


 If a quadrature rule may in some cases be used only once, it might be nice 
 if there were a way to apply the quadrature method to a function on the fly 
 without ever generating the entire node and weight vector.

Thank you very much Jason.
 


[julia-users] Could it be a bug of function map(fun,tuple1,tuple2,tuple3)?

2014-09-02 Thread xiongjieyi
julia> versioninfo()
Julia Version 0.4.0-dev+323
Commit ecd039c* (2014-08-24 16:57 UTC)
Platform Info:
  System: Linux (x86_64-redhat-linux)
  CPU: Intel(R) Xeon(R) CPU E7- 4830  @ 2.13GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT NO_AFFINITY NEHALEM)
  LAPACK: libopenblas
  LIBM: libopenlibm
  LLVM: libLLVM-3.3

julia> map((x,y,z)->x+y+z,[1,2,3],[2,3,4],[3,4,5])
3-element Array{Int64,1}:
  6
  9
 12

julia> map((x,y,z)->x+y+z,(1,2,3),(2,3,4),(3,4,5))
ERROR: BoundsError()
 in heads at tuple.jl:56
 in map at tuple.jl:59 (repeats 4 times)


Re: [julia-users] A bug?

2014-09-02 Thread Elliot Saba
Addition doesn't seem to be defined for tuples:

julia> (1,2,3) + (4,5,6)
ERROR: `+` has no method matching +(::(Int64,Int64,Int64),
::(Int64,Int64,Int64))


The BoundsError is quite misleading however. Not sure what it's really
doing in there.  Tuples are pretty separate from arrays in Julia; in
other languages such as Python I see tuples and lists used
interchangeably quite a bit, but in Julia the most common use of
tuples I've found is in metaprogramming (collecting function arguments and
the like).  If you want to perform algebraic operations you should be using
Arrays that contain well defined types.  This is, after all, one reason
Julia is able to achieve high speeds; if Julia knows that every element in
your array is an Int64, or a Float32 or whatever you like, then an
operation such as you have above can be performed very quickly without
performing type-checking on each individual element in your array.
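
If you do need elementwise arithmetic on small fixed-size collections, here are a couple of workaround sketches; tup_add is just a hypothetical helper, not something in Base:

    map((x,y,z)->x+y+z, [1,2,3], [2,3,4], [3,4,5])   # use Arrays, as above
    tup_add(a, b) = tuple(([a...] + [b...])...)      # splat the tuples into Vectors and back
    tup_add((1,2,3), (4,5,6))                        # (5,7,9)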
-E


On Tue, Sep 2, 2014 at 8:49 AM, xiongji...@gmail.com wrote:

 julia> versioninfo()
 Julia Version 0.4.0-dev+323
 Commit ecd039c* (2014-08-24 16:57 UTC)
 Platform Info:
   System: Linux (x86_64-redhat-linux)
   CPU: Intel(R) Xeon(R) CPU E7- 4830  @ 2.13GHz
   WORD_SIZE: 64
   BLAS: libopenblas (USE64BITINT NO_AFFINITY NEHALEM)
   LAPACK: libopenblas
   LIBM: libopenlibm
   LLVM: libLLVM-3.3

 julia> map((x,y,z)->x+y+z,[1,2,3],[2,3,4],[3,4,5])
 3-element Array{Int64,1}:
   6
   9
  12

 julia> map((x,y,z)->x+y+z,(1,2,3),(2,3,4),(3,4,5))
 ERROR: BoundsError()
  in heads at tuple.jl:56
  in map at tuple.jl:59 (repeats 4 times)






Re: [julia-users] Holy freaking cow / Job well done

2014-09-02 Thread Elliot Saba
Well thank you very much for the exuberant message.  A lot of people have
put a lot of work into this release, and it's nice for everyone involved to
wake up to messages like this in their inbox. :)
-E




On Tue, Sep 2, 2014 at 7:38 AM, Carlo Kokoth carlo.kok...@gmail.com wrote:

 I remember checking Julia some time back, and thinking This looks almost
 perfect.

 Macros (and homoiconicity), multiple dispatch, the Types section of the
 manual is geek-porn, speedy more than enough, support for distributed
 computation, yada, yada (yeah, I'm a programming language fetishist ;-).

 I wished basically only two things, threading support and some way to
 cache compiled code.

 And today I checked what is happenning, and whadda-ya-know, 0.3.0 has been
 released, stating, among other things:

 - System image caching for fast startup.
 - Multi-process shared memory support. (multi-threading support is in
 progress and has been a major summer focus)

 WoOOOoo. Haven't expected so fast progress.

 Next very pleasant surprise was trying out GTK. I feared (sadly running on
 windows at work) that the installation would fail because it would try to
 compile the dependencies from sources and I don't have mingw on PATH by
 default, but instead binaries were downloaded. And then, after writing a
 bit of test code, it worked without a glitch.

 I had more troubles with getting some packages work with python (no
 pre-compiled bins, and although I can manage, sometimes getting all the
 transitive dependencies is an afternoon of downloading yet another source
 archive, ./configure-ing, make-ing, digging into why it failed, patching,
 rinsing, repeating ...).

 So right now, I'd give Julia about 6 points out of 5, (as 1.0 approaches,
 the score will most likely rise to 10 out of 5, don't skimp on the awesome
 ;-).

 In-fucking-credible work, thanks to everyone who helped to make it happen,
   C.K.



Re: [julia-users] CoordInterpGrid

2014-09-02 Thread Tim Holy
It's hard to say what's happening without more detail. Can you give us an 
explicit example of the commands you're trying to run? Also, what version of 
Julia are you using, and what version of Grid does Pkg.status() report?

--Tim

On Tuesday, September 02, 2014 04:21:59 AM Jude wrote:
 Hi,
 
 I want to use interpolation and have downloaded the Grid package but for
 some reason it won't allow me to use CoordInterpGrid. It works fine if I
 use interpGrid but when I try CoordIntepGrid it says  ERROR:
 CoordInterpGrid not defined. Does anyone know why this is and how I might
 get CoordInterpGrid to work? I have even tried it using the example
 provided by Tim Holy to make sure it was not a mistake of syntax but it
 still does not work. Any help would be really appreciated!
 
 Cheers,
 Jude



Re: [julia-users] Re: starting up julia from mac terminal (the way you would for python by typing python)

2014-09-02 Thread Elliot Saba
The julia executable inside *Julia-0.3.0.app/Contents/MacOS* is a wrapper
executable that launches a terminal and runs the true julia executable
inside it.  You can get at the true julia executable by adding
*Julia-0.3.0.app/Contents/resources/julia/bin* to your path.  Note that
Christoph's line is adding some extra paths to his PATH, not just Julia's.
 Just adding Julia's path to your PATH environment variable is done via:

export PATH="/Applications/Julia-0.3.0.app/Contents/resources/julia/bin:$PATH

Assuming, of course, that your *Julia-0.3.0.app* file is in */Applications*.
 It doesn't need to be, you can run it from anywhere, but putting it there
is rather standard, I suppose.
-E


On Tue, Sep 2, 2014 at 8:28 AM, Christoph Ortner christophortn...@gmail.com
 wrote:

 The file `~/.bash_profile'  should contain something like this:

 export PATH=~/Dropbox/Admin/scripts:/Applications/Julia-0.3.0.app/
 Contents/MacOS:/Users/ortner/anaconda/bin:$PATH

 However, it will always open a new terminal, rather than opening julia in
 your current terminal. If anybody knows how to fix this, i would love to
 hear.

 Christoph


 On Tuesday, 2 September 2014 06:16:02 UTC+1, Anonymous wrote:

 I'm trying to figure out how to create an alias/shortcut whatever in the
 mac terminal so that I can just type julia and julia will start up, just
 the way python works when I type python.  It's something about a bash or
 something.




Re: [julia-users] Re: starting up julia from mac terminal (the way you would for python by typing python)

2014-09-02 Thread Rob Goedman
In my ~/.bash_profile I've defined a couple of aliases, like

alias julia4=/Users/rob/Projects/Julia/julia/julia

and then type 'julia4' in the terminal to start Julia in that same terminal.

The first part, /Users/rob/Projects/Julia/julia/, is the path to the directory 
where I make Julia0.4.

Similarly, the 'julia' alias points to the official v0.3 binary in 
/Applications/julia.app/.../bin/julia.

From memory, haven't got my laptop with me while traveling,
Regards,
Rob

Sent from Rob Goedman's iPad Mini


 On Sep 2, 2014, at 08:28, Christoph Ortner christophortn...@gmail.com wrote:
 
 The file `~/.bash_profile'  should contain something like this:
 
 export 
 PATH=~/Dropbox/Admin/scripts:/Applications/Julia-0.3.0.app/Contents/MacOS:/Users/ortner/anaconda/bin:$PATH
 
 However, it will always open a new terminal, rather than opening julia in 
 your current terminal. If anybody knows how to fix this, i would love to hear.
 
 Christoph
 
 On Tuesday, 2 September 2014 06:16:02 UTC+1, Anonymous wrote:
 I'm trying to figure out how to create an alias/shortcut whatever in the mac 
 terminal so that I can just type julia and julia will start up, just the 
 way python works when I type python.  It's something about a bash or 
 something.


Re: [julia-users] Re: starting up julia from mac terminal (the way you would for python by typing python)

2014-09-02 Thread Elliot Saba
Quick correction: that should be a capital R in *resources*, and there 
should be a closing quote at the end of that command. E.g.:

export PATH="/Applications/Julia-0.3.0.app/Contents/Resources/julia/bin:$PATH"


On Tue, Sep 2, 2014 at 9:30 AM, Elliot Saba staticfl...@gmail.com wrote:

 The julia executable inside *Julia-0.3.0.app/Contents/MacOS* is a
 wrapper executable that launches a terminal and runs the true julia
 executable inside it.  You can get at the true julia executable by adding
 *Julia-0.3.0.app/Contents/resources/julia/bin* to your path.  Note that
 Christoph's line is adding some extra paths to his PATH, not just Julia's.
  Just adding Julia's path to your PATH environment variable is done via:

 export PATH="/Applications/Julia-0.3.0.app/Contents/resources/julia/bin:$PATH

 Assuming, of course, that your *Julia-0.3.0.app* file is in
 */Applications*.  It doesn't need to be, you can run it from anywhere,
 but putting it there is rather standard, I suppose.
  -E


 On Tue, Sep 2, 2014 at 8:28 AM, Christoph Ortner 
 christophortn...@gmail.com wrote:

 The file `~/.bash_profile'  should contain something like this:

 export PATH=~/Dropbox/Admin/scripts:/Applications/Julia-0.3.0.app/
 Contents/MacOS:/Users/ortner/anaconda/bin:$PATH

 However, it will always open a new terminal, rather than opening julia in
 your current terminal. If anybody knows how to fix this, i would love to
 hear.

 Christoph


 On Tuesday, 2 September 2014 06:16:02 UTC+1, Anonymous wrote:

 I'm trying to figure out how to create an alias/shortcut whatever in the
 mac terminal so that I can just type julia and julia will start up, just
 the way python works when I type python.  It's something about a bash or
 something.





Re: [julia-users] Holy freaking cow / Job well done

2014-09-02 Thread Stefan Karpinski
Yes, thanks. It's not the most glamorous work, but a huge amount of effort
has gone into making installation smooth, so it's pleasing that GTK just
worked for you.


On Tue, Sep 2, 2014 at 9:27 AM, Elliot Saba staticfl...@gmail.com wrote:

 Well thank you very much for the exuberant message.  A lot of people have
 put a lot of work into this release, and it's nice for everyone involved to
 wake up to messages like this in their inbox. :)
 -E




 On Tue, Sep 2, 2014 at 7:38 AM, Carlo Kokoth carlo.kok...@gmail.com
 wrote:

 I remember checking Julia some time back, and thinking This looks almost
 perfect.

 Macros (and homoiconicity), multiple dispatch, the Types section of the
 manual is geek-porn, speedy more than enough, support for distributed
 computation, yada, yada (yeah, I'm a programming language fetishist ;-).

 I wished basically only two things, threading support and some way to
 cache compiled code.

 And today I checked what is happenning, and whadda-ya-know, 0.3.0 has
 been released, stating, among other things:

 - System image caching for fast startup.
 - Multi-process shared memory support. (multi-threading support is in
 progress and has been a major summer focus)

 WoOOOoo. Haven't expected so fast progress.

 Next very pleasant surprise was trying out GTK. I feared (sadly running
 on windows at work) that the installation would fail because it would try
 to compile the dependencies from sources and I don't have mingw on PATH by
 default, but instead binaries were downloaded. And then, after writing a
 bit of test code, it worked without a glitch.

 I had more troubles with getting some packages work with python (no
 pre-compiled bins, and although I can manage, sometimes getting all the
 transitive dependencies is an afternoon of downloading yet another source
 archive, ./configure-ing, make-ing, digging into why it failed, patching,
 rinsing, repeating ...).

 So right now, I'd give Julia about 6 points out of 5, (as 1.0 approaches,
 the score will most likely rise to 10 out of 5, don't skimp on the awesome
 ;-).

 In-fucking-credible work, thanks to everyone who helped to make it happen,
   C.K.





[julia-users] help with ccall example

2014-09-02 Thread Alan Crawford
hi!

i'm trying to use ccall in the following example:

julia> x = zeros(10) 
10-element Array{Float64,1}: 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0

julia> w = zeros(10) 
10-element Array{Float64,1}: 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0 
0.0 

julia> n = 4 
4 
julia> ccall( (:cc, "sparselib"), (Int, Ptr{Float64},Ptr{Float64}), (Int, 
Ptr{Float64},Ptr{Float64}),n,x,w) 
ERROR: error compiling anonymous: in anonymous: ccall: missing return type

the calling interface of the cc function is

void cc ( int n, double x[], double w[] )

ie it returns nothing.

i compiled this library and it seems to work:

julia> ccall( (:cpu_time, "sparselib"), Int32, ())

2395765

thanks!





Re: [julia-users] A bug?

2014-09-02 Thread xiongjieyi
I don't think it is the problem of (+). Since

julia> map((x...)->x,(1,2),(3,4),(5,6))
ERROR: BoundsError()
 in heads at tuple.jl:56
 in map at tuple.jl:59 (repeats 3 times)

has the same problem. I think the expected result should be 
((1,3,5),(2,4,6)).

also, 
julia> map((x...)->x,[1,2],[3,4],[5,6])
2-element Array{(Int64,Int64,Int64),1}:
 (1,3,5)
 (2,4,6)

On Tuesday, September 2, 2014 3:24:39 PM UTC+2, Elliot Saba wrote:

 Addition doesn't seem to be defined for tuples:

 julia> (1,2,3) + (4,5,6)
 ERROR: `+` has no method matching +(::(Int64,Int64,Int64), 
 ::(Int64,Int64,Int64))


 The BoundsError is quite misleading however. Not sure what it's really 
 doing in there.  Tuples are pretty separated from arrays in Julia however, 
 in other languages such as Python I see tuples and lists used 
 interchangeably quite a bit, however in Julia the most common usage of 
 tuples I've found are in metaprogramming (collecting function arguments and 
 the like).  If you want to perform algebraic operations you should be using 
 Arrays that contain well defined types.  This is, after all, one reason 
 Julia is able to achieve high speeds; if Julia knows that every element in 
 your array is an Int64, or a Float32 or whatever you like, then an 
 operation such as you have above can be performed very quickly without 
 performing type-checking on each individual element in your array.
 -E


 On Tue, Sep 2, 2014 at 8:49 AM, xiong...@gmail.com wrote:

 julia> versioninfo()
 Julia Version 0.4.0-dev+323
 Commit ecd039c* (2014-08-24 16:57 UTC)
 Platform Info:
   System: Linux (x86_64-redhat-linux)
   CPU: Intel(R) Xeon(R) CPU E7- 4830  @ 2.13GHz
   WORD_SIZE: 64
   BLAS: libopenblas (USE64BITINT NO_AFFINITY NEHALEM)
   LAPACK: libopenblas
   LIBM: libopenlibm
   LLVM: libLLVM-3.3

 julia> map((x,y,z)->x+y+z,[1,2,3],[2,3,4],[3,4,5])
 3-element Array{Int64,1}:
   6
   9
  12

 julia> map((x,y,z)->x+y+z,(1,2,3),(2,3,4),(3,4,5))
 ERROR: BoundsError()
  in heads at tuple.jl:56
  in map at tuple.jl:59 (repeats 4 times)






Re: [julia-users] A bug?

2014-09-02 Thread Elliot Saba
Yeah, looks like the heads() tails() stuff in tuple.jl around line 60 is
getting confused because we've got nested tuples, so it doesn't recognize
when the tuple is actually empty.  Feel free to open an issue and ping me.
-E


Re: [julia-users] CoordInterpGrid

2014-09-02 Thread Jude
Hi Tim,

I am using Julia v0.3.0 and Grid 0.3.3

I have tried running with your example code just to see what is going on 
and 
yi = InterpGrid(y, BCnil, InterpQuadratic)   works fine.

But if I try using  z_2di = CoordInterpGrid((x,y), z_2d, BCnil, 
InterpQuadratic)
it says ERROR: CoordInterpGrid not defined.

Moreover, I tried using the InterpGrid command but replacing 
InterpQuadratic with InterpLinear and I get a similar error
 ERROR: InterpLinear not defined.

So I am confused as to why it works fine for InterpQuadratic but not 
InterpLinear and why CoordInterpGrid doesn't work at all!

Thanks!
Jude


On Tuesday, September 2, 2014 2:29:45 PM UTC+1, Tim Holy wrote:

 It's hard to say what's happening without more detail. Can you give us an 
 explicit example of the commands you're trying to run? Also, what version 
 of 
 Julia are you using, and what version of Grid does Pkg.status() report? 

 --Tim 

 On Tuesday, September 02, 2014 04:21:59 AM Jude wrote: 
  Hi, 
  
  I want to use interpolation and have downloaded the Grid package but for 
  some reason it won't allow me to use CoordInterpGrid. It works fine if I 
  use interpGrid but when I try CoordIntepGrid it says  ERROR: 
  CoordInterpGrid not defined. Does anyone know why this is and how I 
 might 
  get CoordInterpGrid to work? I have even tried it using the example 
  provided by Tim Holy to make sure it was not a mistake of syntax but it 
  still does not work. Any help would be really appreciated! 
  
  Cheers, 
  Jude 



Re: [julia-users] help with ccall example

2014-09-02 Thread Andreas Noack
When the function you are calling is void you should use Void as the return
type instead of the first (Int, Ptr{Float64},Ptr{Float64}), i.e.

ccall( (:cc, "sparselib"), Void, (Int, Ptr{Float64},Ptr{Float64}),n,x,w)
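
As a side note, a sketch of wrapping this in a small Julia function, with the library and symbol names as in your message; since the C prototype takes an int, Cint is the safer argument type:

    function cc_rule(n::Integer)
        x = zeros(Float64, n)
        w = zeros(Float64, n)
        ccall((:cc, "sparselib"), Void, (Cint, Ptr{Float64}, Ptr{Float64}), n, x, w)
        return x, w   # the C routine fills x and w in place
    end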

Best regards

Andreas Noack


2014-09-02 9:04 GMT-04:00 Alan Crawford a.r.crawf...@gmail.com:

 hi!

 i'm trying to use ccall in the following example:

 julia x = zeros(10)
 10-element Array{Float64,1}:
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0

 julia w = zeros(10)
 10-element Array{Float64,1}:
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0
 0.0

 julia n = 4
 4
 julia ccall( (:cc, sparselib), (Int, Ptr{Float64},Ptr{Float64}), (Int,
 Ptr{Float64},Ptr{Float64}),n,x,w)
 ERROR: error compiling anonymous: in anonymous: ccall: missing return type

 the calling interface of the cc function is

 void cc ( int n, double x[], double w[] )

 ie it returns nothing.

 i compiled this library and it seems to work:

 julia ccall( (:cpu_time, sparselib), Int32, ())

 2395765

 thanks!






Re: [julia-users] help with ccall example

2014-09-02 Thread Alan Crawford
Thanks, worked great.
Alan

On Tuesday, 2 September 2014 14:55:47 UTC+1, Andreas Noack wrote:

 When the function you are calling is void you should use Void as the 
 return type instead of the first (Int, Ptr{Float64},Ptr{Float64}), i.e.

 ccall( (:cc, "sparselib"), Void, (Int, Ptr{Float64},Ptr{Float64}),n,x,w)

 Best regards

 Andreas Noack


 2014-09-02 9:04 GMT-04:00 Alan Crawford a.r.cr...@gmail.com:

 hi!

 i'm trying to use ccall in the following example:

 julia x = zeros(10) 
 10-element Array{Float64,1}: 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0

 julia w = zeros(10) 
 10-element Array{Float64,1}: 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 
 0.0 

 julia n = 4 
 4 
 julia ccall( (:cc, sparselib), (Int, Ptr{Float64},Ptr{Float64}), (Int, 
 Ptr{Float64},Ptr{Float64}),n,x,w) 
 ERROR: error compiling anonymous: in anonymous: ccall: missing return type

 the calling interface of the cc function is

 void cc ( int n, double x[], double w[] )

 ie it returns nothing.

 i compiled this library and it seems to work:

 julia ccall( (:cpu_time, sparselib), Int32, ())

 2395765

 thanks!






[julia-users] Compressing .jld files

2014-09-02 Thread Douglas Bates
Now that the JLD format can handle DataFrame objects I would like to switch 
from storing data sets in .RData format to .jld format.  Datasets stored in 
.RData format are compressed after they are written.  The default 
compression is gzip.  Bzip2 and xz compression are also available.  The 
compression can make a substantial difference in the file size because the 
data values are often highly repetitive.

JLD is different in scope in that .jld files can be queried using external 
programs like h5ls and the files can have new data added or existing data 
edited or removed.  The .RData format is an archival format.  Once the file 
is written it cannot be modified in place.

Given these differences I can appreciate that JLD files are not compressed. 
 Nevertheless I think it would be useful to adopt a convention in the JLD 
module for accessing data from files with a .jld.xz or .jld.7z extension. 
 It could be as simple as uncompressing the files in a temporary directory, 
reading then removing, or it could be more sophisticated.  I notice that my 
versions of libjulia.so on an Ubuntu 64-bit system are linked against both 
libz.so and liblzma.so

$ ldd /usr/lib/x86_64-linux-gnu/julia/libjulia.so 
linux-vdso.so.1 =>  (0x7fff5214f000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x7f62932ee000)
libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x7f62930d5000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x7f6292dce000)
librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x7f6292bc6000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 
(0x7f62929a8000)
libunwind.so.8 => /usr/lib/x86_64-linux-gnu/libunwind.so.8 
(0x7f629278c000)
libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 
(0x7f6292488000)
libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x7f6292272000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x7f6291eab000)
/lib64/ld-linux-x86-64.so.2 (0x7f62944b3000)
liblzma.so.5 => /lib/x86_64-linux-gnu/liblzma.so.5 (0x7f6291c89000)


AFAIK the user-level interface to gzip requires the GZip package.  Unless I 
have missed something (always a possibility) there is no user-level 
interface to liblzma in Julia.  If the library is going to be linked 
anyway, would it make sense to provide a user-level interface in Julia? 
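
For concreteness, the gzip route I mean looks roughly like this (a sketch; "data.csv.gz" is a made-up filename, and gzopen is the GZip package's do-block form):

    using GZip
    gzopen("data.csv.gz") do io
        txt = readall(io)   # decompressed contents as a String
        # ... parse txt ...
    end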


[julia-users] Re: Compressing .jld files

2014-09-02 Thread Jake Bolewski
HDF5 supports pluggable compression schemes, so this seems like it should 
be handled within the hdf5 library.  The fastest seems to be blosc which is 
written by the PyTables author.  Although this is not shipped by default 
with HDF5, if we include it in the BinDeps builds for hdf5 it would be a 
nice compressed default format.

On Tuesday, September 2, 2014 11:30:39 AM UTC-4, Douglas Bates wrote:

 Now that the JLD format can handle DataFrame objects I would like to 
 switch from storing data sets in .RData format to .jld format.  Datasets 
 stored in .RData format are compressed after they are written.  The default 
 compression is gzip.  Bzip2 and xz compression are also available.  The 
 compression can make a substantial difference in the file size because the 
 data values are often highly repetitive.

 JLD is different in scope in that .jld files can be queried using external 
 programs like h5ls and the files can have new data added or existing data 
 edited or removed.  The .RData format is an archival format.  Once the file 
 is written it cannot be modified in place.

 Given these differences I can appreciate that JLD files are not 
 compressed.  Nevertheless I think it would be useful to adopt a convention 
 in the JLD module for accessing data from files with a .jld.xz or .jld.7z 
 extension.  It could be as simple as uncompressing the files in a temporary 
 directory, reading then removing, or it could be more sophisticated.  I 
 notice that my versions of libjulia.so on an Ubuntu 64-bit system are 
 linked against both libz.so and liblzma.so

 $ ldd /usr/lib/x86_64-linux-gnu/julia/libjulia.so 
 linux-vdso.so.1 =  (0x7fff5214f000)
 libdl.so.2 = /lib/x86_64-linux-gnu/libdl.so.2 (0x7f62932ee000)
 libz.so.1 = /lib/x86_64-linux-gnu/libz.so.1 (0x7f62930d5000)
 libm.so.6 = /lib/x86_64-linux-gnu/libm.so.6 (0x7f6292dce000)
 librt.so.1 = /lib/x86_64-linux-gnu/librt.so.1 (0x7f6292bc6000)
 libpthread.so.0 = /lib/x86_64-linux-gnu/libpthread.so.0 
 (0x7f62929a8000)
 libunwind.so.8 = /usr/lib/x86_64-linux-gnu/libunwind.so.8 
 (0x7f629278c000)
 libstdc++.so.6 = /usr/lib/x86_64-linux-gnu/libstdc++.so.6 
 (0x7f6292488000)
 libgcc_s.so.1 = /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x7f6292272000)
 libc.so.6 = /lib/x86_64-linux-gnu/libc.so.6 (0x7f6291eab000)
 /lib64/ld-linux-x86-64.so.2 (0x7f62944b3000)
 liblzma.so.5 = /lib/x86_64-linux-gnu/liblzma.so.5 (0x7f6291c89000)


 AFAIK the user-level interface to gzip requires the GZip package.  Unless 
 I have missed something (always a possibility) there is no user-level 
 interface to liblzma in Julia.  If the library is going to be linked 
 anyway, would it make sense to provide a user-level interface in Julia? 



[julia-users] Nonuniform arrays

2014-09-02 Thread Reid Atcheson
A common situation I run into in my finite element codes is the need to 
have offset-indexed arrays. This can happen if a mesh consists of both 
straight-edged and curved-edged triangles: the offsets effectively 
tell you how many vertices are used in the representation of the curves, and 
you can then interpolate them as you like. It also arises in the situation 
of p-adaptive finite elements, where the polynomial order may be 
variable over the domain.

In both of these situations it is desirable to maintain data contiguity 
because in the end things will be passed to many black-box linear algebra 
routines which often have as an implicit assumption that the data is 
contiguous in memory.

I have created a simple library for handling this situation here:

https://github.com/ReidAtcheson/NonuniformArray.jl

Is there perhaps a more appropriate way to handle this that I am unaware 
of? Would others find this kind of library useful, and if so perhaps know 
how it could be canonicalized to behave in a more proper Julia way (if it 
does not already).
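
For readers of the thread, a minimal sketch of the underlying idea (one contiguous data vector plus an offsets vector); this is not necessarily how NonuniformArray.jl implements it:

    immutable RaggedVec{T}
        data::Vector{T}       # all rows stored back-to-back, contiguous in memory
        offsets::Vector{Int}  # offsets[i]:offsets[i+1]-1 are the indices of row i
    end

    getrow(r::RaggedVec, i::Integer) = sub(r.data, r.offsets[i]:r.offsets[i+1]-1)

    # r = RaggedVec([1.0, 2.0, 3.0, 4.0, 5.0], [1, 3, 6])  # rows of length 2 and 3
    # getrow(r, 2)   # a view into r.data[3:5]; no copy, still contiguous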


Thank you,


-Reid


Re: [julia-users] Re: Compressing .jld files

2014-09-02 Thread Kevin Squire
+1 for blosc.  It's quite a nice bit of work, and if I remember correctly,
from the user's perspective, its use is transparent.

Cheers,
   Kevin


On Tue, Sep 2, 2014 at 8:52 AM, Jake Bolewski jakebolew...@gmail.com
wrote:

 HDF5 supports pluggable compression schemes, so this seems like it should
 be handled within the hdf5 library.  The fastest seems to be blosc which is
 written by the PyTables author.  Although this is not shipped by default
 with HDF5, if we include it in the BinDeps builds for hdf5 it would be a
 nice compressed default format.


 On Tuesday, September 2, 2014 11:30:39 AM UTC-4, Douglas Bates wrote:

 Now that the JLD format can handle DataFrame objects I would like to
 switch from storing data sets in .RData format to .jld format.  Datasets
 stored in .RData format are compressed after they are written.  The default
 compression is gzip.  Bzip2 and xz compression are also available.  The
 compression can make a substantial difference in the file size because the
 data values are often highly repetitive.

 JLD is different in scope in that .jld files can be queried using
 external programs like h5ls and the files can have new data added or
 existing data edited or removed.  The .RData format is an archival format.
  Once the file is written it cannot be modified in place.

 Given these differences I can appreciate that JLD files are not
 compressed.  Nevertheless I think it would be useful to adopt a convention
 in the JLD module for accessing data from files with a .jld.xz or .jld.7z
 extension.  It could be as simple as uncompressing the files in a temporary
 directory, reading then removing, or it could be more sophisticated.  I
 notice that my versions of libjulia.so on an Ubuntu 64-bit system are
 linked against both libz.so and liblzma.so

 $ ldd /usr/lib/x86_64-linux-gnu/julia/libjulia.so
 linux-vdso.so.1 =  (0x7fff5214f000)
  libdl.so.2 = /lib/x86_64-linux-gnu/libdl.so.2 (0x7f62932ee000)
 libz.so.1 = /lib/x86_64-linux-gnu/libz.so.1 (0x7f62930d5000)
  libm.so.6 = /lib/x86_64-linux-gnu/libm.so.6 (0x7f6292dce000)
 librt.so.1 = /lib/x86_64-linux-gnu/librt.so.1 (0x7f6292bc6000)
  libpthread.so.0 = /lib/x86_64-linux-gnu/libpthread.so.0
 (0x7f62929a8000)
 libunwind.so.8 = /usr/lib/x86_64-linux-gnu/libunwind.so.8
 (0x7f629278c000)
  libstdc++.so.6 = /usr/lib/x86_64-linux-gnu/libstdc++.so.6
 (0x7f6292488000)
 libgcc_s.so.1 = /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x7f6292272000)
  libc.so.6 = /lib/x86_64-linux-gnu/libc.so.6 (0x7f6291eab000)
 /lib64/ld-linux-x86-64.so.2 (0x7f62944b3000)
  liblzma.so.5 = /lib/x86_64-linux-gnu/liblzma.so.5 (0x7f6291c89000)


 AFAIK the user-level interface to gzip requires the GZip package.  Unless
 I have missed something (always a possibility) there is no user-level
 interface to liblzma in Julia.  If the library is going to be linked
 anyway, would it make sense to provide a user-level interface in Julia?




[julia-users] #JuliaLang anime character design spec.

2014-09-02 Thread Takeshi Kimura
Hi there,

I decided to establish #JuliaLang anime character project in Japan.
We now welcome your comments in this thread about the character spec 
(costume, etc.) if you are interested in this project.
The design specification deadline is September 10th, JST (+9:00).

For more details, see:
http://mechawooser.blogspot.jp/2014/09/julialang-anime-character-project.html

We realize that this character may contain the #JuliaLang logo or the #JuliaLang 
three-circle icon, and in this case 
our anime character may count as a CC derivative work of the #JuliaLang logo.

I have two questions for julia-user:

Q1. What type of costume or appearance do you prefer for this character? Would 
it be better for her to hold the logo in her arms?

Q2. Which type of CC License is suitable for this character itself?

Comments are welcome!

Best Regards,

 Takeshi KIMURA




[julia-users] how to turn off redirect_stdout() ?

2014-09-02 Thread Florian Oswald
hi!

this works fine:

julia> rdstdout, wrstdout = redirect_stdout()
(Pipe(open, 0 bytes waiting),Pipe(open, 0 bytes waiting))

julia> run(`date`)

julia> s = readavailable(rdstdout)
"Tue  2 Sep 2014 17:11:33 BST\n"

but now I want to turn the redirection off again. how do I do this?

thanks!


Re: [julia-users] Re: Compressing .jld files

2014-09-02 Thread Milan Bouchet-Valat
On Tuesday, 2 September 2014 at 09:03 -0700, Kevin Squire wrote:
 +1 for blosc.  It's quite a nice bit of work, and if I remember
 correctly, from the user's perspective, it's use is transparent.
This really sounds great!

Looking forward to the day when Julia will allow working seamlessly with
DataFrames backed by compressed HDF5 files. This would be a killer
feature for people having a hard time working with large databases in
R. :-)


Regards



Re: [julia-users] Julia IDE

2014-09-02 Thread Miloslav Raus
Hi everybody,

IMNSHO, the best way to incorporate plots into the IDE is not [just] 
having them appear in a separate window, but giving the REPL the ability 
to display arbitrary graphical [and hopefully interactive] objects 
[or better yet, controls].

Let's make Julia the best Smalltalk'o-Lisp'o-Python ever ;-)

Cheers

On Thursday, 18 July 2013 17:47:40 UTC+2, mikeb2012 wrote:

 Disagree on 'killer feature'.

 Until recently, I was a very long time user/fan exclusively of Matlab.  
 Over a decade and a half ago, the one singular feature of the (then crappy, 
 almost debug-free) Matlab IDE had nothing to do with the IDE per se; it 
 boiled down to one line: 'plot(x,y)'. That was it, and that is still it for 
 me.  As an engineer and researcher I have to be able to provide insights, 
 and visualizations are key to that.  And the most frequent visualizations I 
 use are graphs, and not just dam 2-D plots but 3-D 
 scatterplots/surfaceplots/volumetric/etc.

 When Julia *incorporates *decent plotting in to an IDE, *then *I predict 
 it will attract a lot of new users, especially newbies to Matlab-like 
 languages.  And once you have a lot of newbs, then you'll get insight in to 
 what they want resulting in more new users, and a ground-swell of maturing 
 users.  And the latter will *then *want awesome debugging as they become 
 more expert.  And no, having the user separately load/use a plot package is 
 *not* a viable solution, it's a disincentive to newcomers.  

 To summarise: when I can download JuliaStudio (or any IDE) and blindly do 
 the following (as any raw newbie might) and not get an error, then Julia 
 will have 'arrived':

 julia> x=[1,2,3]; 

 3-element Int32 Array:

  1

  2

  3


 julia> y=[1,2,3];

 3-element Int32 Array:

  1

  2

  3


 julia> plot(x,y)

 plot not defined


 Until then, Julia is just another language with an appealing (to me) syntax. 





[julia-users] Re: best way to run parallel julia in batch mode?

2014-09-02 Thread Travis Porco
thanks much. I have had to go down to 31 workers in some instances after 
all.
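
For the record, a minimal sketch of what such a driver script (mscript.jl) can look like; dowork here is a hypothetical function assumed to be defined in analysis.jl:

    require("analysis.jl")          # in 0.3, require loads the file on every active worker
    results = pmap(dowork, 1:100)   # farm the pieces of the computation out to the workers
    println(results)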

On Saturday, August 30, 2014 5:08:18 AM UTC-7, Florian Oswald wrote:

 sorry correct my call to

 julia -p 32 exper.jl > myout.out

 (without the first <, not sure that makes a difference)

 On Saturday, 30 August 2014 12:58:57 UTC+1, Florian Oswald wrote:

 @require should work for what you want. i usually run batch jobs like this

 julia -p 32 < exper.jl > myout.out

 maybe give it a try?
 also, do you have 32 CPUs? not sure how stable this is if you use plenty 
 more processes than cores.

 here is a working example for a large cluster:
 https://github.com/floswald/parallelTest/tree/master/julia/iridis

 the setup is different, but you should be able to figure out from sge.jl 
 how I load the functions. make sure you are in the right directory?

 On Saturday, 30 August 2014 04:01:00 UTC+1, Travis Porco wrote:

 julia> versioninfo()
 Julia Version 0.3.1-pre+405
 Commit 444fafe* (2014-08-27 20:11 UTC)
 Platform Info:
   System: Linux (x86_64-linux-gnu)
   CPU: Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz
   WORD_SIZE: 64
   BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Sandybridge)
   LAPACK: libopenblas
   LIBM: libopenlibm
   LLVM: libLLVM-3.3


 On Friday, August 29, 2014 10:15:54 AM UTC-7, Travis Porco wrote:

 Hello--I'd like to be able to run something like this:
 nohup ../julia/julia -p 32  mscript.jl
 where inside mscript.jl, I would like each worker to read in and have 
 access to a large script (something like require("analysis.jl") )
 and then call a function defined in my own file, inside which various 
 pieces of a computation are done in parallel.
 Does anyone have a working example? Nothing I have tried has worked (I 
 must have just misunderstood the manual). 
 Thanks.



Re: [julia-users] CoordInterpGrid

2014-09-02 Thread Tim Holy
As it is, this makes no sense. Did you get errors when you said using Grid? 
What happens if you say Pkg.test("Grid")? For that matter, what happens if, in 
the top-level julia directory, you say make testall? It sounds like 
something is broken.

--Tim

On Tuesday, September 02, 2014 06:55:10 AM Jude wrote:
 Hi Tim,
 
 I am using Julia v0.3.0 and Grid 0.3.3
 
 I have tried running with your example code just to see what is going on
 and
 yi = InterpGrid(y, BCnil, InterpQuadratic)   works fine.
 
 But if I try using  z_2di = CoordInterpGrid((x,y), z_2d, BCnil,
 InterpQuadratic)
 it says ERROR: CoordInterpGrid not defined.
 
 Moreover, I tried using the InterpGrid command but replacing
 InterpQuadratic with InterpLinear and I get a similar error
  ERROR: InterpLinear not defined.
 
 So I am confused as to why it works fine for InterpQuadratic but not
 InterpLinear and why CoordInterpGrid doesn't work at all!
 
 Thanks!
 Jude
 
 On Tuesday, September 2, 2014 2:29:45 PM UTC+1, Tim Holy wrote:
  It's hard to say what's happening without more detail. Can you give us an
  explicit example of the commands you're trying to run? Also, what version
  of
  Julia are you using, and what version of Grid does Pkg.status() report?
  
  --Tim
  
  On Tuesday, September 02, 2014 04:21:59 AM Jude wrote:
   Hi,
   
   I want to use interpolation and have downloaded the Grid package but for
   some reason it won't allow me to use CoordInterpGrid. It works fine if I
   use interpGrid but when I try CoordIntepGrid it says  ERROR:
   CoordInterpGrid not defined. Does anyone know why this is and how I
  
  might
  
   get CoordInterpGrid to work? I have even tried it using the example
   provided by Tim Holy to make sure it was not a mistake of syntax but it
   still does not work. Any help would be really appreciated!
   
   Cheers,
   Jude



Re: [julia-users] Compressing .jld files

2014-09-02 Thread Tim Holy
HDF5/JLD does support compression: 
https://github.com/timholy/HDF5.jl/blob/master/doc/hdf5.md#reading-and-writing-data

But it's not turned on by default. Matlab uses compression by default, and 
I've found it's a huge bottleneck in terms of performance 
(http://www.mathworks.com/matlabcentral/fileexchange/39721-save-mat-files-more-quickly).
 But perhaps there's a good middle ground. It would take someone 
doing a little experimentation to see what the compromises are.
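
For concreteness, a sketch of turning compression on per dataset, following the chunk/compress properties described in the document linked above; exact property names may differ between HDF5.jl versions:

    using HDF5
    A = rand(1000, 1000)
    h5open("compressed.h5", "w") do file
        # "chunk" sets the chunk shape; "compress", 3 requests (gzip) level 3
        dset = d_create(file, "A", datatype(Float64), dataspace(A),
                        "chunk", (100, 100), "compress", 3)
        write(dset, A)
    end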

--Tim

On Tuesday, September 02, 2014 08:30:39 AM Douglas Bates wrote:
 Now that the JLD format can handle DataFrame objects I would like to switch
 from storing data sets in .RData format to .jld format.  Datasets stored in
 .RData format are compressed after they are written.  The default
 compression is gzip.  Bzip2 and xz compression are also available.  The
 compression can make a substantial difference in the file size because the
 data values are often highly repetitive.
 
 JLD is different in scope in that .jld files can be queried using external
 programs like h5ls and the files can have new data added or existing data
 edited or removed.  The .RData format is an archival format.  Once the file
 is written it cannot be modified in place.
 
 Given these differences I can appreciate that JLD files are not compressed.
  Nevertheless I think it would be useful to adopt a convention in the JLD
 module for accessing data from files with a .jld.xz or .jld.7z extension.
  It could be as simple as uncompressing the files in a temporary directory,
 reading then removing, or it could be more sophisticated.  I notice that my
 versions of libjulia.so on an Ubuntu 64-bit system are linked against both
 libz.so and liblzma.so
 
 $ ldd /usr/lib/x86_64-linux-gnu/julia/libjulia.so
 linux-vdso.so.1 =>  (0x7fff5214f000)
 libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x7f62932ee000)
 libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x7f62930d5000)
 libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x7f6292dce000)
 librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x7f6292bc6000)
 libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x7f62929a8000)
 libunwind.so.8 => /usr/lib/x86_64-linux-gnu/libunwind.so.8 (0x7f629278c000)
 libstdc++.so.6 => /usr/lib/x86_64-linux-gnu/libstdc++.so.6 (0x7f6292488000)
 libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x7f6292272000)
 libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x7f6291eab000)
 /lib64/ld-linux-x86-64.so.2 (0x7f62944b3000)
 liblzma.so.5 => /lib/x86_64-linux-gnu/liblzma.so.5 (0x7f6291c89000)
 
 
 AFAIK the user-level interface to gzip requires the GZip package.  Unless I
 have missed something (always a possibility) there is no user-level
 interface to liblzma in Julia.  If the library is going to be linked
 anyway, would it make sense to provide a user-level interface in Julia?
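
 A minimal sketch of that convention, assuming the external xz binary is
 installed and using load from the JLD module shipped with the HDF5 package:

 using HDF5, JLD

 function load_jld_xz(path)
     endswith(path, ".jld.xz") || error("expected a .jld.xz file")
     dir   = mktempdir()
     tmpxz = joinpath(dir, basename(path))
     cp(path, tmpxz)
     run(`xz -d $tmpxz`)          # leaves the decompressed .jld file in dir
     tmpjld = tmpxz[1:end-3]
     try
         return load(tmpjld)      # a Dict mapping variable names to their values
     finally
         rm(tmpjld); rm(dir)
     end
 end

 d = load_jld_xz("results.jld.xz")   # hypothetical file name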



Re: [julia-users] Re: Installing Julia on Mac

2014-09-02 Thread ronubi
Elliot Saba wrote:

 This is because your GCC is out of date. Brew upgrade and try again.


The trick seems to be 

brew link --overwrite gcc


Not sure what that does, but it allowed 


brew install julia


to work.


Re: [julia-users] A bug?

2014-09-02 Thread Elliot Saba
This was fixed by Jeff in 3ca78cdaae1027d4d92c2d8e22c5c4d26e922396
-E


On Tue, Sep 2, 2014 at 9:49 AM, Elliot Saba staticfl...@gmail.com wrote:

 Yeah, looks like the heads() tails() stuff in tuple.jl around line 60 is
 getting confused because we've got nested tuples, so it doesn't recognize
 when the tuple is actually empty.  Feel free to open an issue and ping me.
  -E



Re: [julia-users] Compressing .jld files

2014-09-02 Thread Kevin Squire
Just to hype blosc a little more, see

http://www.blosc.org/blosc-in-depth.html

The main feature is that data is chunked so that the compressed chunk size
fits into L1 cache, and is then decompressed and used there.  There are a
few more buzzwords (multithreading, simd) in the link above. Worth
exploring where this might be useful in Julia.

Cheers,
  Kevin
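
Until a wrapper exists, a rough sketch of what a minimal binding could look
like via ccall, assuming a system libblosc is installed and on the library
search path (the C signatures are taken from the Blosc headers and should be
double-checked):

const libblosc = "libblosc"

function compress_blosc(src::Vector{Float64}; level::Int=5, shuffle::Int=1)
    ccall((:blosc_init, libblosc), Void, ())          # recommended on recent libblosc versions
    nbytes = sizeof(src)
    dest = Array(Uint8, nbytes + 16)                  # 16 == BLOSC_MAX_OVERHEAD
    csize = ccall((:blosc_compress, libblosc), Cint,
                  (Cint, Cint, Csize_t, Csize_t, Ptr{Void}, Ptr{Void}, Csize_t),
                  level, shuffle, sizeof(Float64), nbytes, src, dest, length(dest))
    csize <= 0 && error("blosc_compress failed")
    dest[1:csize]                                     # the compressed bytes
end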

On Tuesday, September 2, 2014, Tim Holy tim.h...@gmail.com wrote:

 HDF5/JLD does support compression:

 https://github.com/timholy/HDF5.jl/blob/master/doc/hdf5.md#reading-and-writing-data

 But it's not turned on by default. Matlab uses compression by default, and
 I've found it's a huge bottleneck in terms of performance
 (
 http://www.mathworks.com/matlabcentral/fileexchange/39721-save-mat-files-more-quickly).
 But perhaps there's a good middle ground. It would take someone
 doing a little experimentation to see what the compromises are.

 --Tim

 On Tuesday, September 02, 2014 08:30:39 AM Douglas Bates wrote:
  Now that the JLD format can handle DataFrame objects I would like to
 switch
  from storing data sets in .RData format to .jld format.  Datasets stored
 in
  .RData format are compressed after they are written.  The default
  compression is gzip.  Bzip2 and xz compression are also available.  The
  compression can make a substantial difference in the file size because
 the
  data values are often highly repetitive.
 
  JLD is different in scope in that .jld files can be queried using
 external
  programs like h5ls and the files can have new data added or existing data
  edited or removed.  The .RData format is an archival format.  Once the
 file
  is written it cannot be modified in place.
 
  Given these differences I can appreciate that JLD files are not
 compressed.
   Nevertheless I think it would be useful to adopt a convention in the JLD
  module for accessing data from files with a .jld.xz or .jld.7z extension.
   It could be as simple as uncompressing the files in a temporary
 directory,
  reading then removing, or it could be more sophisticated.  I notice that
 my
  versions of libjulia.so on an Ubuntu 64-bit system are linked against
 both
  libz.so and liblzma.so
 
 
 
  AFAIK the user-level interface to gzip requires the GZip package.
 Unless I
  have missed something (always a possibility) there is no user-level
  interface to liblzma in Julia.  If the library is going to be linked
  anyway, would it make sense to provide a user-level interface in Julia?




[julia-users] Re: how to turn off redirect_stdout() ?

2014-09-02 Thread David Gonzales
i tried saving the old STDOUT before redirecting and then using the saved 
stream to redirect output back to normal. in code:

oldout = STDOUT
(rd,wr) = redirect_stdout()
println("hello world")
readline(rd)
redirect_stdout(oldout)
println("hello world")

(with the last line outputting to screen normally)
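
The same idea can be wrapped in try/finally so STDOUT is restored even if the
captured code throws (a sketch; it assumes f() actually prints something, since
readavailable blocks otherwise):

function capture_stdout(f::Function)
    oldout = STDOUT
    rd, wr = redirect_stdout()
    captured = ""
    try
        f()
        captured = readavailable(rd)   # read the captured output while still redirected
    finally
        redirect_stdout(oldout)        # always restore the real STDOUT
    end
    captured
end

capture_stdout(() -> println("hello world"))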

On Tuesday, September 2, 2014 7:25:05 PM UTC+3, Florian Oswald wrote:

 hi!

 this works fine:

 julia> rdstdout, wrstdout = redirect_stdout()
 (Pipe(open, 0 bytes waiting),Pipe(open, 0 bytes waiting))

 julia> run(`date`)

 julia> s = readavailable(rdstdout)
 "Tue  2 Sep 2014 17:11:33 BST\n"

 but now I want to turn the redirection off again. how do I do this?

 thanks!



Re: [julia-users] Compressing .jld files

2014-09-02 Thread Stefan Karpinski
That looks pretty sweet. It seems to avoid a lot of the pitfalls of naively
compressing data files while still getting the benefits. It would be great
to support that in JLD, maybe even turned on by default.


On Tue, Sep 2, 2014 at 1:35 PM, Kevin Squire kevin.squ...@gmail.com wrote:

 Just to hype blosc a little more, see

 http://www.blosc.org/blosc-in-depth.html

 The main feature is that data is chunked so that the compressed chunk size
 fits into L1 cache, and is then decompressed and used there.  There are a
 few more buzzwords (multithreading, simd) in the link above. Worth
 exploring where this might be useful in Julia.

 Cheers,
   Kevin


 On Tuesday, September 2, 2014, Tim Holy tim.h...@gmail.com wrote:

 HDF5/JLD does support compression:

 https://github.com/timholy/HDF5.jl/blob/master/doc/hdf5.md#reading-and-writing-data

 But it's not turned on by default. Matlab uses compression by default, and
 I've found it's a huge bottleneck in terms of performance
 (
 http://www.mathworks.com/matlabcentral/fileexchange/39721-save-mat-files-more-quickly).
 But perhaps there's a good middle ground. It would take someone
 doing a little experimentation to see what the compromises are.

 --Tim

 On Tuesday, September 02, 2014 08:30:39 AM Douglas Bates wrote:
  Now that the JLD format can handle DataFrame objects I would like to
 switch
  from storing data sets in .RData format to .jld format.  Datasets
 stored in
  .RData format are compressed after they are written.  The default
  compression is gzip.  Bzip2 and xz compression are also available.  The
  compression can make a substantial difference in the file size because
 the
  data values are often highly repetitive.
 
  JLD is different in scope in that .jld files can be queried using
 external
  programs like h5ls and the files can have new data added or existing
 data
  edited or removed.  The .RData format is an archival format.  Once the
 file
  is written it cannot be modified in place.
 
  Given these differences I can appreciate that JLD files are not
 compressed.
   Nevertheless I think it would be useful to adopt a convention in the
 JLD
  module for accessing data from files with a .jld.xz or .jld.7z
 extension.
   It could be as simple as uncompressing the files in a temporary
 directory,
  reading then removing, or it could be more sophisticated.  I notice
 that my
  versions of libjulia.so on an Ubuntu 64-bit system are linked against
 both
  libz.so and liblzma.so
 
 
 
  AFAIK the user-level interface to gzip requires the GZip package.
 Unless I
  have missed something (always a possibility) there is no user-level
  interface to liblzma in Julia.  If the library is going to be linked
  anyway, would it make sense to provide a user-level interface in Julia?




Re: [julia-users] Re: how to turn off redirect_stdout() ?

2014-09-02 Thread Florian Oswald
brilliant! thanks!


On 2 September 2014 19:09, David Gonzales dvdgonzale...@gmail.com wrote:

 i tried saving the old STDOUT before redirecting and then using the saved
 stream to redirect output back to normal. in code:

 oldout = STDOUT
 (rd,wr) = redirect_stdout()
 println("hello world")
 readline(rd)
 redirect_stdout(oldout)
 println("hello world")

 (with the last line outputting to screen normally)

 On Tuesday, September 2, 2014 7:25:05 PM UTC+3, Florian Oswald wrote:

 hi!

 this works fine:

 julia> rdstdout, wrstdout = redirect_stdout()
 (Pipe(open, 0 bytes waiting),Pipe(open, 0 bytes waiting))

 julia> run(`date`)

 julia> s = readavailable(rdstdout)
 "Tue  2 Sep 2014 17:11:33 BST\n"

 but now I want to turn the redirection off again. how do I do this?

 thanks!




Re: [julia-users] Strange Slicing Behaviour

2014-09-02 Thread Stefan Karpinski
Are slices in Julia any worse than in Matlab? If so, what does Matlab do
that's better? I agree that our current slicing needs improvements (they
are planned), but it is largely due to its Matlab heritage.


On Tue, Sep 2, 2014 at 8:54 AM, Christoph Ortner christophortn...@gmail.com
 wrote:

 Tim - many thanks for the reply. To put this into context: I have 15y+
 experience with matlab, and some limited experience with other languages
 (C,C++,Java,Fortran).

 Here is a code snippet that brought this up. It precomputes a lot of data
 that is then used in a variety of (non-standard) ways for Tight-Binding
 molecular dynamics. This is a quick and dirty first-attempt implementation
 to just get it to run.

  # the following arrays are generated elsewhere, d \in \{2,3\}, N is large, alpha real
  # R : d x N x N  array
  # E : N x N array

  # VERSION 1
  for a = 1:d
      hHamiltonian[a, a, :, :] = slice(hHamiltonian, a, a, :, :) - alpha * E
      for b = 1:d
          hHamiltonian[a, b, :, :] = slice(hHamiltonian, a, b, :, :) +
              alpha^2 * E .* slice(R, a, :, :) .* slice(R, b, :, :)
      end
  end

 instead of what I would have liked to write:

  # VERSION 2
  for a = 1:d
      hHamiltonian[a, a, :, :] += -alpha * E
      for b = 1:d
          hHamiltonian[a, b, :, :] += alpha^2 * E .* R[a, :, :] .* R[b, :, :]
      end
  end


 Granted, since writing the above post I read up on Comprehensions (first
 time I have used them, and quite like the result)

  # VERSION 3
  hHamiltonian = [ -alpha*E[m,n]*del[a,b] + alpha^2 * E[m,n] * R[a,m,n] * R[b,m,n]
                   for a = 1:d, b = 1:d, m = 1:N, n = 1:N ]


 I am quite happy with this last version, for the moment at least. Some
 points remain:
 1. What is the performance of comprehensions compared with vectorisation
 or straight for-loops (see the explicit-loop sketch below)?
 2. The current slicing behaviour of Julia is just unexpectedly clunky.
 Whether or not VERSION 2 is good code to write, there are many instances
 where I would have written it like this without a second thought. Another
 example is vectorised finite element assembly, which looks very similar but
 more complex.
 3. VERSION 2 is still the most natural way to write it for many people who do
 quick and dirty numerical experiments and don't want to think too much
 about good coding practices. These are the kind of people who would prefer
 the code to run for 2 days rather than 2 hours, if it means they spend 1/10
 of their time coding.
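
 For point 1, one more variant worth timing is a plain loop over the same
 update as VERSION 2 (a sketch; it assumes hHamiltonian is preallocated and,
 like VERSION 3, handles the diagonal term explicitly, here via an a == b
 branch). Loops like this avoid the temporary arrays that the sliced and
 vectorised forms allocate:

  # VERSION 4 (explicit loops; the innermost index varies fastest, matching column-major order)
  for n = 1:N, m = 1:N, b = 1:d, a = 1:d
      hHamiltonian[a, b, m, n] += alpha^2 * E[m, n] * R[a, m, n] * R[b, m, n]
      if a == b
          hHamiltonian[a, b, m, n] -= alpha * E[m, n]
      end
  end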

 Any comments will be helpful. Thanks,
- Christoph



 On Tuesday, 2 September 2014 11:23:34 UTC+1, Tim Holy wrote:

 Your example involves two tricky issues: slice behavior and the fact
 that,
 despite appearances, A += b is not in-place. See issues #3424, #3217, and
 precedents they link to.

 I'd be interested in hearing more detail about how using slice gets
 nasty; as
 you say, from this example slice doesn't look so bad. In trying to fix
 this, we
 want to make sure we're aware of all the issues.

 --Tim

 On Monday, September 01, 2014 10:02:54 PM Christoph Ortner wrote:
  a = rand(3,3,3,3)
  b = rand(3,3)
  # this works:
  a[1,1,:,:] = slice(a,1,1,:,:)+b
  # this does not work:
  a[1,1,:,:] += b
 
  This example does not look so bad, but once you use expressive variable
  names and more dimensions it quickly gets very nasty. Because of it, I
 am
  doing less vectorisation than I would prefer.
 
  I know there is a lot of discussion on slicing on the Julia issues
 list, so
  I did not want to post another issue there.
 
  Is this likely to be resolved in future releases? Are there elegant
  alternatives?




Re: [julia-users] Strange Slicing Behaviour

2014-09-02 Thread Christoph Ortner


On Tuesday, 2 September 2014 19:12:01 UTC+1, Stefan Karpinski wrote:

 Are slices in Julia any worse than in Matlab? If so, what does Matlab do 
 that's better? I agree that our current slicing needs improvements (they 
 are planned), but it is largely due to its Matlab heritage.


I did not mean to imply that Julia is worse in this respect. Off the cuff, 
I would say slicing is no better or worse than in Matlab. And, for the 
record, slicing multi-dimensional arrays in Matlab has been driving me mad 
for quite some time.

I've skimmed the discussions in the issues lists on github, and I very 
much liked the idea of  distinguishing 
a[i:i, :, :]   
from
a[i, :, :]
until I remembered that I want
a[i,:]
to be a row-vector.  But I can't have the cake and eat it too.

Is there a consensus yet what the final slicing behaviour will be?

--Christoph



Re: [julia-users] Strange Slicing Behaviour

2014-09-02 Thread Stefan Karpinski
No, this is a pretty contentious issue. A lot of the relevant discussion is
in #4774 https://github.com/JuliaLang/julia/issues/4774. The one thing
everyone agrees on, and which is going to happen in 0.4 for sure, is that
slicing will generally create views into the original array rather than
copying the data.
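
To make the copy-vs-view distinction concrete, a small sketch of what 0.3 does
today (slice already gives a writable view, while indexing copies):

a = rand(3,3,3,3)

v = slice(a, 1, 1, :, :)   # a SubArray view: writing into v writes into a
v[1,1] = 0.0
a[1,1,1,1]                 # now 0.0

c = a[1, 1, :, :]          # today this copies (and keeps the singleton dims)
c[1] = 42.0
a[1,1,1,1]                 # still 0.0: a is untouched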


On Tue, Sep 2, 2014 at 2:24 PM, Christoph Ortner christophortn...@gmail.com
 wrote:



 On Tuesday, 2 September 2014 19:12:01 UTC+1, Stefan Karpinski wrote:

 Are slices in Julia any worse than in Matlab? If so, what does Matlab do
 that's better? I agree that our current slicing needs improvements (they
 are planned), but it is largely due to its Matlab heritage.


 I did not mean to apply that Julia is worse in this respect. Off the cuff,
 I would say slicing is no better or worse than in matlab. And, for the
 record, slicing multi-dimensional arrays in Matlab has been driving me mad
 for some quite some time.

 I've skimmed the discussions in the issues lists on github, and I very
 much liked the idea of  distinguishing
 a[i:i, :, :]
 from
 a[i, :, :]
 until I remembered that I want
 a[i,:]
 to be a row-vector.  But I can't have the cake and eat it too.

 Is there a consensus yet what the final slicing behaviour will be?

 --Christoph




Re: [julia-users] Julia IDE

2014-09-02 Thread Keith Campbell
https://github.com/JuliaLang/Interact.jl

On Tuesday, September 2, 2014 12:51:00 PM UTC-4, Miloslav Raus wrote:

 Hi everybody,

 IMNSHO, the best way to incorporate plots into the IDE is not [just] 
 having to have them appear in a separate window, but the ability of the 
 repl to display arbitrary graphical [and hopefully interactive] objects 
 [or better yet, controls].

 Let's make Julia the best Smalltalk'o-Lisp'o-Python ever ;-)

 Cheers

 Dne čtvrtek, 18. července 2013 17:47:40 UTC+2 mikeb2012 napsal(a):

 Disagree on 'killer feature'.

 Until recently, I was a very long time user/fan exclusively of Matlab.  
 Over a decade and a half ago, the one singular feature of the (then crappy, 
 almost debugger-free) Matlab IDE had nothing to do with the IDE per se; it 
 boiled down to one line: 'plot(x,y)'.  That was it, and that is still it for 
 me.  As an engineer and researcher I have to be able to provide insights, 
 and visualizations are key to that.  And the most frequent visualizations I 
 use are graphs, and not just damn 2-D plots but 3-D 
 scatterplots/surfaceplots/volumetric/etc.

 When Julia *incorporates* decent plotting into an IDE, *then* I predict 
 it will attract a lot of new users, especially newbies to Matlab-like 
 languages.  And once you have a lot of newbs, then you'll get insight into 
 what they want, resulting in more new users, and a ground-swell of maturing 
 users.  And the latter will *then* want awesome debugging as they become 
 more expert.  And no, having the user separately load/use a plot package is 
 *not* a viable solution; it's a disincentive to newcomers.  

 To summarise: when I can download JuliaStudio (or any IDE) and blindly do 
 the following (as any raw newbie might) and not get an error, then Julia 
 will have 'arrived':

 julia> x=[1,2,3];
 3-element Int32 Array:
  1
  2
  3

 julia> y=[1,2,3];
 3-element Int32 Array:
  1
  2
  3

 julia> plot(x,y)
 plot not defined


 Until then, Julia is just another language with an appealing (to me) syntax. 





Re: [julia-users] Compressing .jld files

2014-09-02 Thread Jake Bolewski
I've used Blosc in the past with great success.  Oftentimes it is faster 
than the uncompressed version if IO is the bottleneck.  The compression 
ratios are not great but that is really not the point.

On Tuesday, September 2, 2014 2:09:20 PM UTC-4, Stefan Karpinski wrote:

 That looks pretty sweet. It seems to avoid a lot of the pitfalls of 
 naively compressing data files while still getting the benefits. It would 
 be great to support that in JLD, maybe even turned on by default.


 On Tue, Sep 2, 2014 at 1:35 PM, Kevin Squire kevin@gmail.com 
 javascript: wrote:

 Just to hype blosc a little more, see

 http://www.blosc.org/blosc-in-depth.html

 The main feature is that data is chunked so that the compressed chunk 
 size fits into L1 cache, and is then decompressed and used there.  There 
 are a few more buzzwords (multithreading, simd) in the link above. Worth 
 exploring where this might be useful in Julia. 

 Cheers,
   Kevin


 On Tuesday, September 2, 2014, Tim Holy tim@gmail.com javascript: 
 wrote:

 HDF5/JLD does support compression:

 https://github.com/timholy/HDF5.jl/blob/master/doc/hdf5.md#reading-and-writing-data

 But it's not turned on by default. Matlab uses compression by default, 
 and
 I've found it's a huge bottleneck in terms of performance
 (
 http://www.mathworks.com/matlabcentral/fileexchange/39721-save-mat-files-more-quickly).
  
 But perhaps there's a good middle ground. It would take someone
 doing a little experimentation to see what the compromises are.

 --Tim

 On Tuesday, September 02, 2014 08:30:39 AM Douglas Bates wrote:
  Now that the JLD format can handle DataFrame objects I would like to 
 switch
  from storing data sets in .RData format to .jld format.  Datasets 
 stored in
  .RData format are compressed after they are written.  The default
  compression is gzip.  Bzip2 and xz compression are also available.  The
  compression can make a substantial difference in the file size because 
 the
  data values are often highly repetitive.
 
  JLD is different in scope in that .jld files can be queried using 
 external
  programs like h5ls and the files can have new data added or existing 
 data
  edited or removed.  The .RData format is an archival format.  Once the 
 file
  is written it cannot be modified in place.
 
  Given these differences I can appreciate that JLD files are not 
 compressed.
   Nevertheless I think it would be useful to adopt a convention in the 
 JLD
  module for accessing data from files with a .jld.xz or .jld.7z 
 extension.
   It could be as simple as uncompressing the files in a temporary 
 directory,
  reading then removing, or it could be more sophisticated.  I notice 
 that my
  versions of libjulia.so on an Ubuntu 64-bit system are linked against 
 both
  libz.so and liblzma.so
 
 
 
  AFAIK the user-level interface to gzip requires the GZip package.  
 Unless I
  have missed something (always a possibility) there is no user-level
  interface to liblzma in Julia.  If the library is going to be linked
  anyway, would it make sense to provide a user-level interface in Julia?

  


Re: [julia-users] Compressing .jld files

2014-09-02 Thread Stefan Karpinski
I think it would be very much in line with our general ethos of "the default
thing we do is the fastest possible thing" – and it seems like blosc is that.


On Tue, Sep 2, 2014 at 3:11 PM, Jake Bolewski jakebolew...@gmail.com
wrote:

 I've used Blosc in the past with great success.  Oftentimes it is faster
 than the uncompressed version if IO is the bottleneck.  The compression
 ratios are not great but that is really not the point.


 On Tuesday, September 2, 2014 2:09:20 PM UTC-4, Stefan Karpinski wrote:

 That looks pretty sweet. It seems to avoid a lot of the pitfalls of
 naively compressing data files while still getting the benefits. It would
 be great to support that in JLD, maybe even turned on by default.


 On Tue, Sep 2, 2014 at 1:35 PM, Kevin Squire kevin@gmail.com wrote:

 Just to hype blosc a little more, see

 http://www.blosc.org/blosc-in-depth.html

 The main feature is that data is chunked so that the compressed chunk
 size fits into L1 cache, and is then decompressed and used there.  There
 are a few more buzzwords (multithreading, simd) in the link above. Worth
 exploring where this might be useful in Julia.

 Cheers,
   Kevin


 On Tuesday, September 2, 2014, Tim Holy tim@gmail.com wrote:

 HDF5/JLD does support compression:
 https://github.com/timholy/HDF5.jl/blob/master/doc/hdf5.
 md#reading-and-writing-data

 But it's not turned on by default. Matlab uses compression by default,
 and
 I've found it's a huge bottleneck in terms of performance
 (http://www.mathworks.com/matlabcentral/fileexchange/
 39721-save-mat-files-more-quickly). But perhaps there's a good middle
 ground. It would take someone
 doing a little experimentation to see what the compromises are.

 --Tim

 On Tuesday, September 02, 2014 08:30:39 AM Douglas Bates wrote:
  Now that the JLD format can handle DataFrame objects I would like to
 switch
  from storing data sets in .RData format to .jld format.  Datasets
 stored in
  .RData format are compressed after they are written.  The default
  compression is gzip.  Bzip2 and xz compression are also available.
 The
  compression can make a substantial difference in the file size
 because the
  data values are often highly repetitive.
 
  JLD is different in scope in that .jld files can be queried using
 external
  programs like h5ls and the files can have new data added or existing
 data
  edited or removed.  The .RData format is an archival format.  Once
 the file
  is written it cannot be modified in place.
 
  Given these differences I can appreciate that JLD files are not
 compressed.
   Nevertheless I think it would be useful to adopt a convention in the
 JLD
  module for accessing data from files with a .jld.xz or .jld.7z
 extension.
   It could be as simple as uncompressing the files in a temporary
 directory,
  reading then removing, or it could be more sophisticated.  I notice
 that my
  versions of libjulia.so on an Ubuntu 64-bit system are linked against
 both
  libz.so and liblzma.so
 
 
 
  AFAIK the user-level interface to gzip requires the GZip package.
 Unless I
  have missed something (always a possibility) there is no user-level
  interface to liblzma in Julia.  If the library is going to be linked
  anyway, would it make sense to provide a user-level interface in
 Julia?





Re: [julia-users] Compressing .jld files

2014-09-02 Thread Tim Holy
All these testimonials do make it sound promising. Even three-fold compression 
is a pretty big deal.

One disadvantage to compression is that it makes mmap impossible. But, since 
HDF5 supports hyperslabs, that's not as big a deal as it would have been.

--Tim
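
For reference, the hyperslab point in practice: with HDF5.jl you can read just
a block of a (possibly compressed) dataset rather than mmapping the whole array
(a sketch; the file and dataset names are hypothetical, matching the write
sketch earlier in this thread):

using HDF5

h5open("data.h5", "r") do file
    dset = file["A_gz"]          # handle to the dataset, nothing read yet
    block = dset[1:100, 1:100]   # reads (and decompresses) only this hyperslab
end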

On Tuesday, September 02, 2014 12:11:55 PM Jake Bolewski wrote:
 I've used Blosc in the past with great success.  Oftentimes it is faster
 than the uncompressed version if IO is the bottleneck.  The compression
 ratios are not great but that is really not the point.
 
 On Tuesday, September 2, 2014 2:09:20 PM UTC-4, Stefan Karpinski wrote:
  That looks pretty sweet. It seems to avoid a lot of the pitfalls of
  naively compressing data files while still getting the benefits. It would
  be great to support that in JLD, maybe even turned on by default.
  
  
  On Tue, Sep 2, 2014 at 1:35 PM, Kevin Squire kevin@gmail.com
  
  javascript: wrote:
  Just to hype blosc a little more, see
  
  http://www.blosc.org/blosc-in-depth.html
  
  The main feature is that data is chunked so that the compressed chunk
  size fits into L1 cache, and is then decompressed and used there.  There
  are a few more buzzwords (multithreading, simd) in the link above. Worth
  exploring where this might be useful in Julia.
  
  Cheers,
  
Kevin
  
  On Tuesday, September 2, 2014, Tim Holy tim@gmail.com javascript:
  
  wrote:
  HDF5/JLD does support compression:
  
  https://github.com/timholy/HDF5.jl/blob/master/doc/hdf5.md#reading-and-w
  riting-data
  
  But it's not turned on by default. Matlab uses compression by default,
  and
  I've found it's a huge bottleneck in terms of performance
  (
  http://www.mathworks.com/matlabcentral/fileexchange/39721-save-mat-files
  -more-quickly). But perhaps there's a good middle ground. It would take
  someone
  doing a little experimentation to see what the compromises are.
  
  --Tim
  
  On Tuesday, September 02, 2014 08:30:39 AM Douglas Bates wrote:
   Now that the JLD format can handle DataFrame objects I would like to
  
  switch
  
   from storing data sets in .RData format to .jld format.  Datasets
  
  stored in
  
   .RData format are compressed after they are written.  The default
   compression is gzip.  Bzip2 and xz compression are also available. 
   The
   compression can make a substantial difference in the file size because
  
  the
  
   data values are often highly repetitive.
   
   JLD is different in scope in that .jld files can be queried using
  
  external
  
   programs like h5ls and the files can have new data added or existing
  
  data
  
   edited or removed.  The .RData format is an archival format.  Once the
  
  file
  
   is written it cannot be modified in place.
   
   Given these differences I can appreciate that JLD files are not
  
  compressed.
  
Nevertheless I think it would be useful to adopt a convention in the
  
  JLD
  
   module for accessing data from files with a .jld.xz or .jld.7z
  
  extension.
  
It could be as simple as uncompressing the files in a temporary
  
  directory,
  
   reading then removing, or it could be more sophisticated.  I notice
  
  that my
  
   versions of libjulia.so on an Ubuntu 64-bit system are linked against
  
  both
  
   libz.so and liblzma.so
   
   
   
   AFAIK the user-level interface to gzip requires the GZip package.
  
  Unless I
  
   have missed something (always a possibility) there is no user-level
   interface to liblzma in Julia.  If the library is going to be linked
   anyway, would it make sense to provide a user-level interface in
   Julia?



Re: [julia-users] Compressing .jld files

2014-09-02 Thread Douglas Bates
Would it be reasonable to create a Blosc package, or is it best to 
incorporate it directly into the HDF5 package?  If a separate package is 
reasonable I could start on it, as I was the one who suggested this in the 
first place.

On Tuesday, September 2, 2014 2:43:15 PM UTC-5, Tim Holy wrote:

 All these testimonials do make it sound promising. Even three-fold 
 compression 
 is a pretty big deal. 

 One disadvantage to compression is that it makes mmap impossible. But, 
 since 
 HDF5 supports hyperslabs, that's not as big a deal as it would have been. 

 --Tim 

 On Tuesday, September 02, 2014 12:11:55 PM Jake Bolewski wrote: 
  I've used Blosc in the past with great success.  Oftentimes it is faster 
  than the uncompressed version if IO is the bottleneck.  The compression 
  ratios are not great but that is really not the point. 
  
  On Tuesday, September 2, 2014 2:09:20 PM UTC-4, Stefan Karpinski wrote: 
   That looks pretty sweet. It seems to avoid a lot of the pitfalls of 
   naively compressing data files while still getting the benefits. It 
 would 
   be great to support that in JLD, maybe even turned on by default. 
   
   
   On Tue, Sep 2, 2014 at 1:35 PM, Kevin Squire kevin@gmail.com 
   
   javascript: wrote: 
   Just to hype blosc a little more, see 
   
   http://www.blosc.org/blosc-in-depth.html 
   
   The main feature is that data is chunked so that the compressed chunk 
   size fits into L1 cache, and is then decompressed and used there. 
  There 
   are a few more buzzwords (multithreading, simd) in the link above. 
 Worth 
   exploring where this might be useful in Julia. 
   
   Cheers, 
   
 Kevin 
   
   On Tuesday, September 2, 2014, Tim Holy tim@gmail.com 
 javascript: 
   
   wrote: 
   HDF5/JLD does support compression: 
   
   
 https://github.com/timholy/HDF5.jl/blob/master/doc/hdf5.md#reading-and-w 
   riting-data 
   
   But it's not turned on by default. Matlab uses compression by 
 default, 
   and 
   I've found it's a huge bottleneck in terms of performance 
   ( 
   
 http://www.mathworks.com/matlabcentral/fileexchange/39721-save-mat-files 
   -more-quickly). But perhaps there's a good middle ground. It would 
 take 
   someone 
   doing a little experimentation to see what the compromises are. 
   
   --Tim 
   
   On Tuesday, September 02, 2014 08:30:39 AM Douglas Bates wrote: 
Now that the JLD format can handle DataFrame objects I would like 
 to 
   
   switch 
   
from storing data sets in .RData format to .jld format.  Datasets 
   
   stored in 
   
.RData format are compressed after they are written.  The default 
compression is gzip.  Bzip2 and xz compression are also available. 
The 
compression can make a substantial difference in the file size 
 because 
   
   the 
   
data values are often highly repetitive. 

JLD is different in scope in that .jld files can be queried using 
   
   external 
   
programs like h5ls and the files can have new data added or 
 existing 
   
   data 
   
edited or removed.  The .RData format is an archival format.  Once 
 the 
   
   file 
   
is written it cannot be modified in place. 

Given these differences I can appreciate that JLD files are not 
   
   compressed. 
   
 Nevertheless I think it would be useful to adopt a convention in 
 the 
   
   JLD 
   
module for accessing data from files with a .jld.xz or .jld.7z 
   
   extension. 
   
 It could be as simple as uncompressing the files in a temporary 
   
   directory, 
   
reading then removing, or it could be more sophisticated.  I 
 notice 
   
   that my 
   
versions of libjulia.so on an Ubuntu 64-bit system are linked 
 against 
   
   both 
   
libz.so and liblzma.so 



AFAIK the user-level interface to gzip requires the GZip package. 
   
   Unless I 
   
have missed something (always a possibility) there is no 
 user-level 
interface to liblzma in Julia.  If the library is going to be 
 linked 
anyway, 

[julia-users] Re: live plotting in PyPlot.jl?

2014-09-02 Thread Simon Danisch
Okay, I fixed the issues, I think. Also, I didn't actually update the normal 
vectors when updating the z-values. This is why the lighting looked very 
weird and made it look less 3D.
This should be fixed as well ;) Try it out!

Am Dienstag, 19. August 2014 07:46:22 UTC+2 schrieb Sheehan Olver:

 Hi,

 Is there a way to force plotting in PyPlot.jl, to simulate animation? 
  Right now if I do a for loop over a sequence of plots, it only outputs the 
 last plot.

 This is in IJulia running on OS X with matplotlib version 1.3.1 installed, 
 and pygui(true)

 Sheehan



Re: [julia-users] Compressing .jld files

2014-09-02 Thread Jake Bolewski
It would be best to incorporate it into the HDF5 package.  A julia package 
would be useful if you wanted to do the same sort of compression on Julia 
binary blobs, such as serialized julia values in an IOBuffer.

On Tuesday, September 2, 2014 3:47:33 PM UTC-4, Douglas Bates wrote:

 Would it be reasonable to create a Blosc package or it is best to 
 incorporate it directly into the HDF5 package?  If a separate package is 
 reasonable I could start on it, as I was the one who suggested this in the 
 first place.

 On Tuesday, September 2, 2014 2:43:15 PM UTC-5, Tim Holy wrote:

 All these testimonials do make it sound promising. Even three-fold 
 compression 
 is a pretty big deal. 

 One disadvantage to compression is that it makes mmap impossible. But, 
 since 
 HDF5 supports hyperslabs, that's not as big a deal as it would have been. 

 --Tim 

 On Tuesday, September 02, 2014 12:11:55 PM Jake Bolewski wrote: 
  I've used Blosc in the past with great success.  Oftentimes it is 
 faster 
  than the uncompressed version if IO is the bottleneck.  The compression 
  ratios are not great but that is really not the point. 
  
  On Tuesday, September 2, 2014 2:09:20 PM UTC-4, Stefan Karpinski wrote: 
   That looks pretty sweet. It seems to avoid a lot of the pitfalls of 
   naively compressing data files while still getting the benefits. It 
 would 
   be great to support that in JLD, maybe even turned on by default. 
   
   
   On Tue, Sep 2, 2014 at 1:35 PM, Kevin Squire kevin@gmail.com 
   
   javascript: wrote: 
   Just to hype blosc a little more, see 
   
   http://www.blosc.org/blosc-in-depth.html 
   
   The main feature is that data is chunked so that the compressed 
 chunk 
   size fits into L1 cache, and is then decompressed and used there. 
  There 
   are a few more buzzwords (multithreading, simd) in the link above. 
 Worth 
   exploring where this might be useful in Julia. 
   
   Cheers, 
   
 Kevin 
   
   On Tuesday, September 2, 2014, Tim Holy tim@gmail.com 
 javascript: 
   
   wrote: 
   HDF5/JLD does support compression: 
   
   
 https://github.com/timholy/HDF5.jl/blob/master/doc/hdf5.md#reading-and-w 
   riting-data 
   
   But it's not turned on by default. Matlab uses compression by 
 default, 
   and 
   I've found it's a huge bottleneck in terms of performance 
   ( 
   
 http://www.mathworks.com/matlabcentral/fileexchange/39721-save-mat-files 
   -more-quickly). But perhaps there's a good middle ground. It would 
 take 
   someone 
   doing a little experimentation to see what the compromises are. 
   
   --Tim 
   
   On Tuesday, September 02, 2014 08:30:39 AM Douglas Bates wrote: 
Now that the JLD format can handle DataFrame objects I would like 
 to 
   
   switch 
   
from storing data sets in .RData format to .jld format.  Datasets 
   
   stored in 
   
.RData format are compressed after they are written.  The default 
compression is gzip.  Bzip2 and xz compression are also 
 available. 
The 
compression can make a substantial difference in the file size 
 because 
   
   the 
   
data values are often highly repetitive. 

JLD is different in scope in that .jld files can be queried using 
   
   external 
   
programs like h5ls and the files can have new data added or 
 existing 
   
   data 
   
edited or removed.  The .RData format is an archival format. 
  Once the 
   
   file 
   
is written it cannot be modified in place. 

Given these differences I can appreciate that JLD files are not 
   
   compressed. 
   
 Nevertheless I think it would be useful to adopt a convention in 
 the 
   
   JLD 
   
module for accessing data from files with a .jld.xz or .jld.7z 
   
   extension. 
   
 It could be as simple as uncompressing the files in a temporary 
   
   directory, 
   
reading then removing, or it could be more sophisticated.  I 
 notice 
   
   that my 
   
versions of libjulia.so on an Ubuntu 64-bit system are linked 
 against 
   
   both 
   
libz.so and liblzma.so 


Re: [julia-users] Re: Julia talk at EuroSciPy 2014

2014-09-02 Thread Stefan Karpinski
A macro's role is to take some expressions, symbols and literals and
produce a single expression that will be what is actually evaluated. This
expansion occurs when the macro call is *parsed*, not when the code runs.
Therefore, if you call eval on something inside of a macro body, you are
most likely evaluating it before it runs, and at a point where the variables
you're evaluating don't even exist. If your macro is used at the top-level
global scope, these happen to be the same thing, but your macro is still
broken and it won't work inside of a function, for example, since the eval
occurs when the function is defined, not when the function is called.
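
A small illustration of that point, with two hypothetical macros (not from the
thread):

macro bad_double(x)
    :( $(2 * eval(x)) )   # eval runs at expansion time, in global scope
end

macro good_double(x)
    :( 2 * $(esc(x)) )    # just build an expression; it runs when the caller's code runs
end

y = 3
@bad_double(y)            # happens to work at top level, because y is a global

function f(z)
    @good_double(z)       # fine: expands to 2 * z
    # @bad_double(z)      # would fail: z does not exist when the macro expands
end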


On Sat, Aug 30, 2014 at 5:27 PM, Don MacMillen don.macmil...@gmail.com
wrote:

 There you have it... Zen was never my thing.  So I find the previous two
 comments cryptic.
 The macro _seemed_ to be doing what I wanted it to.  Is there some
 documentation that
 speaks to calling (or rather not calling) eval from inside a macro?


 On Saturday, August 30, 2014 2:15:54 PM UTC-7, John Myles White wrote:

 This might need to be part of the Zen of Julia.

  — John

 On Aug 30, 2014, at 2:11 PM, Jameson Nash vtj...@gmail.com wrote:

 calling eval in a macro doesn't do what you think it does, so it doesn't
 do what you want


 On Sat, Aug 30, 2014 at 5:05 PM, Don MacMillen don.ma...@gmail.com
 wrote:

 Perfect Steve, many thanks for the explanation.  But just to be sure I
 understand, the multiple eval of the input expression, your
 begin println("hello"); 3 end, would only occur during macro expansion?

 Also, just to beat this poor dead horse into the ground, to get the
 behavior I wanted,
 get rid of the splice, get rid of the splat and pass a single vector
 parameter to the
 macro and then eval it there.  Now that's the behavior I wanted but
 performance is
 another issue.  How would I reason about the relative performance here?

 macro hornervec(x, c)
     p = eval(c)
     ex = esc(p[end])
     for i = length(p)-1:-1:1
         ex = :($(esc(p[i])) + t * $ex)
     end
     Expr(:block, :(t = $(esc(x))), ex)
 end


 On Saturday, August 30, 2014 12:42:11 AM UTC-7, Steven G. Johnson wrote:

 The answer is related to your splicing questions.  What gets passed to
 the macro is not the value of the argument, but rather the symbolic
 expression of the argument.  If I didn't use a temporary variable, that
 symbolic expression would get inserted multiple times into the polynomial
 evaluation.  This is not what you want because it means the expression
 could be evaluated multiple times.


 Try passing an expression with a side effect and you'll see what I
 mean:

 @horner(begin
     printf("hello")
     3
 end, 4, 5, 6, 7)


 Whoops, I mean println, not printf.  And I mean, try passing it to a
 version of horner that does not use a temporary variable.






[julia-users] Pkg.update() error

2014-09-02 Thread xiongjieyi
When I run Pkg.update() in the version 0.3 release, I get this problem:

julia> versioninfo()
Julia Version 0.3.1-pre+41
Commit 77c15f9* (2014-08-31 17:39 UTC)
Platform Info:
  System: Linux (x86_64-redhat-linux)
  CPU: Intel(R) Xeon(R) CPU E7- 4830  @ 2.13GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT NO_AFFINITY NEHALEM)
  LAPACK: libopenblas
  LIBM: libopenlibm
  LLVM: libLLVM-3.3

julia> Pkg.update()
INFO: Updating METADATA...
INFO: Computing changes...
ERROR: Rmath's requirements can't be satisfied because of the following 
fixed packages: julia
 in error at error.jl:22
 in resolve at ./pkg/entry.jl:372
 in update at ./pkg/entry.jl:278
 in anonymous at pkg/dir.jl:28
 in cd at ./file.jl:20
 in __cd#227__ at ./pkg/dir.jl:28
 in update at ./pkg.jl:41
 in update_3B_1867 at 
/home/JXiong/programs/julia/julia-0.3.0-release/usr/bin/../lib/julia/sys.so


Re: [julia-users] Compressing .jld files

2014-09-02 Thread Tim Holy
Certainly it would be more than welcome in HDF5. If there is call for a 
standalone implementation, that would be fine too.

Best,
--Tim

On Tuesday, September 02, 2014 12:58:24 PM Jake Bolewski wrote:
 It would be best to incorporate it into the HDF5 package.  A julia package
 would be useful if you wanted to do the same sort of compression on Julia
 binary blobs, such as serialized julia values in an IOBuffer.
 
 On Tuesday, September 2, 2014 3:47:33 PM UTC-4, Douglas Bates wrote:
  Would it be reasonable to create a Blosc package or it is best to
  incorporate it directly into the HDF5 package?  If a separate package is
  reasonable I could start on it, as I was the one who suggested this in the
  first place.
  
  On Tuesday, September 2, 2014 2:43:15 PM UTC-5, Tim Holy wrote:
  All these testimonials do make it sound promising. Even three-fold
  compression
  is a pretty big deal.
  
  One disadvantage to compression is that it makes mmap impossible. But,
  since
  HDF5 supports hyperslabs, that's not as big a deal as it would have been.
  
  --Tim
  
  On Tuesday, September 02, 2014 12:11:55 PM Jake Bolewski wrote:
   I've used Blosc in the past with great success.  Oftentimes it is
  
  faster
  
   than the uncompressed version if IO is the bottleneck.  The compression
   ratios are not great but that is really not the point.
   
   On Tuesday, September 2, 2014 2:09:20 PM UTC-4, Stefan Karpinski wrote:
That looks pretty sweet. It seems to avoid a lot of the pitfalls of
naively compressing data files while still getting the benefits. It
  
  would
  
be great to support that in JLD, maybe even turned on by default.


On Tue, Sep 2, 2014 at 1:35 PM, Kevin Squire kevin@gmail.com

javascript: wrote:
Just to hype blosc a little more, see

http://www.blosc.org/blosc-in-depth.html

The main feature is that data is chunked so that the compressed
  
  chunk
  
size fits into L1 cache, and is then decompressed and used there.
   
   There
   
are a few more buzzwords (multithreading, simd) in the link above.
  
  Worth
  
exploring where this might be useful in Julia.

Cheers,

  Kevin

On Tuesday, September 2, 2014, Tim Holy tim@gmail.com
  
  javascript:
  
wrote:
HDF5/JLD does support compression:
  https://github.com/timholy/HDF5.jl/blob/master/doc/hdf5.md#reading-and-w
  
riting-data

But it's not turned on by default. Matlab uses compression by
  
  default,
  
and
I've found it's a huge bottleneck in terms of performance
(
  
  http://www.mathworks.com/matlabcentral/fileexchange/39721-save-mat-files
  
-more-quickly). But perhaps there's a good middle ground. It would
  
  take
  
someone
doing a little experimentation to see what the compromises are.

--Tim

On Tuesday, September 02, 2014 08:30:39 AM Douglas Bates wrote:
 Now that the JLD format can handle DataFrame objects I would like
  
  to
  
switch

 from storing data sets in .RData format to .jld format.  Datasets

stored in

 .RData format are compressed after they are written.  The default
 compression is gzip.  Bzip2 and xz compression are also
  
  available.
  
 The
 compression can make a substantial difference in the file size
  
  because
  
the

 data values are often highly repetitive.
 
 JLD is different in scope in that .jld files can be queried using

external

 programs like h5ls and the files can have new data added or
  
  existing
  
data

 edited or removed.  The .RData format is an archival format.
   
   Once the
   
file

 is written it cannot be modified in place.
 
 Given these differences I can appreciate that JLD files are not

compressed.

  Nevertheless I think it would be useful to adopt a convention in
  
  the
  
JLD

 module for accessing data from files with a .jld.xz or .jld.7z

extension.

  It could be as simple as uncompressing the files in a temporary

directory,

 reading then removing, or it could be more sophisticated.  I
  
  notice
  
that my

 versions of libjulia.so on an Ubuntu 64-bit system are linked
  
  against
  
both

 libz.so and liblzma.so
 

Re: [julia-users] live plotting in PyPlot.jl?

2014-09-02 Thread Sheehan Olver
I tried Interact.jl, and it's really fun!  Here is code that does a contour 
plot of Helmholtz with a slider for different wave numbers, where ny is the 
discretization in y.  (nx = ∞, which means adaptive). 


Pkg.add("Interact")
Pkg.add("Gadfly")
Pkg.add("ApproxFun")
Pkg.checkout("ApproxFun")
using ApproxFun,Interact


d=Interval()⊗Interval()
B=dirichlet(d)
Δ=lap(d)

@manipulate for k=-100.0:.01:2000.0,ny=10:200
    contour(pdesolve([B,Δ+k*I],[1.,1.,1.,1.],ny))
end


On 1 Sep 2014, at 8:21 pm, Shashi Gowda shashigowd...@gmail.com wrote:

 @Sheehan
 
 There is now Interact.jl (Pkg.add(Interact)) which lets you travel your 
 for-loops with sliders and such widgets, to put it one way. Here's an example 
 notebook showing how you can do interactive plotting with Gadfly or PyPlot: 
 http://nbviewer.ipython.org/github/JuliaLang/Interact.jl/blob/master/doc/notebooks/Interactive%20Plotting.ipynb
 
 
 
 On Mon, Sep 1, 2014 at 12:28 PM, Sheehan Olver dlfivefi...@gmail.com wrote:
 
   Got GLPlot working, it's awesome!  The following does a movie of a 
 solution to the wave equation on a square using the latest version of ApproxFun 
 (the color is weird since I haven't figured that part out yet):
 
 
 
 window = createdisplay(w=1000,h=1000,eyeposition=Vec3(1,1,1), lookat=Vec3(0))
 
 function zcolor(i, j, t)
     x      = float32(i - 0.5)
     z      = float32(j - 0.5)
     radius = sqrt((x * x) + (z * z))

     r = sin(10.0f0 * radius + t)
     g = cos(10.0f0 * radius + t)
     b = radius
     return Vec4(r, g, b, 1)
 end
 yy=xx=-1.:.05:1.
 N=length(xx)
 color = [zcolor(i/N, j/N, 15) for i=1:N, j=1:N];
 
 
 h=0.01;
 u0=TensorFun((x,y)->exp(-10x.^2-20(y-.1).^2))
 d=Interval()⊗Interval()
 L=I-h^2*lap(d);
 B=dirichlet(d);
 S=schurfact([B,L],80);

 u=TensorFun[]     # time steps are pushed onto this below
 push!(u,u0)
 push!(u,u0)
 n=2;
 
 
 GLPlot.glClearColor(1,1,1,0)
 m=400
 for k=n+1:n+m
     push!(u, S\[zeros(4),2u[k-1]-u[k-2]])   # u[k] is the new time step
     vals=[u[k][x,y] for x in xx,y in yy];
     texdata=map(Vec1,vals)
     obj = glplot(texdata, primitive=SURFACE(), color=color);
     GLPlot.glClear(GLPlot.GL_COLOR_BUFFER_BIT | GLPlot.GL_DEPTH_BUFFER_BIT)
     render(obj)
     yield()
     GLFW.SwapBuffers(window.glfwWindow)
 end
 n+=m
 
 On 21 Aug 2014, at 9:51 am, Simon Danisch sdani...@gmail.com wrote:
 
 Good to hear.
 The test looks funny, as I overlay everything GLPlot is able to do ;)
 I should remove example.jl, as it uses legacy code.
 I'm not sure about the surface example, though... Did you change anything?
 This might be due to a change of the output from imread (Images.jl) and 
 shouldn't be a problem if you use your own arrays.
 I'll have a look at this tomorrow...
 
 
 2014-08-20 23:58 GMT+02:00 Sheehan Olver dlfivefi...@gmail.com:
 OK I rebuilt julia and cleared my .julia folder, which seems to have cleared 
 that issue.  Pkg.test("GLPlot") seems to work, though the output looks 
 funny.   I also get the following:
 
 julia> include("surface.jl")
 INFO: loaded GLFW 3.0.4 Cocoa NSGL chdir menubar dynamic from 
 /Users/solver/.julia/v0.4/GLFW/deps/usr64/lib/libglfw
 ERROR: Color Format RGBA not supported
  in error at error.jl:21
  in Texture at /Users/solver/.julia/v0.4/GLAbstraction/src/GLTexture.jl:156
  in include at ./boot.jl:245
  in include_from_node1 at ./loading.jl:128
 while loading /Users/solver/.julia/v0.4/GLPlot/example/surface.jl, in 
 expression starting on line 24
 
 julia> include("example.jl")
 INFO: loaded GLFW 3.0.4 Cocoa NSGL chdir menubar dynamic from 
 /Users/solver/.julia/v0.4/GLFW/deps/usr64/lib/libglfw
 ERROR: Cam not defined
  in include at ./boot.jl:245
  in include_from_node1 at ./loading.jl:128
 while loading /Users/solver/.julia/v0.4/GLPlot/example/example.jl, in 
 expression starting on line 25
 
 
 On 20 Aug 2014, at 9:37 pm, Tim Holy tim.h...@gmail.com wrote:
 
  You might need a Pkg.update(), or Pkg.build("Images") if the update doesn't
  solve it.
 
  --Tim
 
  On Wednesday, August 20, 2014 09:23:16 PM Sheehan Olver wrote:
  OK Now I get
 
  could not open file
  /Users/solver/.julia/v0.3/Images/src/ioformats/../../deps/deps.jl while
  loading /Users/solver/.julia/v0.3/Images/src/ioformats/libmagickwand.jl, 
  in
  expression starting on line 24 while loading
  /Users/solver/.julia/v0.3/Images/src/Images.jl, in expression starting on
  line 38 while loading
  /Users/solver/.julia/v0.3/GLAbstraction/src/GLTexture.jl, in expression
  starting on line 1 while loading
  /Users/solver/.julia/v0.3/GLAbstraction/src/GLTypes.jl, in expression
  starting on line 40 while loading
  /Users/solver/.julia/v0.3/GLAbstraction/src/GLAbstraction.jl, in 
  expression
  starting on line 8
  On 20 Aug 2014, at 9:06 pm, Simon Danisch sdani...@gmail.com wrote:
  Yes I do =)
  You need to install Images.jl properly with its dependencies.
  https://github.com/timholy/Images.jl
  I should at some point load this conditionally, as you don't really need
  Images.jl as long as you don't read images from your HD.
 
 
  2014-08-20 13:01 

[julia-users] Re: FastGauss.jl: fast computation of Gauss quadrature rules

2014-09-02 Thread Jason Merrill
On Tuesday, September 2, 2014 5:57:43 AM UTC-7, Alex Townsend wrote:



 On Tuesday, 2 September 2014 03:00:10 UTC-4, Jason Merrill wrote:

 On Monday, September 1, 2014 2:33:31 PM UTC-7, Alex Townsend wrote:

 I have written a package FastGauss.jl available here: 

 https://github.com/ajt60gaibb/FastGauss.jl


 I am a Julia beginner (only been learning for 2 weeks) so I am assuming 
 the code can be 
 improved in a million and one ways. Please tell me if I've done 
 something that Julia does 
 not like. I am not sure if it is appropriate to make this an official 
 package.

  
 One thing to look out for is making sure your functions have consistent 
 return types. E.g. in 
 https://github.com/ajt60gaibb/FastGauss.jl/blob/91e2ac656b856876563d5aacf7b5a405e068b3da/src/GaussLobatto.jl#L4
  
 you have

 Thanks! I tried to get the return types consistent, but obviously missed a 
 few. I've been trying to use @code_typed to tell me this 
 information, but reading the output is a little difficult (at the moment). 


I think the community as a whole would like to see better tooling around 
finding and fixing this kind of soft bug.

You might check out https://github.com/tonyhffong/Lint.jl, and 
https://github.com/astrieanna/TypeCheck.jl. I haven't tried either of them 
myself yet, but I've heard people say good things about both of them.
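
For what it's worth, a quick sketch of what running those checks might look 
like (function names are taken from the two READMEs, so treat the exact APIs 
as assumptions):

using FastGauss

using TypeCheck
checkreturntypes(FastGauss)      # lists functions whose branches infer different return types

using Lint
lintfile("src/GaussLobatto.jl")  # general consistency/style warnings for a single file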
 


Re: [julia-users] trouble after updating Julia

2014-09-02 Thread Tony Kelman
Ivar described the situation to a T. We have pretty good coverage of 
problems everywhere except Windows thanks to Travis. Occasionally things 
pop up that are FreeBSD specific, but that's not as common or hard to fix. 
I've been working sporadically 
on https://github.com/JuliaLang/julia/pull/6028 since the start of March. I 
opened that PR just 2 weeks after my first exposure to Julia. Unfortunately 
there were only a couple of days when I could get Julia to build and 
run all the tests within 30 minutes on AppVeyor. Both Julia and the 
AppVeyor VMs have gotten slower since then.

Currently downloading and running the binary installer, then running the 
tests can pass on 32-bit Windows in just under 30 minutes on an AppVeyor 
VM, but 64 bit has a current failure (backtrace test) and would time out if 
it didn't. We could split the tests up in a matrix, but you need to build 
the binaries somewhat more regularly than the current nightlies for it to 
be useful for CI purposes. Doing "make dist" in AppVeyor can't finish 
within 30 minutes either, even using binary dependencies for everything.

I was playing with a different CI service called Wercker that doesn't run 
on pull requests and doesn't have build matrix functionality, but crucially 
caches dependencies and allows you to pre-build custom build boxes (so you 
can for example upgrade the VM to Ubuntu 14.04, install a bunch of 
customized packages without needing to wait for apt-get on every build, 
etc). With all the dependencies cached, it is actually possible to 
cross-compile both 32 bit and 64 bit Julia in about 20-25 minutes. I had a 
temporary location I was able to upload some test binaries to, but would 
need something with larger capacity (probably configured to throw away all 
but the most recent few builds?) to set up this kind of thing permanently. 
We could figure out some kind of webhook or something to send a 
notification from wherever the cross-compile build happens and uploads a 
binary to trigger an AppVeyor build to install and run the tests. I was a 
little hesitant to give AWS my credit card info (poor grad student here), 
otherwise I would set up a trial on S3.

There's another one of these CI services called Shippable that I also 
looked at, it's Docker-based and can run on pull requests (I think) and do 
build matrices, but they don't have the same build environment 
customization features as Wercker does (yet). If they eventually let you 
provide your own custom Docker image that would be perfect for 
cross-compiling binaries, just have to find someplace to upload them to.

If someone has the resources and abilities to set up a machine farm, I'm 
more than willing to share my "build Julia for Windows as quickly as 
possible" scripts. The hard part is the farm, since we aren't Mozilla.

-Tony


On Monday, September 1, 2014 10:40:13 AM UTC-7, Ivar Nesje wrote:

 Travis also runs on Pull Requests, so we get a warning as long as the 
 tests cover the issue on Linux or OSX. It would be really great to test on 
 Windows and different OS versions as well, but afaik Travis does not 
 support more than we do now. There have been some suggestions about using 
 AppVeyor for Windows testing, but there are some issues with their VMs 
 being too slow to complete within their 30-minute time limit.

 I don't understand how "really simple" can describe a machine farm, but 
 everybody would be really happy if we could get more extensive build 
 statuses on pull requests. Unfortunately it requires a nonzero amount of 
 maintenance, and someone will actually have to do that work and set it up.

 Regards
 Ivar

 kl. 16:59:45 UTC+2 mandag 1. september 2014 skrev Dan Luu følgende:

 Something that might help prevent issues like this in the future is 
 using something like bors (http://graydon.livejournal.com/186550.html) 
 instead of Travis for this kind of thing, since Travis only notifies 
 people after the failure. Rust uses this, and I like it a lot. IIRC, 
 it's prevented me from breaking a test on some obscure platform I 
 don't own at least once. 

 If you don't want to read all of the text in the link, the idea is 
 really simple: when someone creates a pull request, tests get run on 
 some machine farm. Instead of having maintainers merge pull requests, 
 they approve them. If a PR is approved and tests pass, the PR will get 
 merged (with some logic to make sure nothing can fail due to a race 
 condition). 

 On Mon, Sep 1, 2014 at 3:36 AM, Kevin Squire kevin@gmail.com 
 wrote: 
  See https://github.com/JuliaLang/julia/issues/8200. 
  
  
  On Sun, Aug 31, 2014 at 7:45 PM, Dan Luu dan...@gmail.com wrote: 
  
  I'm also having problems, and I wonder if I've run into the same 
 issue. 
  
  When I updated Julia today on my Mac (10.9.2), I got the following 
 error: 
  
  /bin/sh: line 1: 23089 

Re: [julia-users] live plotting in PyPlot.jl?

2014-09-02 Thread Simon Danisch
Good to hear =)
The plot does look really weird... Should it look like the other plots?



2014-09-03 0:51 GMT+02:00 Sheehan Olver dlfivefi...@gmail.com:

 I tried Interact.jl, and it's really fun!  Here is code that does a
 contour plot of Helmholtz with a slider for different wave numbers, where
 ny is the discretization in y.  (nx = ∞, which means adaptive).


 Pkg.add("Interact")
 Pkg.add("Gadfly")
 Pkg.add("ApproxFun")
 Pkg.checkout("ApproxFun")
 using ApproxFun,Interact


 d=Interval()⊗Interval()
 B=dirichlet(d)
 Δ=lap(d)

 @manipulate for k=-100.0:.01:2000.0,ny=10:200
 contour(pdesolve([B,Δ+k*I],[1.,1.,1.,1.],ny))
 end
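
 A stripped-down @manipulate sketch that needs only Interact and Gadfly (no
 ApproxFun, untested here, and assuming it runs in an IJulia notebook) would
 look roughly like this:

 using Interact, Gadfly
 # Moving the slider re-evaluates the loop body and re-draws the curve.
 @manipulate for k = 1:0.1:10
     plot(x -> sin(k*x), 0, 2pi)
 end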


 On 1 Sep 2014, at 8:21 pm, Shashi Gowda shashigowd...@gmail.com wrote:

 @Sheehan

 There is now Interact.jl (Pkg.add("Interact")) which lets you travel your
 for-loops with sliders and such widgets, to put it one way. Here's an
 example notebook showing how you can do interactive plotting with Gadfly or
 PyPlot:
 http://nbviewer.ipython.org/github/JuliaLang/Interact.jl/blob/master/doc/notebooks/Interactive%20Plotting.ipynb



 On Mon, Sep 1, 2014 at 12:28 PM, Sheehan Olver dlfivefi...@gmail.com
 wrote:


 Got GLPlot working, it's awesome!  The following does a movie of a
 solution to wave equation on a square using latest version of ApproxFun
 (the color is weird since I haven't figured that part out yet):



 window = createdisplay(w=1000,h=1000,eyeposition=Vec3(1,1,1),
 lookat=Vec3(0))

 function zcolor(i, j, t)
 x = float32(i - 0.5)
  z = float32(j - 0.5)
 radius = sqrt((x * x) + (z * z))

 r = sin(10.0f0 * radius + t)
 g = cos(10.0f0 * radius + t)
 b = radius
 return Vec4(r,g,b, 1)
 end
 yy=xx=-1.:.05:1.
 N=length(xx)
 color = [zcolor(i/N, j/N, 15) for i=1:N, j=1:N];


 h=0.01;
 u0=TensorFun((x,y)->exp(-10x.^2-20(y-.1).^2))
 d=Interval()⊗Interval()
 L=I-h^2*lap(d);
 B=dirichlet(d);
 S=schurfact([B,L],80);

 u=TensorFun[u0,u0]    # first two time steps; later steps are appended with push!
 n=2;


 GLPlot.glClearColor(1,1,1,0)
 m=400
 for k=n+1:n+m
 push!(u,S\[zeros(4),2u[k-1]-u[k-2]])
 vals=[u[k][x,y] for x in xx,y in yy];
 texdata=map(Vec1,vals)
 obj   = glplot(texdata, primitive=SURFACE(), color=color);
 GLPlot.glClear(GLPlot.GL_COLOR_BUFFER_BIT |
 GLPlot.GL_DEPTH_BUFFER_BIT)
 render(obj)
 yield()
  GLFW.SwapBuffers(window.glfwWindow)
 end
 n+=m

 On 21 Aug 2014, at 9:51 am, Simon Danisch sdani...@gmail.com wrote:

 Good to hear.
 The test looks funny, as I overlay everything GLPlot is able to do ;)
 I should remove example.jl, as it uses legacy code.
 I'm not sure about the surface example, though... Did you change anything?
 This might be due to a change of the output from imread (Images.jl) and
 shouldn't be a problem if you use your own arrays.
 I'll have a look at this tomorrow...


 2014-08-20 23:58 GMT+02:00 Sheehan Olver dlfivefi...@gmail.com:

 OK I rebuilt julia and cleared my .julia folder, which seems to have
 cleared that issue.  Pkg.test("GLPlot") seems to work, though the output
 looks funny.   I also get the following:

 julia> include("surface.jl")
 INFO: loaded GLFW 3.0.4 Cocoa NSGL chdir menubar dynamic from
 /Users/solver/.julia/v0.4/GLFW/deps/usr64/lib/libglfw
 ERROR: Color Format RGBA not supported
  in error at error.jl:21
  in Texture at
 /Users/solver/.julia/v0.4/GLAbstraction/src/GLTexture.jl:156
  in include at ./boot.jl:245
  in include_from_node1 at ./loading.jl:128
 while loading /Users/solver/.julia/v0.4/GLPlot/example/surface.jl, in
 expression starting on line 24

 julia> include("example.jl")
 INFO: loaded GLFW 3.0.4 Cocoa NSGL chdir menubar dynamic from
 /Users/solver/.julia/v0.4/GLFW/deps/usr64/lib/libglfw
 ERROR: Cam not defined
  in include at ./boot.jl:245
  in include_from_node1 at ./loading.jl:128
 while loading /Users/solver/.julia/v0.4/GLPlot/example/example.jl, in
 expression starting on line 25


 On 20 Aug 2014, at 9:37 pm, Tim Holy tim.h...@gmail.com wrote:

  You might need a Pkg.update(), or Pkg.build("Images") if the update
 doesn't
  solve it.
 
  --Tim
 
  On Wednesday, August 20, 2014 09:23:16 PM Sheehan Olver wrote:
  OK Now I get
 
  could not open file
  /Users/solver/.julia/v0.3/Images/src/ioformats/../../deps/deps.jl
 while
  loading
 /Users/solver/.julia/v0.3/Images/src/ioformats/libmagickwand.jl, in
  expression starting on line 24 while loading
  /Users/solver/.julia/v0.3/Images/src/Images.jl, in expression
 starting on
  line 38 while loading
  /Users/solver/.julia/v0.3/GLAbstraction/src/GLTexture.jl, in
 expression
  starting on line 1 while loading
  /Users/solver/.julia/v0.3/GLAbstraction/src/GLTypes.jl, in expression
  starting on line 40 while loading
  /Users/solver/.julia/v0.3/GLAbstraction/src/GLAbstraction.jl, in
 expression
  starting on line 8
  On 20 Aug 2014, at 9:06 pm, Simon Danisch sdani...@gmail.com wrote:
  Yes I do =)
  You need to install Images.jl properly with its dependency.
  https://github.com/timholy/Images.jl
  I should at some point load this conditional, as you don't really
 need
  

[julia-users] Re: FastGauss.jl: fast computation of Gauss quadrature rules

2014-09-02 Thread Alex Townsend
Both those tools look great. Trying them now.  Thanks a bunch. 
Alex 

On Tuesday, 2 September 2014 19:16:36 UTC-4, Jason Merrill wrote:

 On Tuesday, September 2, 2014 5:57:43 AM UTC-7, Alex Townsend wrote:



 On Tuesday, 2 September 2014 03:00:10 UTC-4, Jason Merrill wrote:

 On Monday, September 1, 2014 2:33:31 PM UTC-7, Alex Townsend wrote:

 I have written a package FastGauss.jl available here: 

 https://github.com/ajt60gaibb/FastGauss.jl


 I am a Julia beginner (only been learning for 2 weeks) so I am assuming 
 the code can be 
 improved in a million and one ways. Please tell me if I've done 
 something that Julia does 
 not like. I am not sure if it is appropriate to make this an official 
 package.

  
 One thing to look out for is making sure your functions have consistent 
 return types. E.g. in 
 https://github.com/ajt60gaibb/FastGauss.jl/blob/91e2ac656b856876563d5aacf7b5a405e068b3da/src/GaussLobatto.jl#L4
  
 you have

 Thanks! I tried to get the return types consistent, but obviously missed 
 a few. I've been trying to use @code_typed to tell me this 
 information, but reading the output is a little difficult (at the 
 moment). 


 I think the community as a whole would like to see better tooling around 
 finding and fixing this kind of soft bug.

 You might check out https://github.com/tonyhffong/Lint.jl, and 
 https://github.com/astrieanna/TypeCheck.jl. I haven't tried either of 
 them myself yet, but I've heard people say good things about both of them.
  



[julia-users] Re: live plotting in PyPlot.jl?

2014-09-02 Thread Simon Danisch
I fixed a little error, so you should check out the newest commit!

Am Dienstag, 19. August 2014 07:46:22 UTC+2 schrieb Sheehan Olver:

 Hi,

 Is there a way to force plotting in PyPlot.jl, to simulate animation? 
  Right now if I do a for loop over a sequence of plots, it only outputs the 
 last plot.

 This is in IJulia running on OS X with matplotlib version 1.3.1 installed, 
 and pygui(true)

 Sheehan
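
One pattern that forces a redraw on every pass through the loop (a minimal
sketch, assuming pygui(true) with a working desktop matplotlib backend rather
than inline IJulia output, and an arbitrary pause length) is:

using PyPlot
pygui(true)                     # plot into a desktop window, not inline
x = linspace(0, 2pi, 200)
for t in 0:0.1:5
    clf()                       # clear the previous frame
    plot(x, sin(x .- t))
    draw()                      # ask matplotlib to repaint the figure
    sleep(0.05)                 # give the GUI event loop time to update
end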



[julia-users] build.jl for tests

2014-09-02 Thread Peter Zion
Hi there,

Apologies first if this is clearly documented somewhere.  I'm new to Julia 
so I'm still getting used to things like package management.

I am building new MongoDB bindings for Julia (the existing ones at 
pkg.julialang.org appear to be abandoned) and I was hoping I would be able 
to make use of Julia's great testing framework.

In order to do this well I believe that I need to specify a build.jl that 
only applies to the tests. The specific case is that the bindings module 
only needs to build the Mongo C client library to run, but in order to test 
the client you need to build MongoDB itself.  Then, the test would run an 
instance of the database using a temporary directory etc. and test against 
that.

Is there a way to specify a build.jl that is only used for testing?  Or 
do I currently have to add MongoDB itself to the top-level deps/build.jl?

Thanks in advance for any help!
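
One workaround while Pkg has no test-only build step is to do the heavy setup 
from the test script itself and gate it behind an opt-in flag. A rough sketch 
(the file names, port number, and environment variable are made up for 
illustration, and it assumes a mongod binary is already on the PATH):

# test/runtests.jl
using Base.Test

include("client_tests.jl")      # tests that need only the Mongo C client

# Integration tests need a running MongoDB server; run them only when the
# caller (a CI script or a developer) opts in explicitly.
if get(ENV, "MONGO_INTEGRATION_TESTS", "false") == "true"
    dbpath = mktempdir()                            # throwaway data directory
    server = spawn(`mongod --dbpath $dbpath --port 27999 --nojournal`)
    sleep(2)                                        # crude wait for startup
    try
        include("integration_tests.jl")
    finally
        kill(server)
        run(`rm -rf $dbpath`)                       # Unix-only cleanup
    end
end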



Re: [julia-users] build.jl for tests

2014-09-02 Thread Kevin Squire
Assuming you're using the Travis testing framework, one way to handle this
would be to modify the .travis.yml to install MongoDB.  For the Linux
build, it would be easiest if you installed the Debian package, although it
seems that Mongo itself also has packages you could install.

Here's an example using VideoIO which installs various libav packages:

https://github.com/kmsquire/VideoIO.jl/blob/master/.travis.yml

If you haven't done so yet, you'll also have to configure the repository to
use Travis testing.

The config above is only for Linux testing.  It should be possible to
configure something for OSX, but I'm not sure how that's done.  Windows
testing is less common, but I think a few packages are set up using
Appveyor.

Cheers,
   Kevin


On Tue, Sep 2, 2014 at 7:42 PM, Peter Zion peter.z...@gmail.com wrote:

 Hi there,

 Apologies first if this is clearly documented somewhere.  I'm new to Julia
 so I'm still getting used to things like package management.

 I am building new MongoDB bindings for Julia (the existing ones at
 pkg.julialang.org appear to be abandoned) and I was hoping I would be
 able to make use of Julia's great testing framework.

 In order to do this well I believe that I need to specify a build.jl that
 only applies to the tests. The specific case is that the bindings module
 only needs to build the Mongo C client library to run, but in order to test
 the client you need to build MongoDB itself.  Then, the test would run an
 instance of the database using a temporary directory etc. and test against
 that.

 Is there a way to specify a build.jl that is only used for testing?  Or
 do I currently have to add MongoDB itself to the top-level deps/build.jl?

 Thanks in advance for any help!




Re: [julia-users] build.jl for tests

2014-09-02 Thread Isaiah Norton
Depending on your goals and how far along you are, it may be worth forking
the project and updating the existing code instead of starting from
scratch. Once you get going, make any necessary updates to 0.3
compatibility, etc., then the METADATA URL could be changed to point to
your fork. You can always commit to your own fork, but one suggested step
is to open an issue on the existing repo to see if the authors respond and
are amenable to giving you commit access or transferring the repo (so that
issues are preserved). Often authors are quite happy to pass on a dormant
project to a new maintainer.

(do we have a Julia database organization this could go to for longer-term
conservatorship?)


On Tue, Sep 2, 2014 at 10:42 PM, Peter Zion peter.z...@gmail.com wrote:

 Hi there,

 Apologies first if this is clearly documented somewhere.  I'm new to Julia
 so I'm still getting used to things like package management.

 I am building new MongoDB bindings for Julia (the existing ones at
 pkg.julialang.org appear to be abandoned) and I was hoping I would be
 able to make use of Julia's great testing framework.

 In order to do this well I believe that I need to specify a build.jl that
 only applies to the tests. The specific case is that the bindings module
 only needs to build the Mongo C client library to run, but in order to test
 the client you need to build MongoDB itself.  Then, the test would run an
 instance of the database using a temporary directory etc. and test against
 that.

 Is there a way to specify a build.jl that is only used for testing?  Or
 do I currently have to add MongoDB itself to the top-level deps/build.jl?

 Thanks in advance for any help!




Re: [julia-users] build.jl for tests

2014-09-02 Thread Ivar Nesje
https://github.com/JuliaDB