Re: [julia-users] possible timing regression in A\b

2014-08-23 Thread John Myles White
Perhaps you have a defective BLAS installation for 0.3?

 — John

On Aug 22, 2014, at 11:08 PM, Don MacMillen don.macmil...@gmail.com wrote:

 Hmmm, thx for the experiment.  To clarify, I assumed that the normal behavior 
 would have been closer to the 0.4 time, but I don't have anything earlier.  The 0.3 
 was just updated using apt-get on Ubuntu 14.04 (VM running on Windows).  0.4 was 
 of course cloned and compiled.  Just scrapped and re-installed 0.3 with apt-get, 
 and I still see the same behavior.  (This time remembering to discard the initial 
 compile time.)
 
 julia> @time begin a = rand(5000,5000); b = rand(5000); x = a\b end;
 elapsed time: 30.509950844 seconds (400100776 bytes allocated, 0.11% gc time)
 
 It's a mystery to me.  If I learn anything more, I will let you know.
 Thanks again.
 
 Don
 
 
 
 On Friday, August 22, 2014 9:54:22 PM UTC-7, John Myles White wrote:
 I don’t see this behavior at all on my system. 
 
 After discarding an initial compilation step, here’s what I get: 
 
 0.3 — elapsed time: 3.013803287 seconds (400120776 bytes allocated, 1.77% gc 
 time) 
 0.4 — elapsed time: 2.920384195 seconds (400120776 bytes allocated, 1.89% gc 
 time) 
 
 Also to clarify: do you mean to refer to 0.3 as a regression relative to 0.4? 
 
  — John 
 
 On Aug 22, 2014, at 9:50 PM, Don MacMillen don.ma...@gmail.com wrote: 
 
  Is anyone else seeing the following?  If not, what could I have done to my 
  env to trigger it? 
  
  Thx. 
  
  julia> VERSION 
  v"0.3.0" 
   
  julia> @time begin a = rand(5000,5000); b = rand(5000); x = a\b end; 
  elapsed time: 31.413347385 seconds (440084348 bytes allocated, 0.12% gc time) 
   
  julia> 
   
   
   
  julia> VERSION 
  v"0.4.0-dev+308" 
   
  julia> @time begin a = rand(5000,5000); b = rand(5000); x = a\b end; 
  elapsed time: 1.686715561 seconds (431769824 bytes allocated, 0.87% gc time) 
   
  julia> 
  
 



Re: [julia-users] possible timing regression in A\b

2014-08-23 Thread Don MacMillen
sudo apt-get install libopenblas-dev

julia> @time begin a = rand(5000,5000); b = rand(5000); x = a\b end;
elapsed time: 2.517259033 seconds (400100776 bytes allocated, 1.44% gc time)

Thanks John.

On Friday, August 22, 2014 11:11:09 PM UTC-7, John Myles White wrote:

 Perhaps you have a defective BLAS installation for 0.3?

  — John




Re: [julia-users] why sum(abs(A)) is very slow

2014-08-23 Thread Rafael Fourquet

 There's a complicated limit to when you want to fuse loops – at some point
 multiple iterations becomes better than fused loops and it all depends on
 how much and what kind of work you're doing. In general doing things lazily
 does not cut down on allocation since you have to allocate the
 representation of the operations that you're deferring and close over any
 values that they depend on.

This particular example only works out so well because the iterable is so
 simple that the compiler can eliminate the laziness and do the eager loop
 fused version for you. This will not generally be the case.


Thank you for taking so much time to explain and for your patience!


 You're welcome to experiment (and Julia's type system makes it pretty easy
 to do so), but I think that you'll quickly find that more laziness is not a
 panacea for performance problems.


My question came partly from Python 3 having lazy map and reduce. But having
the choice is good, and in Julia all laziness can be provided now by imap
etc. If someone has a (self-contained) example where lazy element-wise
computation is worse than eager, please post! (I'm interested in
understanding better the limit mentioned above.)
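
For concreteness, here is the kind of difference being discussed (a minimal
sketch using Julia 0.3-era functions; the array size is arbitrary):

A = rand(10^7)

@time sum(abs(A))            # materializes abs(A) as a temporary array, then reduces
@time mapreduce(abs, +, A)   # applies abs and accumulates in one pass, no temporary array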


Re: [julia-users] Manipulations with abstract parametric types

2014-08-23 Thread Tim Holy
You're indeed correct, thanks.

--Tim

On Friday, August 22, 2014 11:48:22 PM Jameson Nash wrote:
 Upon reflection, I realized that your second one wasn't working because it
 is badly posed. Your definition implicitly assumes a lot about the
 structure and fields of any subtype that may not be true, or may have
 multiple answers.
 
 For example, with the following abstract definition:
 
 abstract MyAbstract{T}
 
 I can define all of the following:
 
 type MyType1{T} <: MyAbstract{T} end
 type MyType2 <: MyAbstract{Int} end
 type MyType3{T} <: MyAbstract{Float32} end
 type MyType4{T,S} <: MyAbstract{S} end
 
 In short, the type parameter of a subtype has little or nothing to do with
 the type parameter (or eltype) of the abstract type. Therefore, it is not a
 meaningful statement to write a generic promote_type function in the way
 that you were proposing.
 
 Your original eltype function would have returned the incorrect eltype for
 all of these, but the eltype based upon super, is able to use dispatch to
 return the correct eltype, regardless of how the subtypes are defined.
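 
 A minimal sketch of that dispatch-based approach (Julia 0.3 syntax), checked
 against the four subtype shapes above:
 
 abstract MyAbstract{T}
 
 Base.eltype{T}(::Type{MyAbstract{T}}) = T                  # base case
 Base.eltype{M<:MyAbstract}(::Type{M}) = eltype(super(M))   # climb to the abstract type
 
 type MyType1{T} <: MyAbstract{T} end
 type MyType2 <: MyAbstract{Int} end
 type MyType3{T} <: MyAbstract{Float32} end
 type MyType4{T,S} <: MyAbstract{S} end
 
 eltype(MyType1{Float64})   # Float64
 eltype(MyType2)            # Int
 eltype(MyType3{Char})      # Float32
 eltype(MyType4{Char,Int})  # Int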
 
 On Fri, Aug 22, 2014 at 3:45 PM, Tim Holy tim.h...@gmail.com wrote:
  Got it now, that's a good strategy.
  
  Of course, it's the second one I'm more concerned about.
  
  --Tim
  
  On Friday, August 22, 2014 03:18:19 PM Jameson Nash wrote:
   For the first, you are missing the base case for the recursion, so it
   defaults to eltype(::Any) instead of eltype(::Type{MyAbstract{T}})
   
   On Friday, August 22, 2014, Tim Holy tim.h...@gmail.com wrote:
On Friday, August 22, 2014 01:17:53 PM Jameson Nash wrote:
 Instead of eltype as written, use the following format:
 eltype{T<:...}(::Type{T}) = eltype(super(T))

With this definition:
Base.eltype{T<:MyAbstractType}(::Type{T}) = eltype(super(T))

eltype(MyType{Float32}) yields Any.


And for the second:
julia> f{M<:MyAbstractType}(::Type{M}) = (M.name){Float32}
f (generic function with 1 method)

julia> f(MyType{Float64})
ERROR: type: instantiate_type: expected TypeConstructor, got TypeName

 in f at none:1

It was a good suggestion, it's just that I had already tried it.

--Tim

 This should help with the type inference issue too.
 
 I think M.name is already the type, so you don't need to use eval
 (which doesn't return the right thing anyways, in the general case).
 
 Sorry I can't type more clearly on a phone, but I'm hoping this will
 be enough to get you going in the right direction.
 
 On Friday, August 22, 2014, Tim Holy tim.h...@gmail.com wrote:
  Hi all,
  
  I've come up against a couple of surprising type-manipulation issues and I'm
  wondering if I'm just missing something. Hoping someone out there knows a
  better way of doing these manipulations.
  
  The issues arise when I try writing generic algorithms on abstract parametric
  types that need to make decisions based on both the concrete parameters and
  the concrete type. Let's start with a simple example:
  
  abstract MyAbstractType{T}
  
  immutable MyType{T} <: MyAbstractType{T}
      a::T
      b::T
  end
  
  Now let's write a generic eltype method:
  
  Base.eltype{M<:MyAbstractType}(::Type{M}) = M.parameters[1]
  
  It's a little ugly to have to use M.parameters[1] here. While I'm curious to
  know whether I'm missing a simple alternative, I can live with this.
  
  Now let's define some arithmetic operations:
  
  (*)(x::Number, m::MyType) = MyType(x*m.a, x*m.b)
  (.*)(x::Number, m::MyType) = MyType(x*m.a, x*m.b)
  
  And then try it out:
  julia> y = MyType(3,12)
  
  julia> 2y
  MyType{Int64}(6,24)
  
  julia> 2.1y
  MyType{Float64}(6.301,25.203)
  
  So far, so good. Now let's try something a little more sophisticated:
  julia> A = [MyType(3,12), MyType(4,7)]
  2-element Array{MyType{Int64},1}:
   MyType{Int64}(3,12)
   MyType{Int64}(4,7)
  
  julia> 2A
  2-element Array{Any,1}:
   MyType{Int64}(6,24)
   MyType{Int64}(8,14)
  
  The problem here is we're getting an Array{Any,1} back. No problem you say,
  let's try to help type inference out, by specifying that a
  ::MyType{Int} * ::Float64 is a MyType{Float64}. If we were willing to write
  this using concrete types, it would be easy:
  
  Base.promote_array_type{T<:Real, S}(::Type{T}, ::Type{MyType{S}}) =
      MyType{promote_type(T, S)}
  
  But if we want to just use the abstract type, then I've been unsuccessful in
  avoiding this construction:
  

Re: [julia-users] why sum(abs(A)) is very slow

2014-08-23 Thread gael . mcdon
(I was also thinking about element-wise operations)


Re: [julia-users] needless loss of performance: immutable conflates two concepts

2014-08-23 Thread Adam Strzelecki
 To me the appeal of annotating arguments as read-only (in) or write-only 
 (out) is to catch subtle programmer errors.

I always had the impression that this was the original purpose of such annotations 
in languages such as C++. Optimization based on them came afterwards.

Even though I have mixed feelings about Apple Swift, I think having function arguments 
be immutable by default and requiring an explicit var to make them mutable is a 
brilliant idea. I wish Julia had the same, e.g.

i = 1 # gives constant named i
var j = 2 # gives variable (mutable) named j

Forcing you to type the extra var makes you really think about what you are doing.

 But adding immutable wrappers for all the mutable types in the system and 
 then adding the corresponding methods for both just strikes me as way too 
 much duplication and complexity.

Are you talking about mutable containers or mutable references?

 Maybe we'll come up with a cleaner way to do it. Sometimes these things just 
 need a good long time to think about and then voila! some bright idea emerges.

Please have a look at the ideas behind Apple Swift; some of them are pretty neat. 
Maybe Julia can borrow some of them.

Regards,
--Adam

[julia-users] Available command line arguments when installing Julia on Windows

2014-08-23 Thread stonebig34
Hello, 

I would like to install Julia via the command line on Windows.

Is there an option to specify NOT to create the Start menu entry and shortcut?

So far I have only found the way to specify the target directory:
julia-0.3.0-win32.exe /D=%JULIAROOT%

Regards,


Re: [julia-users] Re: Problem with v 0.3.0 on MacOSX 10.9.4

2014-08-23 Thread Stefan Karpinski
Ah, this old chestnut. Sorry about that – the Stats thing has been a
perennial pain.


On Fri, Aug 22, 2014 at 8:01 PM, Tony Kelman t...@kelman.net wrote:

 Looks like a problem from the old Stats package which got renamed a while
 back. Try

 rm -rf /Users/hs/.julia/.cache/Stats


 On Friday, August 22, 2014 1:52:16 PM UTC-7, Henry Smith wrote:

 Hi,

 Just d/led it and tried it out.  I had a couple of old versions of 0.2.x
 (and still have 0.2.1 installed but trashed the others - some rc's). The
 computer is an iMac with 20 GB of RAM, 2.7 GHz quad i5.

 When I asked for Pkg.status(), it came up with an error, and similarly for
 Pkg.installed() and Pkg.update(). I copy the output below (not too big, I
 hope). I can't figure out what, if anything, I did wrong, and I did not find
 anything about problems on the Mac -- TIA for any help.

 Henry

 Last login: Fri Aug 22 16:07:45 on ttys009
 iMac-162:~ hs$ exec '/Applications/Julia-0.3.0.
 app/Contents/Resources/julia/bin/julia'
                 _
     _       _ _(_)_     |  A fresh approach to technical computing
    (_)     | (_) (_)    |  Documentation: http://docs.julialang.org
     _ _   _| |_  __ _   |  Type "help()" for help.
    | | | | | | |/ _` |  |
    | | |_| | | | (_| |  |  Version 0.3.0 (2014-08-20 20:43 UTC)
   _/ |\__'_|_|_|\__'_|  |  Official http://julialang.org/ release
  |__/                   |  x86_64-apple-darwin13.3.0

 julia> help()

  Welcome to Julia. The full manual is available at

 http://docs.julialang.org

  To get help, try help(function), help("@macro"), or help("variable").
  To search all help text, try apropos("string").

 julia> Pkg.status()
 ERROR: failed process: Process(`git --git-dir=/Users/hs/.julia/.cache/Stats
 merge-base 87d1c8d890962dfcfd0b45b82907464787ac7c64
 8208e29af9f80ef633e50884ffb17cb25a9f5113`, ProcessExited(1)) [1]
  in readbytes at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib
  in readchomp at pkg/git.jl:24
  in installed_version at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib
  in installed at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib
  in status at pkg/entry.jl:107
  in anonymous at pkg/dir.jl:28
  in cd at /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/
 julia/sys.dylib
  in cd at pkg/dir.jl:28
  in status at pkg.jl:28 (repeats 2 times)

 julia> Pkg.installed()
 ERROR: failed process: Process(`git --git-dir=/Users/hs/.julia/.cache/Stats
 merge-base 87d1c8d890962dfcfd0b45b82907464787ac7c64
 8208e29af9f80ef633e50884ffb17cb25a9f5113`, ProcessExited(1)) [1]
  in readbytes at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib
  in readchomp at pkg/git.jl:24
  in installed_version at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib
  in installed at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib (repeats 3 times)
  in anonymous at pkg/dir.jl:28
  in cd at /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/
 julia/sys.dylib
  in cd at pkg/dir.jl:28
  in installed at pkg.jl:25

 julia> Pkg.add("Distributions")
 INFO: Nothing to be done
 INFO: METADATA is out-of-date — you may not have the latest version of
 Distributions
 INFO: Use `Pkg.update()` to get the latest versions of your packages

 julia> 

 julia> Pkg.update()
 INFO: Updating METADATA...
 INFO: Updating cache of IniFile...
 INFO: Updating cache of Cairo...
 INFO: Updating cache of PyPlot...
 INFO: Updating cache of Debug...
 INFO: Updating cache of Calculus...
 INFO: Updating cache of Units...
 INFO: Updating cache of HDF5...
 INFO: Updating cache of ICU...
 INFO: Updating cache of Homebrew...
 INFO: Updating cache of BinDeps...
 INFO: Updating cache of Compose...
 INFO: Updating cache of Color...
 INFO: Updating cache of TimeSeries...
 INFO: Updating cache of Gadfly...
 ERROR: failed process: Process(`git --git-dir=/Users/hs/.julia/.cache/Stats
 merge-base 87d1c8d890962dfcfd0b45b82907464787ac7c64
 8208e29af9f80ef633e50884ffb17cb25a9f5113`, ProcessExited(1)) [1]
  in readbytes at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib
  in readchomp at pkg/git.jl:24
  in installed_version at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib
  in installed at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib
  in update at /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/
 julia/sys.dylib
  in anonymous at pkg/dir.jl:28
  in cd at /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/
 julia/sys.dylib
  in __cd#227__ at /Applications/Julia-0.3.0.app/
 Contents/Resources/julia/lib/julia/sys.dylib
  in update at 
 /Applications/Julia-0.3.0.app/Contents/Resources/julia/lib/julia/sys.dylib
 (repeats 2 times)

 julia> 




Re: [julia-users] possible timing regression in A\b

2014-08-23 Thread Stefan Karpinski
There's a reasonable chance that Debian/Ubuntu has a pessimized OpenBLAS
for the sake of portability.


On Sat, Aug 23, 2014 at 3:18 AM, Don MacMillen don.macmil...@gmail.com
wrote:

 sudo apt-get install libopenblas-dev

  julia> @time begin a = rand(5000,5000); b = rand(5000); x = a\b end;
  elapsed time: 2.517259033 seconds (400100776 bytes allocated, 1.44% gc time)

 Thanks John.






[julia-users] What's new in 0.3?

2014-08-23 Thread Ed Scheinerman
Is there a document describing new features and significant changes between 
versions 0.2 and 0.3? 

One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated as true, 
but in 0.3 it's false. 


[julia-users] Re: What's new in 0.3?

2014-08-23 Thread Valentin Churavy
There is https://github.com/JuliaLang/julia/blob/v0.3.0/NEWS.md 

On Saturday, 23 August 2014 15:02:56 UTC+2, Ed Scheinerman wrote:

 Is there a document describing new features and significant changes 
 between versions 0.2 and 0.3? 

 One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated as 
 true, but in 0.3 it's false. 



Re: [julia-users] possible timing regression in A\b

2014-08-23 Thread Milan Bouchet-Valat
Le samedi 23 août 2014 à 08:55 -0400, Stefan Karpinski a écrit :
 There's a reasonable chance that Debian/Ubuntu has a pessimized
 OpenBLAS for the sake of portability.
It's also possible that the Debian package (these are the nightlies,
right?) should depend on (or recommend) openblas-devel, and not only
openblas, since Julia is currently not able to dlopen() a library if
there's only libopenblas.so.0, and no libopenblas.so (#6742). I know I
initially forgot to add this dependency in my RPM package.


Regards

 On Sat, Aug 23, 2014 at 3:18 AM, Don MacMillen don.macmil...@gmail.com wrote:
  sudo apt-get install libopenblas-dev
  
  julia> @time begin a = rand(5000,5000); b = rand(5000); x = a\b end;
  elapsed time: 2.517259033 seconds (400100776 bytes allocated, 1.44% gc time)
  
  Thanks John.
 



[julia-users] Re: ANN: PGF/TikZ packages

2014-08-23 Thread Valentin Churavy
There is also a PGF backend for Compose (which is used by Gadfly) that 
produces quite nice plots.

On Saturday, 23 August 2014 00:22:34 UTC+2, Kaj Wiik wrote:

 Hi!

 I've been using Tikz, so this package is very welcome!

 However,
 using PGFPlots
 a = plot(rand(20), rand(20))
 save("test.pdf", a)
 ERROR: `save` has no method matching save(::ASCIIString, ::TikzPicture)

 Any suggestions?

 This is in Julia 0.3.0 and Ubuntu 14.04.

 Cheers,
 Kaj



Re: [julia-users] possible timing regression in A\b

2014-08-23 Thread Tony Kelman
The interesting question is what was being called that took 30 seconds, and 
why there was no warning. Was it a fallback to a slow system BLAS/LAPACK, 
or a fallback to the generic naive LU in Julia?


On Saturday, August 23, 2014 6:08:16 AM UTC-7, Milan Bouchet-Valat wrote:

 Le samedi 23 août 2014 à 08:55 -0400, Stefan Karpinski a écrit : 
  There's a reasonable chance that Debian/Ubuntu has a pessimized 
  OpenBLAS for the sake of portability. 
 It's also possible that the Debian package (these are the nightlies, 
 right?) should depend on (or recommend) openblas-devel, and not only 
 openblas, since Julia is currently not able to dlopen() a library if 
 there's only libopenblas.so.0, and no libopenblas.so (#6742). I know I 
 initially forgot to add this dependency in my RPM package. 


 Regards 


[julia-users] Re: ANN: PGF/TikZ packages

2014-08-23 Thread Pablo Zubieta
@Kaj Wiik

I believe you need to use:

a = Plot.Linear(rand(20), rand(20))

Checkout here:

http://nbviewer.ipython.org/github/sisl/PGFPlots.jl/blob/master/doc/PGFPlots.ipynb

Cheers,
Pablo


[julia-users] Re: What's new in 0.3?

2014-08-23 Thread Ed Scheinerman
THANKS!

On Saturday, August 23, 2014 9:06:50 AM UTC-4, Valentin Churavy wrote:

 There is https://github.com/JuliaLang/julia/blob/v0.3.0/NEWS.md 

 On Saturday, 23 August 2014 15:02:56 UTC+2, Ed Scheinerman wrote:

 Is there a document describing new features and significant changes 
 between versions 0.2 and 0.3? 

 One item I noticed is that in 0.2 the expression 1:5 == [1:5] evaluated as 
 true, but in 0.3 it's false. 



Re: [julia-users] Available command line arguments when installing Julia on Windows

2014-08-23 Thread Isaiah Norton
There is no option for this right now, but feel free to file an issue so I
don't forget. Right now, what you can do is extract the files by running `7z
x` twice - the first time will create julia-installer.exe, and extracting
that will produce two folders called $PLUGINS and $_OUTDIR; $_OUTDIR
contains Julia and its libraries.
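
For example (a sketch - the installer file name is the one from the message
below, and 7z must be on your PATH):

7z x julia-0.3.0-win32.exe
7z x julia-installer.exe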

We will probably switch to a pure-NSIS installer soon (without the 7-zip
self-extraction step), which will make it possible to add support for such
switches.


On Sat, Aug 23, 2014 at 7:30 AM, stonebi...@gmail.com wrote:

 Hello,

 I would like to install Julia via the command line on Windows.

 Is there an option to specify NOT to create the Start menu entry and shortcut?

 So far I have only found the way to specify the target directory:
 julia-0.3.0-win32.exe /D=%JULIAROOT%

 Regards,



[julia-users] Announcement: Playground.jl

2014-08-23 Thread Rory Finnegan
Hi everyone,

I've published my Playground.jl package 
https://github.com/Rory-Finnegan/Playground.jl to create Julia sandboxes, 
like Python virtual environments, if anyone wants to give it a try.  So far 
I've tested it on Funtoo and Linux Mint, but I'm looking for people to try it 
out on other platforms (like Windows and OS X).

Cheers,
Rory


[julia-users] Possible to remap REPL bindings?

2014-08-23 Thread Sean Mackesey
Is it possible to configure the new 0.3.0 REPL bindings? If not, is this 
planned for the future? This issue is not mentioned in the docs 
http://docs.julialang.org/en/release-0.3/manual/interacting-with-julia/#key-bindings
.


[julia-users] Multivariate Normal versus Multivariate Normal Canon in Distributions package

2014-08-23 Thread asim


I am trying to use the multivariate normal canonical form to draw random 
numbers. According to my understanding, the following two functions should 
give the same result over a large number of draws, but I am getting 
different results. What am I doing wrong? 

function ablRegCoeffPostDraw(xtx::Array{Float64, 2}, xty::Array{Float64, 1},
                             errorVariance::Float64, priorMean::Array{Float64, 1},
                             priorPrec::Array{Float64, 2})
    postPrec = priorPrec + xtx/errorVariance
    postMean = priorPrec*priorMean + xty/errorVariance
    postMean = postPrec\postMean
    rand(MvNormal(postMean, inv(postPrec)))
end

function ablRegCoeffPostDraw1(xtx::Array{Float64, 2}, xty::Array{Float64, 1},
                              errorVariance::Float64, priorMean::Array{Float64, 1},
                              priorPrec::Array{Float64, 2})
    postPrec = priorPrec + xtx/errorVariance
    postMean = priorPrec*priorMean + xty/errorVariance
    postMean = postPrec\postMean
    potential = postPrec*postMean
    rand(MvNormalCanon(potential, postPrec))
end
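
One way to narrow this down (a sketch with toy numbers, assuming the installed
Distributions version exposes mean and cov for both types): the two
parameterizations should describe the same distribution, so the moments can be
compared directly before comparing Monte Carlo draws.

using Distributions

# toy posterior precision and mean
postPrec = [2.0 0.5; 0.5 1.0]
postMean = [1.0, -1.0]

Sigma = inv(postPrec)
Sigma = (Sigma + Sigma')/2          # symmetrize away round-off

d1 = MvNormal(postMean, Sigma)
d2 = MvNormalCanon(postPrec*postMean, postPrec)

println(mean(d1), "  ", mean(d2))   # should agree up to round-off
println(cov(d1), "  ", cov(d2))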


[julia-users] Function with state calling stateless version, or the other way around?

2014-08-23 Thread Jay Kickliter
I have these DSP routines 
https://github.com/JayKickliter/Radio.jl/blob/master/src/Filter/FIRFilter.jl 
that do sample rate conversion. My goal from the beginning was to make the 
code easy to use in a one-off fashion, or in a way that preserves state 
between calls. For some reason that I can't even remember, I started off 
with the state-free versions doing all the heavy lifting. I now think I 
should have done it the other way around. But if anyone else winds up using 
this code, I imagine they will be using the stateless versions most of the 
time. There's a little overhead in creating the state-preserving filter 
objects, which would be wasted if you're only going to use them once. 

What would you do? I've been working on this code for months, and would 
really like to release it. But I'd like to hear some opinions before I go 
and change everything. 


Re: [julia-users] Julia so far

2014-08-23 Thread Isaiah Norton
Speculative questions will get very speculative answers :) My sense is that
the rate of adoption and growth of the community has so far been just about
ideal. Too much growth too soon can actually be a bad thing if the
community and infrastructure can't absorb the growth (overwhelmed mailing
lists that become toxic, PRs hanging too long, etc.), or if it leads to
settling on sub-optimal situations due to opposition to breaking existing
code.

how is the language being received in industry and education, it being
 strong in academe already.


Considering how often the "when will static compilation be ready" question
is asked, the industry interest seems to be robust. Julia has already
gotten various forms of sponsorship and code contributions from a number of
companies.

As far as education, one answer is this list of courses using Julia, which
should see some healthy growth when we start hearing from courses this Fall:

http://julialang.org/teaching/

Are there plans for a formal 1.0 release or will it continue to develop
 incrementally?


I can't speak authoritatively, but I don't believe there are any plans for
this in the next year at least.

Have there been rumors of panic at Matlab HQ or Wolfram Towers?


Can't comment on that.




On Sat, Aug 23, 2014 at 2:51 PM, cormull...@mac.com wrote:

 Hi. I was reading http://julialang.org/blog/2012/02/why-we-created-julia/:

 About two and a half years ago, we set out to create the language of
 our greed. It’s not complete, but it’s time for a 1.0 release...


 and it's about two and a half years since then. (So, happy fifth birthday,
 Julia? :)

 I'd be interested to read the opinions of Julia programmers and developers
 as to how they judge the success of their creation so far. I've seen a
 number of blog posts and watched the conference videos. I noticed the
 growth in Julia users
 http://www.kdnuggets.com/2014/08/four-main-languages-analytics-data-mining-data-science.html
  -
 how is the language being received in industry and education, it being
 strong in academe already. Are there plans for a formal 1.0 release or will
 it continue to develop incrementally? Have there been rumors of panic at
 Matlab HQ or Wolfram Towers?



[julia-users] Function with state calling stateless version, or the other way around?

2014-08-23 Thread Abe Schneider
It may be worth having both around. If you wrap the stateless versions with 
something that keeps state, users get the choice of which one to use.

For example, you could use Tasks to create generators that produce the next 
value. The Task will keep the state for you, which allows you to keep your DSP 
code simple and stateless.
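
A toy sketch of that pattern (Julia 0.3 Tasks with produce; not tied to the actual 
FIRFilter code): the Task's local variables hold the state, so the kernel itself 
stays a plain loop.

function running_mean(xs)
    @task begin
        s, n = 0.0, 0            # state lives inside the task
        for x in xs
            s += x
            n += 1
            produce(s/n)         # yield the mean so far; s and n persist between yields
        end
    end
end

for m in running_mean(rand(5))
    println(m)
end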


[julia-users] Does Julia have something similar to Python's documentation string?

2014-08-23 Thread Xiao FENG

Thanks.


[julia-users] Re: JuliaCon Opening Session Videos Posted!

2014-08-23 Thread Joshua Job
Is there any word on when we may expect the rest of the videos? I'm 
particularly anxious to see the Gadfly session. :D

On Monday, August 11, 2014 4:53:18 AM UTC-7, Jacob Quinn wrote:

 Hey all,

 Gather round and hear the tales of a wondrous new language, presented by Tim Holy 
 https://www.youtube.com/watch?v=FA-1B_amwt8&list=PLP8iPy9hna6TSRouJfvobfxkZFYiPSvPd&index=1, 
 Pontus Stenetorp 
 https://www.youtube.com/watch?v=OrFxjE44COc&list=PLP8iPy9hna6TSRouJfvobfxkZFYiPSvPd&index=2, 
 and Arch Robison 
 https://www.youtube.com/watch?v=GFTCQNYddhs&list=PLP8iPy9hna6TSRouJfvobfxkZFYiPSvPd&index=3.

 Check out the JuliaCon youtube playlist 
 https://www.youtube.com/playlist?list=PLP8iPy9hna6TSRouJfvobfxkZFYiPSvPd
 , Blog post announcement 
 http://julialang.org/blog/2014/08/juliacon-opening-session/, and feel 
 free to jump in on the discussions at /r/programming 
 http://www.reddit.com/r/programming/comments/2d86qh/juliacon_opening_session_videos_released/
  
 and Hacker News https://news.ycombinator.com/item?id=8162869.

 The plan is to release another session of videos every few days, so keep 
 on the lookout for more Julia goodness.

 -Jacob