Re: [julia-users] ANN: Humanize.jl - human-friendly numbers

2014-10-02 Thread Michele Zaffalon
Why is timedelta(70) "a minute" but the last command, timedelta(Date(2014,3,7)
- Date(2013,2,4)), "1 year, 1 month"? Wouldn't it be more consistent
to have "one minute and 10 seconds" in the first case? Besides, 10 seconds out of
one minute is, as a percentage, more than 1 month out of 1 year...
michele
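For readers following along, the coarse rounding being discussed can be sketched in Python; this is a hypothetical illustration of how such humanizers typically truncate to the largest unit, not Humanize.jl's actual code:

```python
def humanize_seconds(secs):
    """Coarsely describe a duration, keeping only the largest unit.

    A hypothetical sketch of the rounding behavior under discussion:
    70 seconds becomes "a minute" because the 10 leftover seconds
    are simply dropped, not reported.
    """
    if secs < 60:
        return f"{secs} seconds"
    minutes = secs // 60
    if minutes < 60:
        return "a minute" if minutes == 1 else f"{minutes} minutes"
    hours = minutes // 60
    return "an hour" if hours == 1 else f"{hours} hours"

print(humanize_seconds(70))     # a minute
print(humanize_seconds(85850))  # 23 hours
```

Whether to report "one minute and 10 seconds" instead is exactly the granularity question raised above.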

On Thu, Oct 2, 2014 at 4:14 AM, Iain Dunning iaindunn...@gmail.com wrote:

 Hi all,

 Announcing...

 *Humanize.jl*
 https://github.com/IainNZ/Humanize.jl

 Humanize numbers, currently:
 * data sizes
 * Date/datetime differences

 This package is MIT licensed, and is based on jmoiron's humanize Python
 library https://github.com/jmoiron/humanize/.

 Installation: Pkg.add("Humanize")

 Examples:

 julia> datasize(3000000)
 "3.0 MB"
 julia> datasize(3000000, style=:bin, format="%.3f")
 "2.861 MiB"
 julia> datasize(3000000, style=:gnu, format="%.3f")
 "2.861M"
 julia> timedelta(70)
 "a minute"
 julia> timedelta(0,0,0,23,50,50)
 "23 hours"
 julia> timedelta(DateTime(2014,2,3,12,11,10) - DateTime(2013,3,7,13,1,20))
 "11 months"
 julia> timedelta(Date(2014,3,7) - Date(2013,2,4))
 "1 year, 1 month"
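The styles differ only in base and suffix set; here is a rough Python sketch of that logic, inferred from the example output rather than taken from Humanize.jl's source:

```python
def datasize(nbytes, style="dec", fmt="{:.1f}"):
    """Render a byte count with human-friendly units.

    A sketch assumed from the announcement's examples:
      dec -> base 1000, suffixes "B", "kB", "MB", ...
      bin -> base 1024, suffixes "B", "KiB", "MiB", ...
      gnu -> base 1024, bare letters "K", "M", ... with no space
    """
    styles = {
        "dec": (1000.0, ["B", "kB", "MB", "GB", "TB"], " "),
        "bin": (1024.0, ["B", "KiB", "MiB", "GiB", "TiB"], " "),
        "gnu": (1024.0, ["B", "K", "M", "G", "T"], ""),
    }
    base, suffixes, sep = styles[style]
    value = float(nbytes)
    for suffix in suffixes:
        if value < base:
            return fmt.format(value) + sep + suffix
        value /= base
    # Fall through: clamp to the largest suffix.
    return fmt.format(value * base) + sep + suffixes[-1]

print(datasize(3000000))                             # 3.0 MB
print(datasize(3000000, style="bin", fmt="{:.3f}"))  # 2.861 MiB
print(datasize(3000000, style="gnu", fmt="{:.3f}"))  # 2.861M
```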



Re: [julia-users] make testall failing

2014-10-02 Thread Elliot Saba
Can you start julia normally?  E.g. can you run `./julia` and get a REPL?
-E


Re: [julia-users] make testall failing

2014-10-02 Thread Kapil
Yes, the julia interpreter starts with ./julia

Regards,
Kapil Agarwal

On Thu, Oct 2, 2014 at 2:23 AM, Elliot Saba staticfl...@gmail.com wrote:

 Can you start julia normally?  E.g. can you run `./julia` and get a REPL?
 -E



Re: [julia-users] make testall failing

2014-10-02 Thread Elliot Saba
What happens if you run `Base.runtests()` inside the REPL?
-E

On Wed, Oct 1, 2014 at 11:25 PM, Kapil kapil6...@gmail.com wrote:

 Yes, the julia interpreter starts with ./julia

 Regards,
 Kapil Agarwal

 On Thu, Oct 2, 2014 at 2:23 AM, Elliot Saba staticfl...@gmail.com wrote:

 Can you start julia normally?  E.g. can you run `./julia` and get a REPL?
 -E





Re: [julia-users] Re: Calling Julia from .NET

2014-10-02 Thread Stefan Babinec
Thanks Isaiah.

It works.

I also tried to pass Julia's system image path to jl_init, and with a little
help from the SetDllDirectory method it works properly when I try to run it in
the console.
***
using System.Runtime.InteropServices;

namespace ConsoleApplication1
{
    class Program
    {
        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool SetDllDirectory(string lpPathName);

        [DllImport("libjulia.dll")]
        static extern void jl_init(string message);

        [DllImport("libjulia.dll")]
        static extern void jl_eval_string(string message);

        static void Main(string[] args)
        {
            SetDllDirectory(@"c:\Users\SB\AppData\Local\Julia-0.3.0\bin\");

            jl_init(@"c:\Users\SB\AppData\Local\Julia-0.3.0\bin\");

            jl_eval_string("print(sqrt(2.0))");
        }
    }
}
***

But when I try to run it directly from VS 2013 (Debug/Release) I get the
following error from jl_init:

System.AccessViolationException:
Attempted to read or write protected memory. This is often an indication that
other memory is corrupt.

A small excerpt from:
http://social.msdn.microsoft.com/Forums/en-US/8789ea67-fbc5-4a7b-a4eb-d4a8a050d5c1/attempt-to-read-or-write-protected-memory-this-is-often-an-indicating-that-other-memory-is-corrupt

This issue shouldn't happen in managed code. The problem is typically 
caused by some component (typically unmanaged, but can also be a bad 
P/Invoke signature) that corrupts the program's memory.

I also tried to suppress this exception by disabling
Tools menu -> Options -> Debugging -> General -> Suppress JIT optimization
on module load,
but with no success.

Any ideas?

Best regards.

Stefan.

On Wednesday, October 1, 2014 12:16:15 AM UTC+2, Isaiah wrote:

 The `>` indicates the cmd prompt working directory:
 ```
 C:\cmn\Julia-0.3.0> bin\ConsoleApplication2.exe
 1.4142135623730951
 ```
 Otherwise, try passing the bin path as a char* to jl_init as suggested.


 On Tue, Sep 30, 2014 at 11:24 AM, Stefan Babinec sysl...@gmail.com 
 javascript: wrote:

 I've copied exe file directly to julia's bin directory Isaiah.

 And I get the above mentioned error when I try to run it in the bin 
 directory.


 On Tuesday, September 30, 2014 4:57:41 PM UTC+2, Isaiah wrote:

 Try running it from the Julia directory as `bin\CommandLine2.exe`. This 
 is very much a minimal example; for general use, the bin directory should 
 be passed as an argument to `jl_init`:

 https://github.com/JuliaLang/julia/blob/1ee440bee5035ccb33f82b8a45febd
 dd2f973baa/src/jlapi.c#L70-L73

 To go much further than this will really require digging into both 
 jlapi.c and the general Julia source code. Be aware that dealing with type 
 translation and garbage collection is non-trivial. See also 
 `examples/embedding.c` in the julia tree, and several previous discussions 
 on the mailing list. 

 On Tue, Sep 30, 2014 at 9:08 AM, Stefan Babinec sysl...@gmail.com 
 wrote:

 Hi Isaiah.

 I tried and got this error:
 System image file  ?l\../lib/julia/sys.ji not found

 System image sys.ji appears to be in its place and I have no problem 
 running Julia.

 Thanks.


 On Tuesday, September 30, 2014 2:03:42 PM UTC+2, Isaiah wrote:

 I should mention that it is necessary to change the project target CPU 
 from the default "Any CPU" to x64 or x86 to match the libjulia architecture.
 On Sep 29, 2014 11:58 PM, Isaiah Norton isaiah...@gmail.com wrote:

 I tried this some time ago during 0.2, so to make sure it still works 
 I made a minimal translation of the embedding example to C#:

 ```
 using System;
 using System.Runtime.InteropServices;

 namespace ConsoleApplication2
 {
 class Program
 {
 [DllImport("libjulia.dll")]
 static extern void jl_init();
 [DllImport("libjulia.dll")]
 static extern void jl_eval_string(string message);

 static void Main(string[] args)
 {
 jl_init();
 jl_eval_string("print(sqrt(2.0))");
 }
 }
 }
 ```

 I compiled this, copied the binary into `Julia-0.3.0\bin`, and it 
 works:

 ```
 C:\cmn\Julia-0.3.0> bin\ConsoleApplication2.exe
 1.4142135623730951
 ```


 On Mon, Sep 29, 2014 at 4:11 PM, Tobias Knopp 
 tobias...@googlemail.com wrote:

 yep, I have done this (mostly for fun) before and it works. One 
 needs some experience with P/Invoke, of course, but it is no magic, just 
 similar to our ccall.

 Cheers,

 Tobi

 Am Montag, 29. September 2014 20:52:10 UTC+2 schrieb Stefan 
 Karpinski:

 I assume that you can call C libraries from .NET, right? The C 
 library for Julia is libjulia – how to call it from C is explained in 
 the 
 embedding docs, calling it from .NET should be similar. 


  On Sep 29, 2014, at 12:37 PM, Guido De Bouver 
 guido.d...@telenet.be wrote: 
  
  I have not found the C# examples, but I have not looked for them. 
 Sorry for that. 
  
  So, any help on this: how could we call Julia from .NET? 
  


  



[julia-users] Re: Simple benchmark improvement question

2014-10-02 Thread dextorious
Thanks for the quick feedback, I appreciate it. I had indeed somehow missed 
Julia's arrays being column-major and taking that into account essentially 
yields equal performance between Numba and Julia in this benchmark. I'll 
compare it to Fortran/C++ just to see how large the difference is, but the 
current performance does seem to be the limit of LLVM-based approaches in 
general. 

To answer Tim Holy's point, Numba does do SIMD vectorization, but I haven't 
tested it extensively in the latest releases. For now, I'll port some of 
the more compact simulations I'm doing to Julia and Python/Numba for a less 
trivial benchmark and see how the performance compares then. 
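As a side note, the column-major point comes down to index arithmetic: iterating so the fastest-varying index matches the memory layout keeps access sequential. A small Python sketch of the two layouts (illustrative only, with hypothetical helper names):

```python
def rowmajor_index(i, j, ncols):
    """Linear offset of element (i, j) in row-major (C/NumPy default) layout."""
    return i * ncols + j

def colmajor_index(i, j, nrows):
    """Linear offset of element (i, j) in column-major (Julia/Fortran) layout."""
    return j * nrows + i

nrows, ncols = 3, 4
# In column-major layout, walking down a column (varying i) is sequential...
col0 = [colmajor_index(i, 0, nrows) for i in range(nrows)]
print(col0)  # [0, 1, 2]
# ...whereas walking along a row (varying j) strides by nrows.
row0 = [colmajor_index(0, j, nrows) for j in range(ncols)]
print(row0)  # [0, 3, 6, 9]
```

Loops whose inner index walks with stride 1 are the cache-friendly ones, which is why swapping loop order closed the gap in the benchmark.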


Re: [julia-users] Re: 2nd Julia meetup in Japan: JuliaTokyo #2

2014-10-02 Thread Sorami Hisamoto
This time we had around 40 participants, about the same as the last
event (JuliaTokyo #1) back in July.

We had audiences from mixed backgrounds: physics, finance,
bioinformatics, adtech, marketing and web engineering, to name a few.

It seems the biggest cluster of people is from the R community, people
doing various kinds of data analysis. There's a monthly R meetup in Japan
called Tokyo.R, where nearly 100 people attend each time, and we do
see Julia come up in the talks quite often at recent events.
https://groups.google.com/forum/#!forum/r-study-tokyo

However, these data analysts are not yet satisfied with Julia as
a quick replacement for R, because of the lack of packages and
documentation.

The difference between that R meetup and our Julia meetup is that
participants in the latter are generally more interested in and familiar
with programming.


On Mon, Sep 29, 2014 at 7:28 PM, Viral Shah vi...@mayin.org wrote:
 Thanks for the summary. How was the turnout? I have been noticing lots of
 Japanese tweets on julia too lately. Do send the summaries - they are fun to
 read!

 -viral


 On Saturday, September 27, 2014 7:16:27 PM UTC+5:30, ther...@gmail.com
 wrote:

 Hi all,

 Today we had our 2nd Julia meetup in Japan, called JuliaTokyo #2.

 Here's the list of presentation slides;
 http://juliatokyo.connpass.com/event/8010/presentation/

 ---

 JuliaTokyo #2 Timetable in English

 # Main Talks
 1. Introductory Session - @sorami
 2. Julia in the Corporation - @QuantixResearch
 3. Hamiltonian Monte Carlo Method with Julia - @bicycle1885
 4. DataFrames.jl - @weda_654
 5. Parallel Computing with Julia - @sfchaos
 6. Toolbox for Julia Development - @yomichi_137

 # Lightning Talks
 1. MeCab.jl (MeCab: Japanese morphological tokenizer) - @chezou
 2. Review of v0.3 release note - yoshifumi_seki
 3. Using BinDeps.jl - @r9y9
 4. Julia Language Anime Character - @kimrin

 ---

 We had a survey for the participants on what kind of languages they use on
 a daily basis. 81 answers (multiple choices allowed), and here's the result;

 rank, language, #people
 01. Python - 50
 02. R - 36
 03. Java - 25
 04. Ruby - 20
 04. C++ - 20
 05. Other - 19
 06. Excel - 18
 07. C - 15
 08. Julia - 14
 09. Visual Basic - 6
 09. Perl - 6
 09. Matlab / Octave - 6
 09. Scala - 6
 10. Fortran - 2
 10. Clojure - 2
 11. F# - 1

 ---

 It seems that Julia is slowly gaining popularity in Japan too!

 - sorami


 btw, the name JuliaTokyo is from Juliana's Tokyo, THE most famous
 disco in Japan back in early 90s.




[julia-users] problem with a macro generating a type with inner constructor

2014-10-02 Thread Ariel Keselman
working on the GeometricPredicates package, I want to generate a few mostly 
similar types like this:

macro mymac(name)
n = name.args[1]
quote
type $n
x::Int64
$n(num::Int64) = new(num)
end
end
end

@mymac(:mynewtype)
println(mynewtype)

It raises an error (mynewtype is undefined) unless I comment out the inner 
constructor. Is this a bug? I couldn't find anything similar in the mailing 
list archives.

any thoughts? Thanks!



[julia-users] Re: Simple benchmark improvement question

2014-10-02 Thread Hans W Borchers
Which version of MATLAB did you use?
Could you publish the MATLAB code as well?
I am interested in testing MATLAB's JIT compiler.
Thanks, Hans Werner


On Thursday, October 2, 2014 1:14:47 PM UTC+2, dexto...@gmail.com wrote:




[julia-users] Re: problem with a macro generating a type with inner constructor

2014-10-02 Thread Toivo Henningsson
Seems like a macro hygiene issue, in two ways :)

- It's a bug in macro hygiene that you don't get that error without the
  inner constructor (apparently it doesn't consider a type definition as a form
  of assignment to the named symbol).
- It can be fixed by escaping the name of the type: n = esc(name.args[1])
  to circumvent hygienization for it.

Btw, any reason that you want to supply a quoted symbol to the macro? It 
seems more straightforward with

macro mymac(name::Symbol)
n = esc(name)
quote
type $n
x::Int64
$n(num::Int64) = new(num)
end
end
end

@mymac(mynewtype)
println(mynewtype)


On Thursday, 2 October 2014 14:05:03 UTC+2, Ariel Keselman wrote:




[julia-users] Re: problem with a macro generating a type with inner constructor

2014-10-02 Thread Ariel Keselman
Thanks, this solved it!

I'm using the unquoted version now; I was just testing different things and 
got a bit confused by this behavior. Should I open an issue about the 
(explicit) constructor-less version?




Re: [julia-users] Re: Calling Julia from .NET

2014-10-02 Thread Isaiah Norton
Hi Stefan, I'm not sure. I guess it might be failing the first time it
tries to map an executable page. You might find some clues on the LLVM
mailing list. But the bigger issue is that the MSVC debugger won't be very
useful, because MSVC doesn't know anything about the JIT'd object format
(still ELF, I think). If you want to use a debugger, you should download the
mingw64 tools mentioned in README.windows and use gdb.
On Oct 2, 2014 3:59 AM, Stefan Babinec sysli...@gmail.com wrote:


[julia-users] Performance tweak - low exponents

2014-10-02 Thread nils . gudat
I have written a bit of code that performs a couple of thousand loops in 
which some variables have to be squared or cubed.
As per Performance Tips - Tweaks 
http://julia.readthedocs.org/en/latest/manual/performance-tips/#tweaks I 
wrote these as x*x instead of x^2, but to my surprise I found that my code 
is actually slower (one iteration taking 12.7 instead of 11.7 seconds on 
average), not faster!

Is this a peculiarity of my problem, or does this tweak refer to a previous 
release of Julia and is now outdated?

(I'm working on Julia 0.3.0)
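For comparison, the same micro-benchmark pattern can be sketched in Python with the standard timeit module; the timings say nothing about Julia 0.3's codegen and only illustrate how one might measure the two forms:

```python
import timeit

def cube_pow(x):
    # Exponentiation form of cubing.
    return x ** 3

def cube_mul(x):
    # Repeated-multiplication form of cubing.
    return x * x * x

# Time both forms on the same input; which one wins depends on the
# language and runtime, which is exactly the question in this thread.
t_pow = timeit.timeit("cube_pow(1.2345)", globals=globals(), number=100_000)
t_mul = timeit.timeit("cube_mul(1.2345)", globals=globals(), number=100_000)
print(f"x**3: {t_pow:.4f}s  x*x*x: {t_mul:.4f}s")

# The two forms must at least agree numerically.
assert abs(cube_pow(1.2345) - cube_mul(1.2345)) < 1e-12
```

In Julia, @code_llvm and @code_native (suggested later in the thread) are the more direct way to see what each form compiles to.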


[julia-users] Re: problem with a macro generating a type with inner constructor

2014-10-02 Thread Toivo Henningsson
On Thursday, 2 October 2014 14:36:55 UTC+2, Ariel Keselman wrote:

 Thanks, this solved it!

 I'm using the unquoted version now, I was just testing different things, 
 got a bit confused by this behavior. Should I open an issue about the 
 (explicit) constructor-less version?



I just created one: https://github.com/JuliaLang/julia/issues/8554



Re: [julia-users] Re: 2nd Julia meetup in Japan: JuliaTokyo #2

2014-10-02 Thread John Myles White
FWIW, I think going after the data analyst community is a losing bet for 
Julia until a few more years have passed. The R community contains very few 
developers, so most of the R community couldn't possibly benefit from a young 
language that needs developers, not users. It's a bad relationship in both 
directions: the R folks don't get something useful out of the Julia 
language in its current state, and the Julia folks don't get something useful 
from the R folks, who generally show up wanting to use code rather than write 
it.

 -- John

On Oct 2, 2014, at 4:52 AM, Sorami Hisamoto therem...@gmail.com wrote:




Re: [julia-users] Re: 2nd Julia meetup in Japan: JuliaTokyo #2

2014-10-02 Thread Sorami Hisamoto
I agree with that.

In our meetups we generally get good reactions from the people who
implement their own algorithms (theoretical physicists, machine
learning researchers, etc.), but not much from the data analysts.

Analysts come to hear about Julia much faster than they did about R, and
then, after learning about the language, they get disappointed, as it's
not mature enough to use off-the-shelf.

On Thu, Oct 2, 2014 at 11:28 PM, John Myles White
johnmyleswh...@gmail.com wrote:




Re: [julia-users] Re: 2nd Julia meetup in Japan: JuliaTokyo #2

2014-10-02 Thread John Myles White
Yeah, this is exactly my experience. When I first got involved with R, I spent 
more time with machine learning folks and less with statisticians. I naively 
assumed that statisticians were as savvy about programming as ML folks, which 
has definitely proven not to be the case. This led me to think Julia would be 
far more useful to statisticians than it has proven to be.

 -- John

On Oct 2, 2014, at 7:40 AM, Sorami Hisamoto therem...@gmail.com wrote:




[julia-users] Re: problem with a macro generating a type with inner constructor

2014-10-02 Thread David Moon
This would not have happened with my suggested reworking of macro hygiene, 
I think.  I am not set up to test it right now.


Re: [julia-users] Re: problem with a macro generating a type with inner constructor

2014-10-02 Thread Stefan Karpinski
I'm in favor of merging that branch and seeing how it goes :-)


 On Oct 2, 2014, at 10:58 AM, David Moon dave_m...@alum.mit.edu wrote:
 


Re: [julia-users] LLVM.js

2014-10-02 Thread JobJob
Any updates on this?

On Friday, 13 December 2013 15:16:31 UTC+2, tshort wrote:

 I've played a little with this. Using Jameson's static compile branch, I 
 was able to dump some functions compiled by Julia to LLVM IR and compile 
 these with Emscripten. I did have to mess with some symbol names because 
 Emscripten doesn't like Julia's naming. See an Emscripten issue here:

 https://github.com/kripken/emscripten/issues/1888

 I also took a quick look at compiling openlibm, and I ran into some 
 nonportable header stuff that would need to be worked on.

 The nice thing about trying to get compiled stuff to run is that you don't 
 necessarily need all of Julia compiled. That means faster downloads, and 
 that we don't have to get everything working at the beginning.

 It'd be great if we could position Julia to be the leading numerical 
 language for the web. With both Firefox and Chrome running asm.js within 2 
 - 4X of native, I think there's lots of opportunity here.



 On Fri, Dec 13, 2013 at 12:22 AM, John Myles White johnmyl...@gmail.com 
 javascript: wrote:

 I think it would also be great to think a bit about how we might use 
 Julia to generate LLVM IR to generate Javascript for certain simple web 
 tasks. Writing Julia code and then letting a package compile it into an 
 includable Javascript file could be really fun.

  — John

 On Dec 12, 2013, at 9:19 PM, Stefan Karpinski ste...@karpinski.org 
 javascript: wrote:

  I’m not sure how practical it really is to wait until runtime to 
 compile your code rather than precompiling it
 
  It's pretty frigging practical, as it turns out. This is great. More 
 work in this direction and we may actually be able to run a full Julia 
 instance in a browser.
 
 
  On Fri, Dec 13, 2013 at 12:14 AM, John Myles White 
 johnmyl...@gmail.com javascript: wrote:
  The Emscripten folks are doing some really cool stuff: 
 http://badassjs.com/post/39573969361/llvm-js-llvm-itself-compiled-to-javascript-via
 
   — John
 
 




Re: [julia-users] Performance tweak - low exponents

2014-10-02 Thread Stefan Karpinski
This depends on the particulars of your code. If you can post the function, we 
can take a look. You can also investigate yourself using the @code_llvm and 
@code_native macros.
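
For a quick self-contained check (independent of the poster's code, whose details we haven't seen), timing both forms after a warm-up run usually settles it:

```julia
# Sketch: compare x^2 against x*x in an otherwise identical hot loop.
function pow_loop(v)          # uses x^2
    s = 0.0
    for x in v
        s += x^2
    end
    return s
end

function mul_loop(v)          # uses x*x
    s = 0.0
    for x in v
        s += x*x
    end
    return s
end

v = rand(10^7)
pow_loop(v); mul_loop(v)      # warm up the JIT before timing
@time pow_loop(v)
@time mul_loop(v)
# @code_llvm pow_loop(v)     # inspect: small literal powers typically lower to multiplies
```

If both timings match, the `x*x` tweak is doing nothing for this code and the slowdown lies elsewhere.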


 On Oct 2, 2014, at 8:47 AM, nils.gu...@gmail.com wrote:
 
 I have written a bit of code that performs a couple of thousand loops in 
 which some variables have to be squared or cubed.
 As per Performance Tips - Tweaks I wrote these as x*x instead of x^2, but to 
 my surprise I found that my code is actually
 slower (one iteration taking 12.7 instead of 11.7 seconds on average), not 
 faster!
 
 Is this a peculiarity of my problem or does this tweak refer to a previous 
 release of Julia and is outdated?
 
 (I'm working on Julia 0.3.0)


Re: [julia-users] make testall failing

2014-10-02 Thread Stefan Karpinski
Please send error output as text rather than images.


 On Oct 2, 2014, at 11:18 AM, Kapil kapil6...@gmail.com wrote:
 
 I have attached the images of the error I get
 
 Regards,
 Kapil Agarwal
 
 On Thu, Oct 2, 2014 at 3:07 AM, Elliot Saba staticfl...@gmail.com wrote:
 What happens if you run `Base.runtests()` inside the REPL?
 -E
 
 On Wed, Oct 1, 2014 at 11:25 PM, Kapil kapil6...@gmail.com wrote:
 Yes, the julia interpreter starts with ./julia
 
 Regards,
 Kapil Agarwal
 
 On Thu, Oct 2, 2014 at 2:23 AM, Elliot Saba staticfl...@gmail.com wrote:
 Can you start julia normally?  E.g. can you run `./julia` and get a REPL?
 -E
 
 julia.png
 julia2.png


Re: [julia-users] ANN: Humanize.jl - human-friendly numbers

2014-10-02 Thread Stefan Karpinski
I was going to ask about that too. Or maybe "about a minute". It seems like 
there are two things going on there: a humanized expression that is still 
precise ("one minute and ten seconds") versus an approximate humanized 
expression ("about a minute"). Separate functions, or maybe an option?
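
A rough sketch of what the option could look like. To be clear, `humandelta` is a made-up name, not part of Humanize.jl:

```julia
# Hypothetical sketch only: one keyword toggling approximate vs precise output.
function humandelta(secs::Integer; approx::Bool=true)
    mins, s = divrem(secs, 60)
    if approx
        return mins == 1 ? "about a minute" : "about $mins minutes"
    end
    parts = String[]
    mins > 0 && push!(parts, mins == 1 ? "1 minute" : "$mins minutes")
    s > 0 && push!(parts, s == 1 ? "1 second" : "$s seconds")
    return join(parts, ", ")
end

humandelta(70)                 # "about a minute"
humandelta(70, approx=false)   # "1 minute, 10 seconds"
```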


 On Oct 2, 2014, at 2:11 AM, Michele Zaffalon michele.zaffa...@gmail.com 
 wrote:
 
 Why is timedelta(70) one minute but the last command timedelta(Date(2014,3,7) 
 - Date(2013,2,4))  one year and one month? Should it not be more consistent 
 to have one minute and 10 seconds in the first case? Besides, 10 seconds in 
 one minute in percentage is more than 1 month in 1 year...
 michele
 
 On Thu, Oct 2, 2014 at 4:14 AM, Iain Dunning iaindunn...@gmail.com wrote:
 Hi all,
 
 Announcing...
 
 Humanize.jl
 https://github.com/IainNZ/Humanize.jl
 
 Humanize numbers, currently:
 * data sizes
 * Date/datetime differences
 
 This package is MIT licensed, and is based on jmoiron's humanize Python 
 library.
 
 Installation: Pkg.add("Humanize")
 
 Examples:
 
 julia> datasize(3000000)
 3.0 MB
 julia> datasize(3000000, style=:bin, format="%.3f")
 2.861 MiB
 julia> datasize(3000000, style=:gnu, format="%.3f")
 2.861M
 julia> timedelta(70)
 a minute
 julia> timedelta(0,0,0,23,50,50)
 23 hours
 julia> timedelta(DateTime(2014,2,3,12,11,10) - DateTime(2013,3,7,13,1,20))
 11 months
 julia> timedelta(Date(2014,3,7) - Date(2013,2,4))
 1 year, 1 month
 


[julia-users] Julia off-line documentation

2014-10-02 Thread Charles Novaes de Santana
Dear all,

I was wondering if there is any official Julia documentation that can be
accessed off-line.

What do you recommend to use if I want to read the docs without
connecting to internet. I looked for documents like these but I couldn't
find a good general one so far.

Thanks in advance for any comment!

Best,

Charles

-- 
Um axé! :)

--
Charles Novaes de Santana, PhD
http://www.imedea.uib-csic.es/~charles


Re: [julia-users] Julia off-line documentation

2014-10-02 Thread João Felipe Santos
You can build the docs as a PDF or as HTML and read them on your computer.
Check https://github.com/JuliaLang/julia/tree/release-0.3/doc

--
João Felipe Santos

On Thu, Oct 2, 2014 at 11:59 AM, Charles Novaes de Santana 
charles.sant...@gmail.com wrote:

 Dear all,

 I was wondering if there is any official Julia documentation that can be
 accessed off-line.

 What do you recommend to use if I want to read the docs without
 connecting to internet. I looked for documents like these but I couldn't
 find a good general one so far.

 Thanks in advance for any comment!

 Best,

 Charles

 --
 Um axé! :)

 --
 Charles Novaes de Santana, PhD
 http://www.imedea.uib-csic.es/~charles



Re: [julia-users] Julia off-line documentation

2014-10-02 Thread Jake Bolewski
Read the Docs lets you download the content as epub and zipped HTML 
(look at the lower right-hand corner for the documentation version popup).

On Thursday, October 2, 2014 12:01:22 PM UTC-4, João Felipe Santos wrote:

 You can build the docs as a PDF or as HTML and read them on your computer. 
 Check https://github.com/JuliaLang/julia/tree/release-0.3/doc

 --
 João Felipe Santos

 On Thu, Oct 2, 2014 at 11:59 AM, Charles Novaes de Santana 
 charles...@gmail.com javascript: wrote:

 Dear all,

 I was wondering if there is any official Julia documentation that can 
 be accessed off-line. 

 What do you recommend to use if I want to read the docs without 
 connecting to internet. I looked for documents like these but I couldn't 
 find a good general one so far.

 Thanks in advance for any comment!

 Best,

 Charles

 -- 
 Um axé! :)

 --
 Charles Novaes de Santana, PhD
 http://www.imedea.uib-csic.es/~charles
  



Re: [julia-users] Julia off-line documentation

2014-10-02 Thread Charles Novaes de Santana
Obrigado, João! Thank you, Jake! I didn't know these simple tips! :)

Just out of curiosity: Would it be possible to have a built-in
documentation for each function/package, similar to R, for example?
Something intrinsic to the installation of any package? Is there a
specific reason for the way the documentation is written today (in
readthedocs)?

Thanks again!

Best,

Charles

On Thu, Oct 2, 2014 at 6:06 PM, Jake Bolewski jakebolew...@gmail.com
wrote:

 Read the docs allows you to download the content in epub and zipped html
 (look at the lower right-hand corner for the documentation version popup).

 On Thursday, October 2, 2014 12:01:22 PM UTC-4, João Felipe Santos wrote:

 You can build the docs as a PDF or as HTML and read them on your
 computer. Check https://github.com/JuliaLang/julia/tree/release-0.3/doc

 --
 João Felipe Santos

 On Thu, Oct 2, 2014 at 11:59 AM, Charles Novaes de Santana 
 charles...@gmail.com wrote:

 Dear all,

 I was wondering if there is any official Julia documentation that can
 be accessed off-line.

 What do you recommend to use if I want to read the docs without
 connecting to internet. I looked for documents like these but I couldn't
 find a good general one so far.

 Thanks in advance for any comment!

 Best,

 Charles

 --
 Um axé! :)

 --
 Charles Novaes de Santana, PhD
 http://www.imedea.uib-csic.es/~charles





-- 
Um axé! :)

--
Charles Novaes de Santana, PhD
http://www.imedea.uib-csic.es/~charles


Re: [julia-users] Re: problem with a macro generating a type with inner constructor

2014-10-02 Thread Toivo Henningsson
Me as well.


[julia-users] New Images.jl - struggling

2014-10-02 Thread Andrew Gibb
I'm having difficulty getting to grips with the new image representations 
in the reworked Images.jl. I think I've got some of the basics down. But 
now I'm trying to visualise a spectrum, and I'm stumped.

What's the correct way to create an array/image of Float64s from an 
array/image of Gray{Ufixed8} (So that fft() works), and vice versa (so I 
can visualize the spectrum)?

Thanks.


Re: [julia-users] make testall failing

2014-10-02 Thread Elliot Saba
Hmmm.  Looks like there's a problem with pthreads? What happens if you type
addprocs(1)?
On Oct 2, 2014 8:30 AM, Kapil kapil6...@gmail.com wrote:

 The errors are the same as the ones I had sent in a text file two mails
 earlier. Still, I have attached the file again.

 Regards,
 Kapil Agarwal

 On Thu, Oct 2, 2014 at 11:26 AM, Stefan Karpinski 
 stefan.karpin...@gmail.com wrote:

 Please send error output as text rather than images.


 On Oct 2, 2014, at 11:18 AM, Kapil kapil6...@gmail.com wrote:

 I have attached the images of the error I get

 Regards,
 Kapil Agarwal

 On Thu, Oct 2, 2014 at 3:07 AM, Elliot Saba staticfl...@gmail.com
 wrote:

 What happens if you run `Base.runtests()` inside the REPL?
 -E

 On Wed, Oct 1, 2014 at 11:25 PM, Kapil kapil6...@gmail.com wrote:

 Yes, the julia interpreter starts with ./julia

 Regards,
 Kapil Agarwal

 On Thu, Oct 2, 2014 at 2:23 AM, Elliot Saba staticfl...@gmail.com
 wrote:

 Can you start julia normally?  E.g. can you run `./julia` and get a
 REPL?
 -E




 julia.png

 julia2.png





Re: [julia-users] make testall failing

2014-10-02 Thread Kapil
I get the following output

julia> addprocs(1)
1-element Array{Int64,1}:
 2


Regards,
Kapil Agarwal

On Thu, Oct 2, 2014 at 12:21 PM, Elliot Saba staticfl...@gmail.com wrote:

 Hmmm.  Looks like there's a problem with pthreads? What happens if you
 type addprocs(1)?
 On Oct 2, 2014 8:30 AM, Kapil kapil6...@gmail.com wrote:

 The errors are the same as the ones I had sent in a text file two mails
 earlier. Still, I have attached the file again.

 Regards,
 Kapil Agarwal

 On Thu, Oct 2, 2014 at 11:26 AM, Stefan Karpinski 
 stefan.karpin...@gmail.com wrote:

 Please send error output as text rather than images.


 On Oct 2, 2014, at 11:18 AM, Kapil kapil6...@gmail.com wrote:

 I have attached the images of the error I get

 Regards,
 Kapil Agarwal

 On Thu, Oct 2, 2014 at 3:07 AM, Elliot Saba staticfl...@gmail.com
 wrote:

 What happens if you run `Base.runtests()` inside the REPL?
 -E

 On Wed, Oct 1, 2014 at 11:25 PM, Kapil kapil6...@gmail.com wrote:

 Yes, the julia interpreter starts with ./julia

 Regards,
 Kapil Agarwal

 On Thu, Oct 2, 2014 at 2:23 AM, Elliot Saba staticfl...@gmail.com
 wrote:

 Can you start julia normally?  E.g. can you run `./julia` and get a
 REPL?
 -E




 julia.png

 julia2.png





[julia-users] LsqFit having trouble with a particular model

2014-10-02 Thread Helgi Freyr
Hello,

I have been trying Julia out today. In particular, I have been playing 
around with fitting some data with LsqFit.

However, it's not working as well as I would have hoped.

Here is the code and the two data files and the output figure are attached:

using LsqFit
using PyPlot

model(x, p) = p[1]./x + p[2]./x.^2

xdata = readdlm("gridx.dat")
ydata = readdlm("F1.dat")

fit = curve_fit(model, xdata[1:end], ydata[1:end], [0.5, 0.5])
println(fit.param)
errors = estimate_errors(fit, 0.95)
println(errors)


plot(xdata, ydata, color="blue")
xlim(0,120)
plot(xdata, model(xdata, fit.param), color="green")
savefig("lsq.png")


Playing around with the model itself does do something. For example, as 
this code is basically just the one from the readme file of LsqFit, I tried:

model(x, p) = p[1]*exp(-x.*p[2])

which is used there. It does fit somewhat better but not quite. I ran the 
same thing in Mathematica and obtained:

p[1]: 0.229263
p[2]: -0.0949777

which gets most of the curve right. Am I missing something in trying to fit 
this curve to the data or are there any particular tricks to use with 
curve_fit that I should know about?
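
One cross-check worth mentioning: this particular model is linear in the parameters, so it can be solved exactly with ordinary linear least squares, with no iteration and no starting guess. A sketch, assuming xdata and ydata are plain Float64 vectors:

```julia
# Because model(x, p) = p[1]/x + p[2]/x^2 is linear in p, a direct linear
# least-squares solve gives the exact optimum; useful as a sanity check
# against whatever curve_fit returns.
function fit_inverse_powers(xdata::Vector{Float64}, ydata::Vector{Float64})
    A = hcat(1 ./ xdata, 1 ./ xdata.^2)   # design matrix: one column per parameter
    return A \ ydata                       # QR-based least-squares solution
end
```

If curve_fit lands somewhere very different from this, bad starting values or a local minimum are the likely culprits.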

Best regards,
Helgi


F1.dat
Description: Binary data


gridx.dat
Description: Binary data


Re: [julia-users] New Images.jl - struggling

2014-10-02 Thread Tim Holy
Use `reinterpret` to strip the colorspace typing. So

imgf = reinterpret(Float64, float64(img))

will give you an Image of Float64s. `data(imgf)` then gives you the raw array.

--Tim
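
For anyone unfamiliar with `reinterpret`, here is the idea on plain Base types, no Images.jl required: the same memory is re-viewed under a new element type, with no copy. (In the Images case, `float64(img)` first converts pixel values to Float64-backed grays, and the `reinterpret` then drops the colorspace wrapper.)

```julia
# reinterpret re-types the underlying bytes without copying.
a = Float64[0.0, 1.0]
b = reinterpret(UInt64, a)
b[2]   # 0x3ff0000000000000, the IEEE-754 bit pattern of 1.0
```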


On Thursday, October 02, 2014 09:19:02 AM Andrew Gibb wrote:
 I'm having difficulty getting to grips with the new image representations
 in the reworked Images.jl. I think I've got some of the basics down. But
 now I'm trying to visualise a spectrum, and I'm stumped.
 
 What's the correct way to create an array/image of Float64s from an
 array/image of Gray{Ufixed8} (So that fft() works), and vice versa (so I
 can visualize the spectrum)?
 
 Thanks.



Re: [julia-users] New Images.jl - struggling

2014-10-02 Thread Tim Holy
But presumably best would be to make `fft(float64(img))` Just Work. Wouldn't be 
hard, but I have some deadlines I'm struggling to meet. Can you file an issue 
so I don't forget? (Or, feel free to submit a PR.)

--Tim

On Thursday, October 02, 2014 09:19:02 AM Andrew Gibb wrote:
 I'm having difficulty getting to grips with the new image representations
 in the reworked Images.jl. I think I've got some of the basics down. But
 now I'm trying to visualise a spectrum, and I'm stumped.
 
 What's the correct way to create an array/image of Float64s from an
 array/image of Gray{Ufixed8} (So that fft() works), and vice versa (so I
 can visualize the spectrum)?
 
 Thanks.



[julia-users] Re: plotting images in Julia Studio

2014-10-02 Thread a . p . whitey
This is fantastic!  But when I try to plot something using Winston's 
plot(x,y), no file is saved.

On Tuesday, November 19, 2013 12:32:00 PM UTC-7, Kees van Prooijen wrote:

 It is maybe not common knowledge that it is really easy to get some 
 interactive feedback from plotting packages in Julia Studio. A package like 
 Winston generates image files. These can be opened in an editor window and 
 the view can be split to have the code side by side with the image. With 
 each run of the program, the image is live updated to immediately see 
 effect of parameter changes.



Re: [julia-users] Julia off-line documentation

2014-10-02 Thread Steven G. Johnson


On Thursday, October 2, 2014 12:12:25 PM UTC-4, Charles Santana wrote:

 Just out of curiosity: Would it be possible to have a built-in 
 documentation for each function/package, similar to R, for example? 
 Something intrinsic to the installation of any package? Is there a 
 specific reason for the way the documentation is written today (in 
 readthedocs)? 
 http://www.imedea.uib-csic.es/~charles


This is being worked on for Julia 0.4.   
See https://github.com/JuliaLang/julia/issues/8514 
and https://github.com/JuliaLang/julia/issues/3988 ... the details have 
been debated for a while, but some consensus seems to be building.


[julia-users] Re: Error in adding PyPlot to system image

2014-10-02 Thread Steven G. Johnson
I don't know whether PyPlot in its current form can be added to the system 
image, because some runtime initialization occurs when you load the module. 
 Probably I should move that into an init function.

However, PyCall should work, because that does all of its libpython runtime 
detection stuff in a separate initialization function.  Have you tried 
adding just PyCall to the system image?  That should speed up PyPlot 
loading.


[julia-users] Re: LaTeX labels in Winston

2014-10-02 Thread Alex
Hi,

I think lasem (https://github.com/GNOME/lasem) could actually be useful 
until we have our own rendering engine. It uses Cairo+Pango as backend(s) 
and can actually deal with SVG, MathML and also latex input (via itex2mml). 
The dependencies are not so horrible (gobject, glib, gio, gdk-pixbuf, gdk, 
cairo, pangocairo, libxml, bison, flex), at least if one is using Gtk.jl 
anyways.

Accessing it from Julia was actually quite easy (see below). If there is 
some interest we could turn this into a package to explore this direction 
further ...

Best,

Alex.

https://lh5.googleusercontent.com/-NSrJmN9vOVI/VC2xoHE3jcI/AHM/0U2QXQMWIPM/s1600/Screen%2BShot%2B2014-10-02%2Bat%2B22.12.05.png
 
https://lh4.googleusercontent.com/-_BDpCbsd3wk/VC2xr4MX-xI/AHU/oSMmiI9CigM/s1600/Screen%2BShot%2B2014-10-02%2Bat%2B22.12.21.png





Re: [julia-users] Julia off-line documentation

2014-10-02 Thread Charles Novaes de Santana
Great to know it! Thank you, Steven! I will try to follow this discussion!
;)

Charles

On Thu, Oct 2, 2014 at 9:58 PM, Steven G. Johnson stevenj@gmail.com
wrote:



 On Thursday, October 2, 2014 12:12:25 PM UTC-4, Charles Santana wrote:

 Just out of curiosity: Would it be possible to have a built-in
 documentation for each function/package, similar to R, for example?
 Something intrinsic to the installation of any package? Is there a
 specific reason for the way the documentation is written today (in
 readthedocs)?
 http://www.imedea.uib-csic.es/~charles


 This is being worked on for Julia 0.4.   See
 https://github.com/JuliaLang/julia/issues/8514 and
 https://github.com/JuliaLang/julia/issues/3988 ... the details have been
 debated for a while, but some consensus seems to be building.




-- 
Um axé! :)

--
Charles Novaes de Santana, PhD
http://www.imedea.uib-csic.es/~charles


[julia-users] huge performance hit if for-loops in separate function.

2014-10-02 Thread Jürgen Bohnert
Hi everyone,

I'm new to using julia and I would be grateful if anyone could shed some 
light on the following performance-issue that I cannot explain.
For some reason wrapping 2 for-loops into a separate function and then 
calling that function is 2 times slower than executing the for-loops 
directly.

*My goal:*
Perform a time-consuming but easily parallelizable operation '*operator*' over 
a moderate/large parameter space on a SINGLE machine with multiple cores - 
put the loops into a function and call '*pmap*' on that.

function operator(x::Float64, y::Float64, extraparameters::Float64 ...)
# ... Does something
# returns Float64 scalar
end

The function '*operator*' works like a charm and fast.

*What I did:*
Execution over a parameter-space:  *x::Array{Float64, 1}, y::Array{Float64, 
1}*
via the function '*loops_simple*' (below)

function loops_simple(different_arguments ...)

# generate x::Array{Float64, 1} and y::Array{Float64, 1} and 
extraparameters
# preallocate output container
output = Array(Float64, (length(x), length(y)))

t = time()

for i=1:length(y)
for j=1:length(x)
output[j,i] = operator(x[j], y[i], extraparameters ...)
end
end

t = time() - t
println("time taken: $t")

return output
end

This runs really fast as expected. Trouble comes as soon as I stuff the 
loops into a separate function '*wrapper*' (for easier parallel execution 
but I didn't get that far.).

function wrapper(x::Array{Float64, 1}, y::Array{Float64, 1}, extraparameters
::Float64 ...)
output = Array(Float64, (length(x), length(y)))
for i=1:length(y)
for j=1:length(x)
output[j,i] = operator(x[j], y[i], extraparameters ...)
end
end
end


Now calling:

function loops_wrapped(different_arguments ...)
# generate x::Array{Float64, 1} and y::Array{Float64, 1} and 
extraparameters
t = time()
output = wrapper (x, y, extraparameters ...)
t = time() - t
println("time taken: $t")

return output
end

the process ends up taking twice the time of the '*loops_simple*' case. Can 
anyone explain what I'm doing wrong?

Thanks in advance.

I'm running julia Version 0.4.0-dev+584; Commit 114abf4; x86_64-linux-gnu


[julia-users] parallel profile--how?

2014-10-02 Thread Travis Porco
Hello--
Trying to find better ways to profile parallel code in Julia, I came 
across a suggestion to rebuild Julia and use VTune Amplifier from Intel 
(never heard of VTune), or to have each worker call a function to 
turn profiling on and off. 

When I run my code, there is a lot that seems to be going on: lines 
involving dict.jl and multi.jl and task.jl, and some or much of it does not 
seem to relate directly to a function call of mine. One could try to work out 
the answer decryption-style, removing various lines of code and seeing 
what changes. But this is essentially impossible for my 
application, since it is a stochastic simulation, and deleting lines 
changes the subsequent behavior even with the same random seed. I'm hoping 
for something better than voluminous println()'s !

The idea of mapping everything Julia is doing to some specific call may or 
may not even describe the way it works precisely. However, this sort of 
information
...
 5 ./multi.jl; RemoteValue; line: 590
  2 ./array.jl; fill!; line: 158
11 multi.jl; schedule_call; line: 636
 2 dict.jl; setindex!; line: 546
  2 ./tuple.jl; isequal; line: 69
...
while interesting, is not actionable. Nowhere in this particular subtree do 
my function calls get named (not shown). I know something, somewhere, is 
causing a tremendous bottleneck, despite having converted arrays to 
SharedArrays, removed references to global data that were hiding in default 
arguments, and so on...but I still can't tell where it is! (I don't have a 
small minimal postable version of the code.) 

Is there a way to have multiple profile output objects, and have the 
profile data rerouted into different ones as I go? It might at least 
provide some insight (I realize operation 1 might leave the system in a 
state where operation 2 might have extra work to do through no fault of its 
own, so it might not be simple.)
The dream would be:
Profile.init_bucket(1)
Profile.init_bucket(2)
@profile bucket=1 filter(data1,nsteps=24)
@profile bucket=2 filter(data2,nsteps=24)
etc. 
So then I could 
Profile.print(bucket=1) 
to see what went on in the first one, etc. 
I know the syntax doesn't work; I'm not asking for it to, but does anyone 
know a way to do this sort of thing? Save profile data and swap it in and 
out, for example? deepcopy(Profile) obviously fails! Ultimately I've got to 
connect profile output with lines of code, function calls, or data objects !

Thanks; I hate to post things like this but if there's an answer, somebody 
here will know it, and it might benefit somebody besides me.
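
Something close to the "buckets" idea is possible with `Profile.clear()` plus `Profile.retrieve()`, which snapshots the raw profile data so it can be printed later. A sketch (in 0.3 the module lives at `Base.Profile`; `busy` below is a stand-in workload, not the poster's simulation):

```julia
using Profile   # a stdlib in later Julia; Base.Profile on 0.3

function busy(n)              # stand-in workload
    s = 0.0
    for i in 1:n
        s += sin(i)
    end
    return s
end

buckets = Dict{Int,Any}()
for (k, n) in ((1, 10^6), (2, 10^6))
    Profile.clear()           # start each "bucket" from a clean slate
    @profile busy(n)
    buckets[k] = Profile.retrieve()   # (data, lidict) snapshot for later
end
# Later: Profile.print(buckets[1]...) to inspect bucket 1 on its own.
```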


Re: [julia-users] huge performance hit if for-loops in separate function.

2014-10-02 Thread Milan Bouchet-Valat
Le jeudi 02 octobre 2014 à 13:51 -0700, Jürgen Bohnert a écrit :
 Hi everyone,
 
 I'm new to using julia and I would be grateful if anyone could shed
 some light on the following performance-issue that I cannot explain.
 For some reason wrapping 2 for-loops into a separate function and then
 calling that function is 2 times slower than executing the for-loops
 directly.
 
 My goal:
 Perform timeconsuming but easily parallelizable operation 'operator'
 over moderate/large parameter-space on a SINGLE Machine with multiple
 cores. - Put loops into function and call 'pmap' on that.
 
 
 function operator(x::Float64, y::Float64,
 extraparameters::Float64 ...)
 # ... Does something
 # returns Float64 scalar
 end
 
 
 
 The function 'operator' works like a charm and fast.
 
 What I did:
 Execution over a parameter-space:  x::Array{Float64, 1},
 y::Array{Float64, 1}
 via the function 'loops_simple' (below)
 
 
 function loops_simple(different_arguments ...)
 
 # generate x::Array{Float64, 1} and y::Array{Float64, 1} and
 extraparameters
 # preallocate output container
 output = Array(Float64, (length(x), length(y)))
 
 t = time()
 
 for i=1:length(y)
 for j=1:length(x)
 output[j,i] = operator(x[j], y[i], extraparameters ...)
 end
 end
 
 t = time() - t
 println("time taken: $t")
 
 return output
 end
 
 
 
 This runs really fast as expected. Trouble comes as soon as I stuff
 the loops into a separate function 'wrapper' (for easier parallel
 execution but I didn't get that far.).
 
 
 function wrapper(x::Array{Float64, 1}, y::Array{Float64, 1},
 extraparameters::Float64 ...)
 output = Array(Float64, (length(x), length(y)))
 for i=1:length(y)
 for j=1:length(x)
 output[j,i] = operator (x[j], y[i], extraparameters ...)
 end
 end
 end
 
 
 
 
 Now calling:
 
 
 function loops_wrapped(different_arguments ...)
 # generate x::Array{Float64, 1} and y::Array{Float64, 1} and
 extraparameters
 t = time()
 output = wrapper (x, y, extraparameters ...)
 t = time() - t
 println("time taken: $t")
 
 return output
 end
 
 
 
 the process ends up taking twice the time of the 'loops_simple' case.
 Can anyone explain what I'm doing wrong?

I think you're going to get many replies if you manage to post a short
reproducible example somewhere, e.g. in a GitHub gist. It's hard to tell
what's happening without seeing a concrete piece of code in action.


Regards


 Thanks in advance.
 
 I'm running julia Version 0.4.0-dev+584; Commit 114abf4;
 x86_64-linux-gnu
 



Re: [julia-users] huge performance hit if for-loops in separate function.

2014-10-02 Thread Isaiah Norton
Since you are on -dev you could try the new @inline macro and see if that
helps.
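
A minimal sketch of the shape, with a placeholder `operator` body (not the original poster's; the allocation was spelled `Array(Float64, (m, n))` on 0.3/0.4):

```julia
# Placeholder operator, marked @inline so the call in the inner loop can be
# inlined into wrapper (an 0.4-dev feature at the time of this thread).
@inline operator(x::Float64, y::Float64) = x * y + x

function wrapper(xs::Vector{Float64}, ys::Vector{Float64})
    out = Array{Float64}(undef, length(xs), length(ys))
    for i in eachindex(ys), j in eachindex(xs)
        out[j, i] = operator(xs[j], ys[i])
    end
    return out   # note: the wrapper in the original post falls off the end without this
end
```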

On Thu, Oct 2, 2014 at 4:51 PM, Jürgen Bohnert jo.bro@gmail.com wrote:

 Hi everyone,

 I'm new to using julia and I would be grateful if anyone could shed some
 light on the following performance-issue that I cannot explain.
 For some reason wrapping 2 for-loops into a separate function and then
 calling that function is 2 times slower than executing the for-loops
 directly.

 *My goal:*
 Perform timeconsuming but easily parallelizable operation '*operator*'
 over moderate/large parameter-space on a SINGLE Machine with multiple
 cores. - Put loops into function and call '*pmap*' on that.

 function operator(x::Float64, y::Float64, extraparameters::Float64 ...)
 # ... Does something
 # returns Float64 scalar
 end

 The function '*operator*' works like a charm and fast.

 *What I did:*
 Execution over a parameter-space:  *x::Array{Float64, 1},
 y::Array{Float64, 1}*
 via the function '*loops_simple*' (below)

 function loops_simple(different_arguments ...)

 # generate x::Array{Float64, 1} and y::Array{Float64, 1} and
 extraparameters
 # preallocate output container
 output = Array(Float64, (length(x), length(y)))

 t = time()

 for i=1:length(y)
 for j=1:length(x)
 output[j,i] = operator(x[j], y[i], extraparameters ...)
 end
 end

 t = time() - t
 println("time taken: $t")

 return output
 end

 This runs really fast as expected. Trouble comes as soon as I stuff the
 loops into a separate function '*wrapper*' (for easier parallel execution
 but I didn't get that far.).

 function wrapper(x::Array{Float64, 1}, y::Array{Float64, 1},
 extraparameters::Float64 ...)
 output = Array(Float64, (length(x), length(y)))
 for i=1:length(y)
 for j=1:length(x)
 output[j,i] = operator (x[j], y[i], extraparameters ...)
 end
 end
 end


 Now calling:

 function loops_wrapped(different_arguments ...)
 # generate x::Array{Float64, 1} and y::Array{Float64, 1} and
 extraparameters
 t = time()
 output = wrapper (x, y, extraparameters ...)
 t = time() - t
 println("time taken: $t")

 return output
 end

 the process ends up taking twice the time of the '*loops_simple*' case.
 Can anyone explain what I'm doing wrong?

 Thanks in advance.

 I'm running julia Version 0.4.0-dev+584; Commit 114abf4; x86_64-linux-gnu



[julia-users] copy assignment operation on existing and valid destination arrays

2014-10-02 Thread Roy Wang
I often need a copy assignment type of operation to an existing 
destination array of the exact same element type and size. Let's only talk 
about arrays of concrete types, like a multi-dimensional array of floats. 
This is useful when I write optimization solvers, and I need to store 
vectors or matrices from the previous step. I usually pre-allocate a pair 
of arrays of the same type and size, *x* and *x_next*, then do:

*x_next = x;*
 
at the end of each iteration of my solver.

At first, I thought using *copy()* (shallow copy) on them is fine to make 
sure they are separate entities, since floating point numbers and integers 
are concrete types in Julia. While I verified this is true (at least on 
arrays of Float64s), I looked at (around line 202 at the time of this 
post), and *copy()* seems to call *copy!( similar(a), a)*. To my 
understanding, this allocates a new destination array, fills it with the 
corresponding values from the source array, then assigns the pointer of 
this new destination array to *x_next*, and the garbage collector removes 
the old array that *x_next* was pointing to. This is a lot of work when I 
just want to traverse through *x_next*, and assign it the corresponding 
values from* x*. Please correct me if my understanding is wrong!

This is a really common operation. I'd appreciate it if someone can advise 
me whether there is already an existing method for doing this (or a better 
solution) before I write my own.

Cheers,

Roy


[julia-users] Re: copy assignment operation on existing and valid destination arrays

2014-10-02 Thread Roy Wang

This kind of routine is what I'm talking about...

# copy assignment for vectors
function copyassignment!(a::Vector,b::Vector)
@assert length(a) == length(b)
for n=1:length(a)
a[n]=b[n];
end
end

My questions:
1) Is there a standard function that does this?
2) Is there a better way to do this so it'll handle any type of 
multi-dimensional array of integers and floats without performance penalty?


On Thursday, 2 October 2014 18:09:16 UTC-4, Roy Wang wrote:

 I often need a copy assignment type of operation to an existing 
 destination array of the exact same element type and size. Let's only talk 
 about arrays of concrete types, like a multi-dimensional array of floats. 
 This is useful when I write optimization solvers, and I need to store 
 vectors or matrices from the previous step. I usually pre-allocate a pair 
 of arrays of the same type and size, *x* and *x_next*, then do:

 *x_next = x;*
  
 at the end of each iteration of my solver.

 At first, I thought using *copy()* (shallow copy) on them is fine to make 
 sure they are separate entities, since floating point numbers and integers 
 are concrete types in Julia. While I verified this is true (at least on 
 arrays of Float64s), I looked at (around line 202 at the time of this 
 post), and *copy()* seems to call *copy!( similar(a), a)*. To my 
 understanding, this allocates a new destination array, fills it with the 
 corresponding values from the source array, then assigns the pointer of 
 this new destination array to *x_next*, and the garbage collector removes 
 the old array that *x_next* was pointing to. This is a lot of work when I 
 just want to traverse through *x_next*, and assign it the corresponding 
 values from* x*. Please correct me if my understanding is wrong!

 This is a really common operation. I'd appreciate it if someone can advise 
 me whether there is already an existing method for doing this (or a better 
 solution) before I write my own.

 Cheers,

 Roy



Re: [julia-users] Re: copy assignment operation on existing and valid destination arrays

2014-10-02 Thread John Myles White
Why not use copy!

  -- John
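
For example (spelled `copy!` on Julia 0.3; renamed `copyto!` in later versions):

```julia
# copy!/copyto! overwrites the destination in place: no new allocation and
# nothing for the garbage collector, which is the "copy assignment" asked for.
x      = [1.0, 2.0, 3.0]
x_next = zeros(3)
copyto!(x_next, x)   # copy!(x_next, x) on Julia 0.3
# x_next now holds the same values as x, but they remain two distinct arrays.
```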

On Oct 2, 2014, at 3:24 PM, Roy Wang roy.c.c.w...@gmail.com wrote:

 
 This kind of routine is what I'm talking about...
 
 # copy assignment for vectors
 function copyassignment!(a::Vector,b::Vector)
 @assert length(a) == length(b)
 for n=1:length(a)
 a[n]=b[n];
 end
 end
 
 My questions:
 1) Is there a standard function that does this?
 2) Is there a better way to do this so it'll handle any type of 
 multi-dimensional array of integers and floats without performance penalty?
 
 



[julia-users] Re: Error in adding PyPlot to system image

2014-10-02 Thread Mehul Tikekar
Yes, adding PyCall works. It improves PyPlot's load time from 13 sec to 
8 sec. Thanks!

On Thursday, October 2, 2014 4:02:19 PM UTC-4, Steven G. Johnson wrote:

 I don't know whether PyPlot in its current form can be added to the system 
 image, because some runtime initialization occurs when you load the module. 
  Probably I should move that into an init function.

 However, PyCall should work, because that does all of its libpython 
 runtime detection stuff in a separate initialization function.  Have you 
 tried adding just PyCall to the system image?  That should speed up PyPlot 
 loading.
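
 For context, the init-function pattern Steven alludes to looks roughly like
 the sketch below. MyWrapper and libfoo are illustrative names only, not
 PyCall internals, and on current Julia the Libdl stdlib must be loaded
 explicitly (it was part of Base in the 0.3 era):

 ```julia
 # Sketch of deferring runtime detection to module load time, so the module
 # can be compiled into the system image.  All names here are illustrative.
 module MyWrapper

 using Libdl

 # Only a placeholder is baked in at compile / sysimg-build time.
 const libhandle = Ref{Ptr{Cvoid}}(C_NULL)

 function __init__()
     # __init__ runs at every runtime load of the module, including when it
     # is part of the system image, so environment-dependent detection
     # belongs here rather than at module top level.
     libhandle[] = Libdl.dlopen_e("libfoo")   # dlopen_e: C_NULL on failure
 end

 end # module
 ```

 The key point is that nothing environment-specific is computed while the
 sysimg is being built; it all happens in __init__ on the user's machine.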



Re: [julia-users] Re: copy assignment operation on existing and valid destination arrays

2014-10-02 Thread Roy Wang
Hey John,

Ah geez, copy!() was only 2 lines lower than copy() in abstractarray.jl. 
Thanks!


On Thursday, 2 October 2014 18:25:22 UTC-4, John Myles White wrote:

 Why not use copy!

   -- John

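
For anyone finding this thread later: copy!(dest, src) is the in-place
counterpart of copy() and does exactly what the hand-written
copyassignment! loop does, without allocating a new array. A minimal sketch
of the solver pattern, written against current Julia syntax (the broadcast
.= and the copyto! spelling are newer than the 0.3-era Julia in this
thread, where the call was copy!(x, x_next)):

```julia
# Pre-allocate once; reuse the same buffers on every iteration.
x      = [1.0, 2.0, 3.0]
x_next = similar(x)              # same element type and size as x

for iter in 1:5
    x_next .= x .+ 0.1           # illustrative update step
    copyto!(x, x_next)           # in-place, element-by-element copy into x
end
```

After the loop, x holds the last iterate, and the copy itself allocated no
per-iteration arrays.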




Re: [julia-users] huge performance hit if for-loops in separate function.

2014-10-02 Thread Stefan Karpinski
No worries. We've all been there :-)


 On Oct 2, 2014, at 6:36 PM, Jürgen Bohnert jo.bro@gmail.com wrote:
 
 I'm terribly sorry. After checking another time I found a typo in my original 
 wrapper-function that was responsible. The performance was not a problem at 
 all but my wrapper-function was doing twice the work due to my own stupidity.
 
 Thank you for the replies anyway. People seem to be nice around here.
 Best,
 J.
 
 Am Donnerstag, 2. Oktober 2014 23:14:18 UTC+2 schrieb Milan Bouchet-Valat:
 
 Le jeudi 02 octobre 2014 à 13:51 -0700, Jürgen Bohnert a écrit :
 Hi everyone,
 
 I'm new to using julia and I would be grateful if anyone could shed some 
 light on the following performance-issue that I cannot explain.
 For some reason wrapping 2 for-loops into a separate function and then 
 calling that function is 2 times slower than executing the for-loops 
 .. 
 the process ends up taking twice the time of the 'loops_simple' case. Can 
 anyone explain what I'm doing wrong?
 I think you're going to get many replies if you manage to post a short 
 reproducible example somewhere, e.g. in a GitHub gist. It's hard to tell 
 what's happening without seeing a concrete piece of code in action.
 
 
 Regards
 
 Thanks in advance.
 
 I'm running julia Version 0.4.0-dev+584; Commit 114abf4; x86_64-linux-gnu
 
 


Re: [julia-users] Missing rand(distribution, generator)

2014-10-02 Thread Andrew Dolgert
Darn, now it's on me. I've read the codebase, and could add the feature 
with a little work. It's just a method on rand(), coupled with pulling 
code, such as ziggurat, out of Base.

Thanks,
Drew

On Wednesday, October 1, 2014 11:39:16 PM UTC-4, John Myles White wrote:

 Hi Andrew,

 It sounds like you've got a lot of interesting ideas for improving 
 Distributions.jl. Please read through the existing codebase when you've got 
 some time and submit pull requests for any functionality you'd like to see 
 changed.

 In regard to your main question, I don't believe we support special RNGs 
 in Distributions.

  -- John

 On Oct 1, 2014, at 8:32 PM, Andrew Dolgert adol...@gmail.com 
 javascript: wrote:

 It doesn't seem possible to use an explicit random number generator to 
 sample a distribution:

 rng=MersenneTwister(seed)
 rand(Distributions.Exponential(scale), rng)

 Did I miss a way to do this?

 I want to use an explicit generator because
  - I can serialize it and pick up where I left off with the next run
  - I can use different generators in different parts of the program
  - It's good hygiene for stochastic simulations to know when rand is used.

 Using quantile(distribution, rand(rng)) isn't great because it doesn't use 
 the accepted sampling algorithms. For instance, the ziggurat algorithm for 
 exponentials is far better than inverting the cdf.

 Thanks,
 Drew
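
 (For readers who need this today: until such a method exists in
 Distributions, one can define the fallback by hand. The sketch below uses
 the cdf-inversion approach Drew mentions, not the library's specialized
 samplers, and sample_with is a hypothetical name. The rand(rng, d)
 argument order shown in its definition is what Distributions.jl later
 adopted; on current Julia, Random must be loaded for MersenneTwister.)

 ```julia
 using Distributions, Random

 # Stopgap: draw from a distribution via an explicit RNG by inverting the
 # cdf.  This is NOT the specialized sampler (e.g. ziggurat for Exponential),
 # so sample quality/speed tradeoffs differ from plain rand(d).
 sample_with(rng::AbstractRNG, d::UnivariateDistribution) = quantile(d, rand(rng))

 rng = MersenneTwister(1234)
 x = sample_with(rng, Exponential(2.0))   # reproducible given the seed
 ```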




Re: [julia-users] Missing rand(distribution, generator)

2014-10-02 Thread Stefan Karpinski
This is how we get you ;-)


 On Oct 2, 2014, at 6:58 PM, Andrew Dolgert adolg...@gmail.com wrote:
 
 Darn, now it's on me. I've read the codebase, and could add the feature with 
 a little work. It's just a method on rand(), coupled with pulling code, such 
 as ziggurat, out of Base.
 
 Thanks,
 Drew
 
 


[julia-users] Re: ANN: QuantEcon.jl

2014-10-02 Thread Chase Coleman
Just wanted to bump this because the site is now live.

On Thursday, September 18, 2014 10:14:22 PM UTC-4, Spencer Lyon wrote:

 New package QuantEcon.jl https://github.com/QuantEcon/QuantEcon.jl.

 This package collects code for quantitative economic modeling. It is 
 currently comprised of two main parts:

 1. A toolbox of routines useful when doing economics
 2. Implementations of types and solution methods for common economic models.
 
 This library has a python twin: QuantEcon.py 
 https://github.com/QuantEcon/QuantEcon.py. The same development team is 
 working on both projects, so we hope to keep the two libraries in sync very 
 closely as new functionality is added.

  The library contains all the code necessary to do the computations found 
  on http://quant-econ.net/, a website dedicated to providing lectures that 
  teach economics and programming. The website currently (as of 9/18/14) has 
  only a Python version, but the Julia version is in late stages of 
  refinement and should be live very soon (hopefully within a week).

 The initial version of the website will feature 6 lectures dedicated to 
 helping a new user set up a working Julia environment and learn the basics 
 of the language. In addition to this language specific section, the website 
 will include 22 other lectures on topics including

- statistics: Markov processes (continuous and discrete state), 
auto-regressive processes, the Kalman filter, covariance stationary 
processes, etc. 
- economic models: the income fluctuation problem, an asset pricing 
model, the classic optimal growth model, optimal (Ramsey) taxation, the 
McCall search model 
- dynamic programming: shortest path, as well as recursive solutions 
to economic models 

  All the lectures have code examples in Julia and most of the 22 will 
  display code from the QuantEcon.jl library.



Re: [julia-users] parallel profile--how?

2014-10-02 Thread Tim Holy
This might not be hard at all, but I don't have time to look into it now. So 
to help you investigate, here are some principles:
- each backtrace is a vector of Uints; in profiling data, backtraces are 
separated by NULLs in one gigantic list.
- the Uints correspond to memory locations that are presumably specific to each 
worker
- Profile.retrieve() fetches the list _and_ a lookup dictionary that converts 
memory locations into something that's actually meaningful.

So turn on profiling in each worker, and then ask each worker to call 
Profile.retrieve(), and finally assemble the combined information.

profile.jl is not a huge file; while the printing logic is a little 
complicated, 
the instructions to turn profiling on and off and fetch data are pretty 
trivial. 
If you're writing parallel code, you should have no trouble understanding how 
it works. Particularly, a flat report that combines data across workers 
should be pretty easy.

Good luck!
--Tim


On Thursday, October 02, 2014 01:58:31 PM Travis Porco wrote:
 Hello--
 Trying to find out better ways to profile parallel code in Julia, I came
 across a suggestion to rebuild Julia and use Vtune amplifier from Intel
 (never heard of Vtune), or to somehow have each worker call a function to
 turn profiling on and off somehow.
 
 When I run my code, there is a lot that seems to be going on: lines
 involving dict.jl and multi.jl and task.jl, and some or much of it does not
 seem to relate directly to a function call of mine. One could try to guess
 the answer decryption-style by removing various lines of code and trying
 to guess what changes. But this is essentially impossible for my
 application, since it is a stochastic simulation, and deleting lines
 changes the subsequent behavior even with the same random seed. I'm hoping
 for something better than voluminous println()s!
 
 The idea of mapping everything Julia is doing to some specific call may or
 may not even describe the way it works precisely. However, this sort of
 information
 ...
  5 ./multi.jl; RemoteValue; line: 590
   2 ./array.jl; fill!; line: 158
 11 multi.jl; schedule_call; line: 636
  2 dict.jl; setindex!; line: 546
   2 ./tuple.jl; isequal; line: 69
 ...
 while interesting, is not actionable. Nowhere in this particular subtree do
 my function calls get named (not shown). I know something, somewhere, is
 causing a tremendous bottleneck, despite having converted arrays to
 SharedArrays, removed references to global data that were hiding in default
 arguments, and so on...but I still can't tell where it is! (I don't have a
 small minimal postable version of the code.)
 
 Is there a way to have multiple profile output objects, and have the
 profile data rerouted into different ones as I go? It might at least
 provide some insight (I realize operation 1 might leave the system in a
 state where operation 2 might have extra work to do through no fault of its
 own, so it might not be simple.)
 The dream would be:
 Profile.init_bucket(1)
 Profile.init_bucket(2)
 @profile bucket=1 filter(data1,nsteps=24)
 @profile bucket=2 filter(data2,nsteps=24)
 etc.
 So then I could
 Profile.print(bucket=1)
 to see what went on in the first one, etc.
 I know the syntax doesn't work; I'm not asking for it to, but does anyone
 know a way to do this sort of thing? Save profile data and swap it in and
 out, for example? deepcopy(Profile) obviously fails! Ultimately I've got to
 connect profile output with lines of code, function calls, or data objects!
 
 Thanks; I hate to post things like this but if there's an answer, somebody
 here will know it, and it might benefit somebody besides me.