Re: [julia-users] Levi-Civita symbol/tensor

2015-02-11 Thread Pablo Zubieta
Done!

https://github.com/JuliaLang/julia/issues/10172


[julia-users] Re: statistical accumulator

2015-02-11 Thread Iain Dunning
JMW just released StreamStats.jl:
https://github.com/johnmyleswhite/StreamStats.jl

Which is what you want I think?

Cheers,
Iain

On Wednesday, February 11, 2015 at 10:53:10 PM UTC-5, Christian Peel wrote:
>
> I'm curious if someone has implemented a statistical accumulator in julia 
> similar to that in boost:
> http://www.boost.org/doc/libs/1_55_0/doc/html/accumulators.html
>
> I'm aware of the accumulator in DataStructures.jl, but if I read it right 
> it doesn't do statistical accumulation, just a running sum or a running 
> histogram. Looking at accumulator.jl (
> https://github.com/JuliaLang/DataStructures.jl/blob/master/src/accumulator.jl 
> ) I see a "+" symbol at the end
> push!{T,V<:Number}(ct::Accumulator{T,V}, x::T, a::V) = (ct.map[x] = 
> ct[x] + a)
> I'm looking for code that can (for example) calculate the variance on the 
> fly using only the second moment and mean as illustrated in eq 1.21 of this 
> page from the boost docs:  
> http://www.boost.org/doc/libs/1_55_0/doc/html/boost/accumulators/impl/lazy_variance_impl.html
> I've done this before in Matlab, just don't want to repeat it in Julia if 
> I don't need to.
>
> Thanks!
>


[julia-users] can a julia script do multiple dispatch itself?

2015-02-11 Thread Christian Peel
In Julia, if I have multiple functions with the same name but different 
arguments, the core of the language takes care of calling the right 
function. 

Let's say I have a cell which contains some functions which are related, 
but each take slightly different arguments.  I'd like to call each of these 
functions with the appropriate arguments.  Is it possible in Julia to do my 
own 'multiple dispatch' similar to that which the core of the language 
does?  I.e. can I somehow check the arguments of each function in the cell, 
and call each with the appropriate arguments? I guess there is a simple 
answer to this, but for whatever reason I haven't found it.

Thanks!
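
A minimal sketch (not from the thread) of one way to do this by hand: Base's applicable() reports whether a function has a method matching a given argument tuple, so you can filter a collection of related functions against candidate arguments. The functions f and g below are hypothetical.

f(x::Int)    = x + 1
g(s::String) = "hello " * s

funcs = Any[f, g]                  # the "cell" of related functions
args  = Any[(3,), ("world",)]      # candidate argument tuples

for fn in funcs, a in args
    if applicable(fn, a...)        # true when some method of fn matches these arguments
        println(fn(a...))
    end
end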


[julia-users] ANN: GridInterpolations.jl

2015-02-11 Thread Mykel Kochenderfer
We posted a draft of a new package that performs multivariate interpolation 
on a rectilinear grid. At the moment, it provides implementations of 
multilinear and simplex interpolation.

https://github.com/sisl/GridInterpolations.jl

We have not registered the package with METADATA.jl (hoping to get an 
initial round of feedback), so you'll have to clone it manually for now as 
mentioned in the README to play with it. There are other interpolation 
packages out there (e.g., Grid.jl), but this package focuses on 
multidimensional grids with non-uniform cutpoints along each dimension. It 
also supports simplex interpolation (I don't think any of the other Julia 
implementations do this).
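
For readers new to the idea, here is a minimal one-dimensional sketch of linear interpolation on non-uniform cutpoints (a generic illustration of the concept, not the package's API; cuts and vals are made-up data):

# Piecewise-linear interpolation on non-uniform cutpoints
function interp1(cuts::Vector{Float64}, vals::Vector{Float64}, x::Float64)
    x <= cuts[1]   && return vals[1]
    x >= cuts[end] && return vals[end]
    i = searchsortedlast(cuts, x)               # cutpoint at or below x
    t = (x - cuts[i]) / (cuts[i+1] - cuts[i])   # fractional position in the cell
    (1 - t) * vals[i] + t * vals[i+1]
end

interp1([0.0, 1.0, 4.0, 10.0], [0.0, 2.0, 3.0, 5.0], 2.5)   # 2.5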


[julia-users] statistical accumulator

2015-02-11 Thread Christian Peel
I'm curious if someone has implemented a statistical accumulator in julia 
similar to that in boost:
http://www.boost.org/doc/libs/1_55_0/doc/html/accumulators.html

I'm aware of the accumulator in DataStructures.jl, but if I read it right 
it doesn't do statistical accumulation, just a running sum or a running 
histogram. Looking at accumulator.jl 
(https://github.com/JuliaLang/DataStructures.jl/blob/master/src/accumulator.jl 
) I see a "+" symbol at the end
push!{T,V<:Number}(ct::Accumulator{T,V}, x::T, a::V) = (ct.map[x] = 
ct[x] + a)
I'm looking for code that can (for example) calculate the variance on the 
fly using only the second moment and mean as illustrated in eq 1.21 of this 
page from the boost docs:  
http://www.boost.org/doc/libs/1_55_0/doc/html/boost/accumulators/impl/lazy_variance_impl.html
I've done this before in Matlab, just don't want to repeat it in Julia if I 
don't need to.

Thanks!
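
For reference, a minimal sketch (not from the thread) of this kind of on-the-fly accumulation, using Welford's online update for the mean and variance rather than the raw-moment formula from the Boost docs; the type and field names are made up:

import Base: push!

type OnlineVariance
    n::Int
    mean::Float64
    m2::Float64        # running sum of squared deviations from the mean
end
OnlineVariance() = OnlineVariance(0, 0.0, 0.0)

function push!(acc::OnlineVariance, x::Real)
    acc.n += 1
    delta = x - acc.mean
    acc.mean += delta / acc.n
    acc.m2 += delta * (x - acc.mean)
    acc
end

variance(acc::OnlineVariance) = acc.m2 / (acc.n - 1)

acc = OnlineVariance()
for x in randn(1000)
    push!(acc, x)
end
variance(acc)    # close to 1.0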


Re: [julia-users] Re: Fortran subroutine and Julia

2015-02-11 Thread DP
OK, it does work... thanks, everyone.


On Thursday, February 12, 2015 at 8:01:46 AM UTC+5:30, Dominique Orban 
wrote:
>
> Oh I see, sorry. Yes, this works just as well for me:
>
>  
> a = 100.0
> b = 10.0
> c = Cdouble[1.0]
> ppmm = ccall((:multiply_, "fsbrtn"), Void,
>  (Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
> println(c[1])
>
>
>
> On Wednesday, February 11, 2015 at 6:52:55 PM UTC-5, Stefan Karpinski 
> wrote:
>>
>> Right, that should certainly work, but having a and b as scalars and 
>> using the & syntax should also.
>>
>> On Wed, Feb 11, 2015 at 6:50 PM, Dominique Orban  
>> wrote:
>>
>>> This works for me:
>>>
>>>  
>>> a = Cdouble[100.0]
>>> b = Cdouble[10.0]
>>> c = Cdouble[1.0]
>>> ppmm = ccall((:multiply_, "fsbrtn"), Void,
>>>  (Ptr{Float64},Ptr{Float64},Ptr{Float64}),a,b,c)
>>> println(c[1])
>>>
>>>
>>>
>>>
>>> On Wednesday, February 11, 2015 at 12:45:57 PM UTC-5, DP wrote:

 Trying to work with subroutines (I am a MATLAB person without fortran 
 and julia knowledge)

 File Name : fsbrtn.f90

 SUBROUTINE MULTIPLY(A,B,C) 
 DOUBLE PRECISION A,B,C 
 C = A*B 
 RETURN 
 END

 gfortran -shared -O2 fsbrtn.f90 -fPIC -o fsbrtn.so

 a = 100.0
 b = 10.0
 c = 1.0
 ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
 println(c)

 Output
 1.0

 Where am I going wrong?
 ​

 ergerg

>>>
>>

Re: [julia-users] Levi-Civita symbol/tensor

2015-02-11 Thread Blake Johnson
Thanks for posting these, Pablo. For my most frequent use case I care about 
n = 3, but I suppose the O(n) algorithms would be more appropriate in Base.

You are also correct that sign(::AbstractVector) currently does an 
element-wise sign(). I didn't realize before writing my post that 
combinatorics.jl defines Permutations and Combinations types, but not the 
singular equivalents. So, that still leaves the naming issue unresolved.

Pablo, would you mind opening an issue or pull request to continue the 
discussion?

--Blake

On Wednesday, February 11, 2015 at 3:11:27 PM UTC-5, Pablo Zubieta wrote:
>
> Hi again,
>
> There were some bugs in my implementations. I updated the gist 
>  
> with the corrected versions and added a simpler looking function (but of 
> O(n²) running time).
>
> I did some tests and found (with my slow processor) that for permutations 
> of length <= 5 the quadratic implementation (levicivita_simple) performs 
> as fast as the (levicivita_inplace_check). For lengths from 5 to 15, 
> levicivita_inplace_check 
> is the fastest, followed by levicivita_simple. For lengths from 15 to 25 
> levicivita_simple 
> and levicivita perform the same (but slower than levicivita_inplace_check). 
> For more than 25 elements levicivita_inplace_check is always the fastest, 
> 2x faster than levicivita and n times faster than levicivita_simple.
>
> For people wanting the 3D Levi-Civita tensor, levicivita_simple and 
> levicivita_inplace_check 
> should be the same. For people wanting the parity of a permutation for long 
> permutations levicivita_inplace_check should work the best.
>
> Greetings!
>


Re: [julia-users] Re: Fortran subroutine and Julia

2015-02-11 Thread Dominique Orban
Oh I see, sorry. Yes, this works just as well for me:

 
a = 100.0
b = 10.0
c = Cdouble[1.0]
ppmm = ccall((:multiply_, "fsbrtn"), Void,
 (Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
println(c[1])



On Wednesday, February 11, 2015 at 6:52:55 PM UTC-5, Stefan Karpinski wrote:
>
> Right, that should certainly work, but having a and b as scalars and using 
> the & syntax should also.
>
> On Wed, Feb 11, 2015 at 6:50 PM, Dominique Orban  > wrote:
>
>> This works for me:
>>
>>  
>> a = Cdouble[100.0]
>> b = Cdouble[10.0]
>> c = Cdouble[1.0]
>> ppmm = ccall((:multiply_, "fsbrtn"), Void,
>>  (Ptr{Float64},Ptr{Float64},Ptr{Float64}),a,b,c)
>> println(c[1])
>>
>>
>>
>>
>> On Wednesday, February 11, 2015 at 12:45:57 PM UTC-5, DP wrote:
>>>
>>> Trying to work with subroutines (I am a MATLAB person without fortran 
>>> and julia knowledge)
>>>
>>> File Name : fsbrtn.f90
>>>
>>> SUBROUTINE MULTIPLY(A,B,C) 
>>> DOUBLE PRECISION A,B,C 
>>> C = A*B 
>>> RETURN 
>>> END
>>>
>>> gfortran -shared -O2 fsbrtn.f90 -fPIC -o fsbrtn.so
>>>
>>> a = 100.0
>>> b = 10.0
>>> c = 1.0
>>> ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
>>>Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
>>> println(c)
>>>
>>> Output
>>> 1.0
>>>
>>> Where am I going wrong?
>>> ​
>>>
>>> ergerg
>>>
>>
>

[julia-users] Re: change size of individual plot in gadfly under IJulia

2015-02-11 Thread Daniel Jones

Hi Andrei,

You can do this using the draw function, like:

draw(SVG(20cm, 10cm), plot(...))
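
For example (made-up data, per Daniel's suggestion above):

using Gadfly
p = plot(x=1:10, y=rand(10), Geom.line)
draw(SVG(20cm, 10cm), p)    # renders just this plot at 20cm x 10cm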


On Wednesday, February 11, 2015 at 3:42:47 PM UTC-8, Andrei Berceanu wrote:
>
> set_default_plot_size changes the default size of all following plots, but 
> how can i set the size of a certain plot individually?
>
> //A
>


[julia-users] Reading from win registry

2015-02-11 Thread David Anthoff
Does anyone know about a package that allows reading from the windows
registry?

 

Thanks,

David

 

--

David Anthoff

University of California, Berkeley

 

http://www.david-anthoff.com

 



[julia-users] Re: benchmark with Julia 2x faster than C

2015-02-11 Thread Miles Lubin
Hi Arch, all,

Thanks for looking into this, it's amazing to have experts here who 
understand the depths of compilers. I'm stubbornly having difficulty 
reproducing your timings, even though I see the same assembly generated for 
clang. I've tried on an i5-3320M and on an E5-2650 and on both, Julia is 
faster. How were you measuring the times in nanoseconds?

Miles

On Thursday, January 29, 2015 at 12:20:46 PM UTC-5, Arch Robison wrote:
>
> I can't replicate the 2x difference.  The C is faster for me.  But I 
> have gcc 4.8.2, not gcc 4.9.1.  Nonetheless, the experiment points out 
> where Julia is missing a loop optimization that Clang and gcc get.  Here is 
> a summary of combinations that I tried on an i7-4770 @ 3.40 GHz.
>
>- Julia 0.3.5: *70*  nsec.  Inner loop is:
>
> L82:    vmulsd   XMM3, XMM1, XMM2
>         vmulsd   XMM4, XMM1, XMM1
>         vsubsd   XMM4, XMM4, XMM0
>         vdivsd   XMM3, XMM4, XMM3
>         vaddsd   XMM1, XMM1, XMM3
>         vmulsd   XMM3, XMM1, XMM1
>         vsubsd   XMM3, XMM3, XMM0
>         vmovq    RDX, XMM3
>         and      RDX, RAX
>         vmovq    XMM3, RDX
>         vucomisd XMM3, QWORD PTR [RCX]
>         ja       L82
>
>
>- Julia trunk from around Jan 19 + LLVM 3.5: *61 *nsec.  Inner loop is:
>
> L80:    vmulsd   xmm4, xmm1, xmm1
>         vsubsd   xmm4, xmm4, xmm0
>         vmulsd   xmm5, xmm1, xmm2
>         vdivsd   xmm4, xmm4, xmm5
>         vaddsd   xmm1, xmm1, xmm4
>         vmulsd   xmm4, xmm1, xmm1
>         vsubsd   xmm4, xmm4, xmm0
>         vandpd   xmm4, xmm4, xmm3
>         vucomisd xmm4, qword ptr [rax]
>         ja       L80
>
>  
>
> The abs is done more efficiently than for Julia 0.3.5 because of PR #8364.
>  LLVM missed a CSE opportunity here because of loop rotation: the last 
> vmulsd of each iteration computes the same thing as the first vmulsd of the 
> next iteration.  
>
>
>-  C code compiled with gcc 4.8.2, using gcc -O2 -std=c99 
>-march=native -mno-fma: *46 *nsec
>
> .L11:
>         vaddsd   %xmm1, %xmm1, %xmm3
>         vdivsd   %xmm3, %xmm2, %xmm2
>         vsubsd   %xmm2, %xmm1, %xmm1
>         vmulsd   %xmm1, %xmm1, %xmm2
>         vsubsd   %xmm0, %xmm2, %xmm2
>         vmovapd  %xmm2, %xmm3
>         vandpd   %xmm5, %xmm3, %xmm3
>         vucomisd %xmm4, %xmm3
>         ja       .L11
>
>
> Multiply by 2 (5 clock latency) has been replaced by add-to-self (3 clock 
> latency).  It picked up the CSE opportunity.  Only 1 vmulsd per iteration!
>
>
>- C code compiled with clang 3.5.0, using clang -O2 -march=native: *46 
>*nsec
>
> .LBB1_3:                        # %while.body
>                                 # =>This Inner Loop Header: Depth=1
>         vmulsd   %xmm3, %xmm1, %xmm5
>         vdivsd   %xmm5, %xmm2, %xmm2
>         vaddsd   %xmm2, %xmm1, %xmm1
>         vmulsd   %xmm1, %xmm1, %xmm2
>         vsubsd   %xmm0, %xmm2, %xmm2
>         vandpd   %xmm4, %xmm2, %xmm5
>         vucomisd .LCPI1_1(%rip), %xmm5
>         ja       .LBB1_3
>
>
> Clang picks up the CSE opportunity but misses the add-to-self opportunity 
> (xmm3=-2.0).   It's also using LLVM.  
> We should check why Julia is missing the CSE opportunity.  Maybe Clang is 
> running a pass that handles CSE for a rotated loop?  Though looking at the 
> Julia pass list, it looks like CSE runs before loop rotation.  Needs more 
> investigation.
>
>
> - Arch 
>  
>


[julia-users] Re: Quantitative Economics with Julia (great PDF doc)

2015-02-11 Thread Ken B
Thank you for sharing, very nice!

On Wednesday, 11 February 2015 08:01:28 UTC+1, Arch Call wrote:
>
> Quantitative Economics with Julia 
> 
>
> Check out the PDF file in the link above.  It has 396 pages of excellent 
> documentation in the use of Julia in economics.
> The file is dated Feb 10, 2015 and is developed by Thomas Sargent & John 
> Stachurski.
>
> The web site is:  quant-econ.net
>
> Note:  Sometimes Chrome chokes on opening or downloading the file.  I had 
> better luck with IE.
>
> ...Archie
>
>
>

[julia-users] Re: Local scope for function storage

2015-02-11 Thread Peter Simon
I may be completely missing the point, but how about unused optional 
arguments(s) to store the constant(s):

julia> test9(x, aconst = 1.0) = x + aconst
test9 (generic function with 2 methods)


julia> @code_native test9(4.2)
.text
Filename: none
Source line: 1
pushRBP
mov RBP, RSP
movabs  RAX, 481367024
Source line: 1
addsd   XMM0, QWORD PTR [RAX]
pop RBP
ret






On Wednesday, February 11, 2015 at 2:27:54 PM UTC-8, Simon Danisch wrote:
>
> Ah yeah, sure...
> And also my all time favorite staged functions could be misused:
> stagedfunction test5() 
>quote 
> a = $(a = rand() )
> a + a
> end
>end
> But I'm actually searching for answers, why it is like this and what 
> speaks against a nice solution for such a common use case.
> My code using this is absolutely not performance critical ;)
>
> @Mauro I am aware of the scope, and I agree with you. I didn't open an 
> issue, as I wanted to discuss this first.
>
> On Wednesday, February 11, 2015 at 17:20:02 UTC+1, Simon Danisch wrote:
>>
>> Hi, 
>> I was trying to find out, what the best way is to have some local, 
>> constant storage for a function, which is only accessible from inside the 
>> function
>> So something like this:
>> begin
>> const local a = []
>> test() = dosomething(a)
>> end
>>
>> I was quite surprised, that you can't do this efficiently.
>> Or at least I didn't find a way, or tested it incorrectly.
>> Here are the different combinations I tried and the emitted native code:
>>
>> https://gist.github.com/SimonDanisch/e4bed0a16bdd847a8c2b#file-local_function_storage-jl
>>
>> test11 && test12 seems to be what julia does internally for test1-test4 
>> (and the example here)
>> It's especially odd, as a global const seems to be faster than a local 
>> const, even though that the local version is more restricted.
>> Is this because there has been more time spend on making globals fast?
>>
>> Best,
>> Simon
>>
>

Re: [julia-users] Re: Fortran subroutine and Julia

2015-02-11 Thread Stefan Karpinski
Right, that should certainly work, but having a and b as scalars and using
the & syntax should also.

On Wed, Feb 11, 2015 at 6:50 PM, Dominique Orban 
wrote:

> This works for me:
>
>
> a = Cdouble[100.0]
> b = Cdouble[10.0]
> c = Cdouble[1.0]
> ppmm = ccall((:multiply_, "fsbrtn"), Void,
>  (Ptr{Float64},Ptr{Float64},Ptr{Float64}),a,b,c)
> println(c[1])
>
>
>
>
> On Wednesday, February 11, 2015 at 12:45:57 PM UTC-5, DP wrote:
>>
>> Trying to work with subroutines (I am a MATLAB person without fortran and
>> julia knowledge)
>>
>> File Name : fsbrtn.f90
>>
>> SUBROUTINE MULTIPLY(A,B,C)
>> DOUBLE PRECISION A,B,C
>> C = A*B
>> RETURN
>> END
>>
>> gfortran -shared -O2 fsbrtn.f90 -fPIC -o fsbrtn.so
>>
>> a = 100.0
>> b = 10.0
>> c = 1.0
>> ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
>>Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
>> println(c)
>>
>> Output
>> 1.0
>>
>> Where am I going wrong?
>> ​
>>
>> ergerg
>>
>


[julia-users] Re: Fortran subroutine and Julia

2015-02-11 Thread Dominique Orban
This works for me:

 
a = Cdouble[100.0]
b = Cdouble[10.0]
c = Cdouble[1.0]
ppmm = ccall((:multiply_, "fsbrtn"), Void,
 (Ptr{Float64},Ptr{Float64},Ptr{Float64}),a,b,c)
println(c[1])




On Wednesday, February 11, 2015 at 12:45:57 PM UTC-5, DP wrote:
>
> Trying to work with subroutines (I am a MATLAB person without fortran and 
> julia knowledge)
>
> File Name : fsbrtn.f90
>
> SUBROUTINE MULTIPLY(A,B,C) 
> DOUBLE PRECISION A,B,C 
> C = A*B 
> RETURN 
> END
>
> gfortran -shared -O2 fsbrtn.f90 -fPIC -o fsbrtn.so
>
> a = 100.0
> b = 10.0
> c = 1.0
> ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
>Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
> println(c)
>
> Output
> 1.0
>
> Where am I going wrong?
> ​
>
> ergerg
>


[julia-users] change size of individual plot in gadfly under IJulia

2015-02-11 Thread Andrei Berceanu
set_default_plot_size changes the default size of all following plots, but 
how can I set the size of one particular plot individually?

//A


[julia-users] ODBC (Windows 8)

2015-02-11 Thread Nolan Bradshaw
Hey guys,
Here is my brief code. I figured I'd give ODBC a try since I'm attempting to 
connect to an MS SQL Server; I have blanked out sensitive information on purpose.

import ODBC

band_query = "select band_lowerFreq, band_upperFreq from Band;"

co = ODBC.connect("productionclient", usr="", pwd="")
ODBC.query(band_query,co;);
ODBC.disconnect(co);


On my query I receive:
invalid UTF-8 sequence
in convert at utf8.jl:162 (repeats 2 times)


[julia-users] Re: Local scope for function storage

2015-02-11 Thread Simon Danisch
Ah yeah, sure...
And also my all time favorite staged functions could be misused:
stagedfunction test5() 
   quote 
a = $(a = rand() )
a + a
end
   end
But I'm actually searching for answers as to why it is like this and what 
speaks against a nicer solution for such a common use case.
My code using this is absolutely not performance critical ;)

@Mauro I am aware of the scope, and I agree with you. I didn't open an 
issue, as I wanted to discuss this first.

On Wednesday, February 11, 2015 at 17:20:02 UTC+1, Simon Danisch wrote:
>
> Hi, 
> I was trying to find out, what the best way is to have some local, 
> constant storage for a function, which is only accessible from inside the 
> function
> So something like this:
> begin
> const local a = []
> test() = dosomething(a)
> end
>
> I was quite surprised, that you can't do this efficiently.
> Or at least I didn't find a way, or tested it incorrectly.
> Here are the different combinations I tried and the emitted native code:
>
> https://gist.github.com/SimonDanisch/e4bed0a16bdd847a8c2b#file-local_function_storage-jl
>
> test11 && test12 seems to be what julia does internally for test1-test4 
> (and the example here)
> It's especially odd, as a global const seems to be faster than a local 
> const, even though that the local version is more restricted.
> Is this because there has been more time spend on making globals fast?
>
> Best,
> Simon
>


Re: [julia-users] Local scope for function storage

2015-02-11 Thread Mauro
I can't quite tell: are you aware that `begin end` does not introduce a
new scope?

This (sans begin-end) produces 'long'

local const aa = rand()
test2() = aa::Float64 + aa::Float64
@code_native test2()

and this short:

const aa = rand()
test2() = aa::Float64 + aa::Float64
@code_native test2()

Not sure what local is supposed to do, but probably not slow things
down.  Bug?

Anyway, I think the "correct" way is your test2 example:

let a::Float64 = rand()::Float64
global test2
test2() = a::Float64 + a::Float64
end

and if that is not fast/short then this is a performance bug, right?

>> Hi,
>> I was trying to find out, what the best way is to have some local, constant
>> storage for a function, which is only accessible from inside the function
>> So something like this:
>> begin
>> const local a = []
>> test() = dosomething(a)
>> end
>> 
>> I was quite surprised, that you can't do this efficiently.
>> Or at least I didn't find a way, or tested it incorrectly.
>> Here are the different combinations I tried and the emitted native code:
>> https://gist.github.com/SimonDanisch/e4bed0a16bdd847a8c2b#file-local_functio
>> n_storage-jl
>> 
>> test11 && test12 seems to be what julia does internally for test1-test4
>> (and the example here)
>> It's especially odd, as a global const seems to be faster than a local
>> const, even though that the local version is more restricted.
>> Is this because there has been more time spend on making globals fast?
>> 
>> Best,
>> Simon



[julia-users] Re: Input parameters for a simulation: Extract variables from a dictionary?

2015-02-11 Thread David P. Sanders


On Wednesday, February 11, 2015 at 15:44:46 (UTC-6), Simon Danisch wrote:
>
> How about simply iterating over the dict?
>
> for (key, value) in dict
> set(simulation, key, value)
> end
> Note, that you can actually access a type like this:
> type T 
> a::Int
> end 
> x = T(1)
> x.(:a) = 10 #<- :a is a symbol, which can be created like this 
> symbol("string")
> x.(:a) is equivalent to getfield(a, :a)
>

OK, thanks.

I think what was worrying me was having a dict with strings as keys. Is the 
following a sensible way to convert to a dict with symbols as keys?

julia> d = {"a" => 10}
Dict{Any,Any} with 1 entry:
  "a" => 10

julia> d2 = {symbol(s)=>d[s] for s in keys(d)}
Dict{Any,Any} with 1 entry:
  :a => 10

David
 

>
> For function calls, tshort has the answers ;)
> Besides you can do stuff like:
> func(values(dict)...)
> if you want to call a function with the values from the dict, which is a 
> little problematic due to the order.
>
> On Wednesday, February 11, 2015 at 21:25:22 UTC+1, David P. Sanders wrote:
>>
>> Hi,
>>
>> If I have a dictionary
>>
>> params = {"N": 10, "M": 2.0}
>>
>> how can I use it to define two variables N and M with the corresponding 
>> values?
>>
>> This sounds like it should be easy and obvious, say using `eval`?
>> E.g. extract the keys and values into strings and then use
>>
>> eval(parse("N=10"))
>>
>> Is this reasonable?
>>
>> The use case is to load input parameters for a simulation. Maybe there is 
>> a better way? 
>>
>> Thanks,
>> David.
>>
>

Re: [julia-users] Local scope for function storage

2015-02-11 Thread Tim Holy
This is surely cheating, but

julia> a = rand()
0.7573462021713695

julia> @eval begin
   function test5()
   a = $a
   a + a
   end
   end
test5 (generic function with 1 method)

does give you the "short" version.

--Tim

On Wednesday, February 11, 2015 08:20:02 AM Simon Danisch wrote:
> Hi,
> I was trying to find out, what the best way is to have some local, constant
> storage for a function, which is only accessible from inside the function
> So something like this:
> begin
> const local a = []
> test() = dosomething(a)
> end
> 
> I was quite surprised, that you can't do this efficiently.
> Or at least I didn't find a way, or tested it incorrectly.
> Here are the different combinations I tried and the emitted native code:
> https://gist.github.com/SimonDanisch/e4bed0a16bdd847a8c2b#file-local_functio
> n_storage-jl
> 
> test11 && test12 seems to be what julia does internally for test1-test4
> (and the example here)
> It's especially odd, as a global const seems to be faster than a local
> const, even though that the local version is more restricted.
> Is this because there has been more time spend on making globals fast?
> 
> Best,
> Simon



[julia-users] Re: Input parameters for a simulation: Extract variables from a dictionary?

2015-02-11 Thread Simon Danisch
How about simply iterating over the dict?

for (key, value) in dict
set(simulation, key, value)
end
Note, that you can actually access a type like this:
type T 
a::Int
end 
x = T(1)
x.(:a) = 10 # :a is a symbol, which can be created from a string with symbol("string")
x.(:a) is equivalent to getfield(x, :a)
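
A minimal sketch combining the loop and the field-access form above (the Simulation type and the parameter dict are made up):

type Simulation
    N::Int
    M::Float64
end

sim = Simulation(0, 0.0)
params = [:N => 10, :M => 2.0]

for (key, value) in params
    sim.(key) = value     # same x.(:sym) form as above, with a runtime symbol
end
sim    # Simulation(10,2.0)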

For function calls, tshort has the answers ;)
Besides you can do stuff like:
func(values(dict)...)
if you want to call a function with the values from the dict, which is a 
little problematic due to the order.

On Wednesday, February 11, 2015 at 21:25:22 UTC+1, David P. Sanders wrote:
>
> Hi,
>
> If I have a dictionary
>
> params = {"N": 10, "M": 2.0}
>
> how can I use it to define two variables N and M with the corresponding 
> values?
>
> This sounds like it should be easy and obvious, say using `eval`?
> E.g. extract the keys and values into strings and then use
>
> eval(parse("N=10"))
>
> Is this reasonable?
>
> The use case is to load input parameters for a simulation. Maybe there is 
> a better way? 
>
> Thanks,
> David.
>


Re: [julia-users] Input parameters for a simulation: Extract variables from a dictionary?

2015-02-11 Thread David P. Sanders


On Wednesday, February 11, 2015 at 14:49:53 (UTC-6), tshort wrote:
>
> You probably don't want `eval` unless there's no other way. It's hard to 
> tell how you want to use the variables, so it's hard to recommend 
> alternatives. Keyword arguments can be useful for this sort of thing: 
>
> function f(; a = 1, b = 2)
> a + b
> end
>
> f(a = 99, b = 2)
>
> You can also use Dict's to pass keyword arguments into functions as 
> follows: 
>
> using Compat
> d = @compat Dict(:a => 3, :b => 4)
>
> f(;d...)
>

I think the dictionary expansion is exactly what I need, many thanks!
 

>
> Creating a type that holds the parameters is another good option, 
> depending on how the parameters will be used.
>

Yes that's also a good idea.

I'm thinking about the best way to read parameters from a file and pass 
them into the simulation.
If the parameters are stored e.g. in JSON format, then the JSON.jl module 
constructs a dictionary.
So the dictionary unpacking seems to be the way to go.

Thanks,
David.
 

>
>
> On Wed, Feb 11, 2015 at 3:25 PM, David P. Sanders  > wrote:
>
>> Hi,
>>
>> If I have a dictionary
>>
>> params = {"N": 10, "M": 2.0}
>>
>> how can I use it to define two variables N and M with the corresponding 
>> values?
>>
>> This sounds like it should be easy and obvious, say using `eval`?
>> E.g. extract the keys and values into strings and then use
>>
>> eval(parse("N=10"))
>>
>> Is this reasonable?
>>
>> The use case is to load input parameters for a simulation. Maybe there is 
>> a better way? 
>>
>> Thanks,
>> David.
>>
>
>

Re: [julia-users] Re: Fortran subroutine and Julia

2015-02-11 Thread Stefan Karpinski
It's actually a zero-dimensional array that holds only one element. I'm not
sure why it's getting written to, but then again, I have no idea how one
correctly writes through a pointer in Fortran, so I can't really say what
I'd expect that code to do.

On Wed, Feb 11, 2015 at 4:15 PM, Dawid Crivelli  wrote:

>
> Try initializing the c with some value, because right now you are creating
> an empty array with no storage allocated.
>
>> a = 100.0
>> b = 10.0
>> c = [0.0] # or Array(Float64,1)
>> ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
>> Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
>> println(c)
>>
>>


[julia-users] Re: Fortran subroutine and Julia

2015-02-11 Thread Dawid Crivelli

Try initializing the c with some value, because right now you are creating 
an empty array with no storage allocated.

> a = 100.0
> b = 10.0
> c = [0.0] # or Array(Float64,1)
> ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
> Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
> println(c)
>
>

Re: [julia-users] Input parameters for a simulation: Extract variables from a dictionary?

2015-02-11 Thread Tom Short
You probably don't want `eval` unless there's no other way. It's hard to
tell how you want to use the variables, so it's hard to recommend
alternatives. Keyword arguments can be useful for this sort of thing:

function f(; a = 1, b = 2)
a + b
end

f(a = 99, b = 2)

You can also use Dict's to pass keyword arguments into functions as
follows:

using Compat
d = @compat Dict(:a => 3, :b => 4)

f(;d...)

Creating a type that holds the parameters is another good option, depending
on how the parameters will be used.
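
A minimal sketch of that option (the field names and run_simulation are made up):

immutable SimParams
    N::Int
    M::Float64
end

run_simulation(p::SimParams) = p.N * p.M    # hypothetical use of the parameters

params = SimParams(10, 2.0)
run_simulation(params)                      # 20.0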


On Wed, Feb 11, 2015 at 3:25 PM, David P. Sanders 
wrote:

> Hi,
>
> If I have a dictionary
>
> params = {"N": 10, "M": 2.0}
>
> how can I use it to define two variables N and M with the corresponding
> values?
>
> This sounds like it should be easy and obvious, say using `eval`?
> E.g. extract the keys and values into strings and then use
>
> eval(parse("N=10"))
>
> Is this reasonable?
>
> The use case is to load input parameters for a simulation. Maybe there is
> a better way?
>
> Thanks,
> David.
>


[julia-users] Input parameters for a simulation: Extract variables from a dictionary?

2015-02-11 Thread David P. Sanders
Hi,

If I have a dictionary

params = {"N": 10, "M": 2.0}

how can I use it to define two variables N and M with the corresponding 
values?

This sounds like it should be easy and obvious, say using `eval`?
E.g. extract the keys and values into strings and then use

eval(parse("N=10"))

Is this reasonable?

The use case is to load input parameters for a simulation. Maybe there is a 
better way? 

Thanks,
David.


[julia-users] "Introducing Julia" wikibook

2015-02-11 Thread David P. Sanders
Just stumbled on this, which seems not bad (though I haven't looked in 
detail):

http://en.wikibooks.org/wiki/Introducing_Julia


Re: [julia-users] Levi-Civita symbol/tensor

2015-02-11 Thread Pablo Zubieta
Hi again,

There were some bugs in my implementations. I updated the gist 
 
with the corrected versions and added a simpler looking function (but of 
O(n²) running time).

I did some tests and found (with my slow processor) that for permutations 
of length <= 5 the quadratic implementation (levicivita_simple) performs as 
fast as the (levicivita_inplace_check). For lengths from 5 to 15, 
levicivita_inplace_check 
is the fastest, followed by levicivita_simple. For lengths from 15 to 25 
levicivita_simple 
and levicivita perform the same (but slower than levicivita_inplace_check). 
For more than 25 elements levicivita_inplace_check is always the fastest, 
2x faster than levicivita and n times faster than levicivita_simple.

For people wanting the 3D Levi-Civita tensor, levicivita_simple and 
levicivita_inplace_check 
should be the same. For people wanting the parity of a permutation for long 
permutations levicivita_inplace_check should work the best.

Greetings!
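
Since the gist itself is not reproduced in this archive, here is a minimal sketch (not Pablo's code) of a quadratic-time parity computation in the same spirit:

# +1 for even permutations of 1:n, -1 for odd ones, 0 if not a permutation
function parity_sketch(p::AbstractVector{Int})
    n = length(p)
    sort(p) == collect(1:n) || return 0
    s = 1
    for i in 1:n-1, j in i+1:n
        s *= p[j] > p[i] ? 1 : -1    # flip the sign once per inversion
    end
    s
end

parity_sketch([1,2,3])   # +1
parity_sketch([2,1,3])   # -1
parity_sketch([1,1,3])   #  0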


Re: [julia-users] Re: Display Help in JUNO

2015-02-11 Thread Mike Innes
No problem, glad you like it!

On 11 February 2015 at 16:52, Christoph Ortner 
wrote:

> ah - just saw your reply there
> http://discuss.junolab.org/t/julia-0-4-docile-doc-strings/120/2
>
> Thanks a lot!
>
> Also: I am quite enjoying working with Juno, even though I normally use
> only EMACS - I just found that neither the standard Julia mode more the ESS
> Mode are quite mature enough. So thank you for producing Juno.
>
> Christoph
>


[julia-users] Re: SAVE THE DATE: JuliaCon 2015, June 24 - 28

2015-02-11 Thread Jeremy Cavanagh


Also don't forget the attendees, otherwise we cannot hear their questions 
and comments, all of which add to the experience and knowledge base.


[julia-users] Re: Fortran subroutine and Julia

2015-02-11 Thread DP


a = 100.0
b = 10.0
c = Array(Float64)
ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
println(c)

0.0
​

On Wednesday, February 11, 2015 at 11:15:57 PM UTC+5:30, DP wrote:
>
> Trying to work with subroutines (I am a MATLAB person without fortran and 
> julia knowledge)
>
> File Name : fsbrtn.f90
>
> SUBROUTINE MULTIPLY(A,B,C) 
> DOUBLE PRECISION A,B,C 
> C = A*B 
> RETURN 
> END
>
> gfortran -shared -O2 fsbrtn.f90 -fPIC -o fsbrtn.so
>
> a = 100.0
> b = 10.0
> c = 1.0
> ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
>Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
> println(c)
>
> Output
> 1.0
>
> Where am I going wrong?
> ​
>
> ergerg
>


Re: [julia-users] Re: Display Help in JUNO

2015-02-11 Thread Christoph Ortner
ah - just saw your reply there
http://discuss.junolab.org/t/julia-0-4-docile-doc-strings/120/2

Thanks a lot! 

Also: I am quite enjoying working with Juno, even though I normally use 
only EMACS - I just found that neither the standard Julia mode nor the ESS 
mode is quite mature enough. So thank you for producing Juno.

Christoph


Re: [julia-users] Fortran subroutine and Julia

2015-02-11 Thread Stefan Karpinski
You don't have an & on c – even if you did, the & syntax doesn't let you
write values, so that still wouldn't work. You can try setting c =
Array(Float64) and then pulling the value written to it out as c[].

On Wed, Feb 11, 2015 at 12:45 PM, DP  wrote:

> Trying to work with subroutines (I am a MATLAB person without fortran and
> julia knowledge)
>
> File Name : fsbrtn.f90
>
> SUBROUTINE MULTIPLY(A,B,C)
> DOUBLE PRECISION A,B,C
> C = A*B
> RETURN
> END
>
> gfortran -shared -O2 fsbrtn.f90 -fPIC -o fsbrtn.so
>
> a = 100.0
> b = 10.0
> c = 1.0
> ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
>Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
> println(c)
>
> Output
> 1.0
>
> Where am I going wrong?
> ​
>
> ergerg
>


[julia-users] Fortran subroutine and Julia

2015-02-11 Thread DP


Trying to work with subroutines (I am a MATLAB person without fortran and 
julia knowledge)

File Name : fsbrtn.f90

SUBROUTINE MULTIPLY(A,B,C) 
DOUBLE PRECISION A,B,C 
C = A*B 
RETURN 
END

gfortran -shared -O2 fsbrtn.f90 -fPIC -o fsbrtn.so

a = 100.0
b = 10.0
c = 1.0
ppmm = ccall((:multiply_, "/home/juser/ManUTD/fortran_try/fsbrtn"),
   Void,(Ptr{Float64},Ptr{Float64},Ptr{Float64}),&a,&b,c)
println(c)

Output
1.0

Where am I going wrong?
​

ergerg


Re: [julia-users] Re: Display Help in JUNO

2015-02-11 Thread Mike Innes
The Juno forum isn't dead but it is unfortunately a bit slow, which
basically comes down to the fact that there are far fewer people who know a
lot about its internals (and most of them have other commitments, at least
for the very near future). I try to respond to everything within a few
days, or at least once a week - I've probably left it a bit longer than I
should recently but I'm looking over the backlog now.

On 11 February 2015 at 15:32, Michael Hatherly 
wrote:

> It’s not just Juno — I just haven’t gotten around to implementing output
> for anything but console yet. I won’t have a chance to add this for the
> next few weeks though, so here’s the issue
>  in case you’ve
> got any ideas you’d like to discuss.
>
> — Mike
> ​
>
> On Wednesday, 11 February 2015 18:08:24 UTC+2, Christoph Ortner wrote:
>>
>> [The Juno mailing list seems dead? Hence I am re-posting my question
>> here; apologies for posting twice.]
>>
>> Juno doesn't seem to recognise doc strings, or am I getting something
>> wrong? Here is a short piece of code:
>>
>> using Docile, Lexicon
>> @doc doc"Some doc with `markdown`."->function blah()
>> println("bleh")
>> end
>>
>> If I past this into a REPL, then type ?blah, then I get the correct
>> doc-string displayed. If I type blah into Juno and hit CTRL-D, then I
>> just get blah (generic function with 1 method)
>>
>> On playing around with this, I noticed that even in the REPL help(blah) will
>> display blah (generic function with 1 method), while ?blah will display
>> the correct doc-string.
>>
>> Am I doing something wrong again or is this not expected to work (yet)?
>>
>> Thanks,
>> Christoph
>>
>


Re: [julia-users] Performance issue of signal processing code compared to MATLAB

2015-02-11 Thread Tim Holy
On Wednesday, February 11, 2015 08:27:26 AM Kyunghun Kim wrote:
> By the way, my another wish is about performance of loading packages. 
> In my code, Image.jl takes ~5 sec for just loading libraries, 
> (new) Interpolation.jl takes even ~20 sec. (maybe cause of metaprogramming 
> codes I think)
> 
> I heard there will be some package pre-loading or cache feature in v0.4. 
> Is there any updates about that? 

This is #1 on many people's wish-list. See 
https://github.com/JuliaLang/julia/pull/8745
for the latest status.

--Tim



[julia-users] Re: Cartesian @nref Usage

2015-02-11 Thread Michael Hatherly


For further reference the files base/multidimensional.jl, 
base/broadcast.jl, and some others (just grepping for the cartesian macros) 
are great places to see them in action.

— Mike
​

On Wednesday, 11 February 2015 17:28:09 UTC+2, Christoph Ortner wrote:
>
> that simple - thanks!
> Christoph
>
>

[julia-users] Re: Display Help in JUNO

2015-02-11 Thread Michael Hatherly


It’s not just Juno — I just haven’t gotten around to implementing output 
for anything but console yet. I won’t have a chance to add this for the 
next few weeks though, so here’s the issue 
 in case you’ve 
got any ideas you’d like to discuss.

— Mike
​

On Wednesday, 11 February 2015 18:08:24 UTC+2, Christoph Ortner wrote:
>
> [The Juno mailing list seems dead? Hence I am re-posting my question here; 
> apologies for posting twice.]
>
> Juno doesn't seem to recognise doc strings, or am I getting something 
> wrong? Here is a short piece of code:
>
> using Docile, Lexicon
> @doc doc"Some doc with `markdown`."->function blah()
> println("bleh")
> end
>
> If I past this into a REPL, then type ?blah, then I get the correct 
> doc-string displayed. If I type blah into Juno and hit CTRL-D, then I 
> just get blah (generic function with 1 method)
>
> On playing around with this, I noticed that even in the REPL help(blah) will 
> display blah (generic function with 1 method), while ?blah will display 
> the correct doc-string.
>
> Am I doing something wrong again or is this not expected to work (yet)? 
>
> Thanks,
> Christoph
>


[julia-users] Re: Display Help in JUNO

2015-02-11 Thread Simon Danisch
From http://michaelhatherly.github.io/Lexicon.jl/manual/#viewing-documentation :
Lexicon hooks into the REPL's ? mode (help-mode) once using Lexicon has 
been called from the REPL. Other environments, such as editors, are not 
currently supported.

On Wednesday, February 11, 2015 at 17:08:24 UTC+1, Christoph Ortner wrote:
>
> [The Juno mailing list seems dead? Hence I am re-posting my question here; 
> apologies for posting twice.]
>
> Juno doesn't seem to recognise doc strings, or am I getting something 
> wrong? Here is a short piece of code:
>
> using Docile, Lexicon
> @doc doc"Some doc with `markdown`."->function blah()
> println("bleh")
> end
>
> If I past this into a REPL, then type ?blah, then I get the correct 
> doc-string displayed. If I type blah into Juno and hit CTRL-D, then I 
> just get blah (generic function with 1 method)
>
> On playing around with this, I noticed that even in the REPL help(blah) will 
> display blah (generic function with 1 method), while ?blah will display 
> the correct doc-string.
>
> Am I doing something wrong again or is this not expected to work (yet)? 
>
> Thanks,
> Christoph
>


Re: [julia-users] Performance issue of signal processing code compared to MATLAB

2015-02-11 Thread Kyunghun Kim
Thank you for the kind answer. 
I am always afraid of unintended buffer copies because I'm such a C/C++ guy. 

I read the performance tips back in the early 0.3 days. 
I was surprised that there have been lots of updates since then. (Julia is 
developing so fast!)

I will try the profiling functions and find unintended allocations. 

By the way, another wish of mine concerns the performance of loading packages. 
In my code, Image.jl takes ~5 sec just to load, and the (new) Interpolation.jl 
takes ~20 sec (maybe because of its metaprogramming code, I think).

I heard there will be some package pre-loading or caching feature in v0.4. 
Are there any updates about that? 

On Thursday, February 12, 2015 at 12:34:12 AM UTC+9, Tim Holy wrote:
>
> A[:, indx] currently makes a copy. Try replacing it with slice(A, :, indx) 
> (if 
> you're on julia 0.4) or use ArrayViews if you're on julia 0.3. 
>
> For performance questions, if you aren't using the tools advertised at 
> http://docs.julialang.org/en/release-0.3/manual/performance-tips/ 
> you will likely find them to be a big help. 
>
> --Tim 
>
> On Wednesday, February 11, 2015 07:25:38 AM Kyunghun Kim wrote: 
> > Hi, all. 
> > 
> > I am sorry that I am writing repeating these questions again. 
> (performance 
> > compared to ~) 
> > I have some signal processing code written in MATLAB, and rewriting the 
> > code with Julia. 
> > 
> > The signal processing function take about 1024 x 1024 floating number 
> array 
> > as input called in loop about 100~1000 times. 
> > Here is core function of algorithm: 
> > 
> > MATLAB version: https://gist.github.com/moon6pence/3e60772943f206842d31 
> > 
> > >0.16 sec per each call 
> > 
> > Julia version: https://gist.github.com/moon6pence/4b43c63cb4240b31ea10 
> > 
> > >1.4 sec per each call 
> > 
> > Not only julia code is unusually slow, but MATLAB code is also unusually 
> > fast. 
> > (Naive C++ implementation of this code takes 100~200 sec, maybe MATLAB 
> JIT 
> > compiler is doing very well in SIMD) 
> > 
> > I will dig up this julia code line-by-line to find which line takes much 
> > time. 
> > But before then, I want get this code checked if there is any mistake on 
> > code for performance. (or give me tips for vectorized code) 
> > 
> > Thanks. 
>
>

[julia-users] Local scope for function storage

2015-02-11 Thread Simon Danisch
Hi, 
I was trying to find out, what the best way is to have some local, constant 
storage for a function, which is only accessible from inside the function
So something like this:
begin
const local a = []
test() = dosomething(a)
end

I was quite surprised, that you can't do this efficiently.
Or at least I didn't find a way, or tested it incorrectly.
Here are the different combinations I tried and the emitted native code:
https://gist.github.com/SimonDanisch/e4bed0a16bdd847a8c2b#file-local_function_storage-jl

test11 && test12 seems to be what julia does internally for test1-test4 
(and the example here)
It's especially odd, as a global const seems to be faster than a local 
const, even though the local version is more restricted.
Is this because more time has been spent on making globals fast?

Best,
Simon


[julia-users] Display Help in JUNO

2015-02-11 Thread Christoph Ortner


[The Juno mailing list seems dead? Hence I am re-posting my question here; 
apologies for posting twice.]

Juno doesn't seem to recognise doc strings, or am I getting something 
wrong? Here is a short piece of code:

using Docile, Lexicon
@doc doc"Some doc with `markdown`."->function blah()
println("bleh")
end

If I paste this into a REPL, then type ?blah, I get the correct 
doc-string displayed. If I type blah into Juno and hit CTRL-D, then I just 
get blah (generic function with 1 method)

On playing around with this, I noticed that even in the REPL help(blah) will 
display blah (generic function with 1 method), while ?blah will display the 
correct doc-string.

Am I doing something wrong again or is this not expected to work (yet)? 

Thanks,
Christoph


Re: [julia-users] Re: SAVE THE DATE: JuliaCon 2015, June 24 - 28

2015-02-11 Thread Jiahao Chen
Yes, we had neglected to mic the speakers. We'll try not to forget this time.


Re: [julia-users] Performance issue of signal processing code compared to MATLAB

2015-02-11 Thread Tim Holy
A[:, indx] currently makes a copy. Try replacing it with slice(A, :, indx) (if 
you're on julia 0.4) or use ArrayViews if you're on julia 0.3.
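
A minimal sketch of the difference (array sizes made up):

A = rand(1024, 1024)
col_copy = A[:, 1]              # allocates a new 1024-element vector
# Julia 0.4:
#   col_view = slice(A, :, 1)   # a SubArray view, no copy
# Julia 0.3, with the ArrayViews package:
#   using ArrayViews
#   col_view = view(A, :, 1)    # an ArrayView, no copy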

For performance questions, if you aren't using the tools advertised at
http://docs.julialang.org/en/release-0.3/manual/performance-tips/
you will likely find them to be a big help.

--Tim

On Wednesday, February 11, 2015 07:25:38 AM Kyunghun Kim wrote:
> Hi, all.
> 
> I am sorry that I am writing repeating these questions again. (performance
> compared to ~)
> I have some signal processing code written in MATLAB, and rewriting the
> code with Julia.
> 
> The signal processing function take about 1024 x 1024 floating number array
> as input called in loop about 100~1000 times.
> Here is core function of algorithm:
> 
> MATLAB version: https://gist.github.com/moon6pence/3e60772943f206842d31
> 
> >0.16 sec per each call
> 
> Julia version: https://gist.github.com/moon6pence/4b43c63cb4240b31ea10
> 
> >1.4 sec per each call
> 
> Not only julia code is unusually slow, but MATLAB code is also unusually
> fast.
> (Naive C++ implementation of this code takes 100~200 sec, maybe MATLAB JIT
> compiler is doing very well in SIMD)
> 
> I will dig up this julia code line-by-line to find which line takes much
> time.
> But before then, I want get this code checked if there is any mistake on
> code for performance. (or give me tips for vectorized code)
> 
> Thanks.



[julia-users] Re: Cartesian @nref Usage

2015-02-11 Thread Christoph Ortner
that simple - thanks!
Christoph



[julia-users] Performance issue of signal processing code compared to MATLAB

2015-02-11 Thread Kyunghun Kim
Hi, all. 

I am sorry to be writing these repeated questions again (performance compared to ~).
I have some signal processing code written in MATLAB, and I am rewriting it in Julia. 

The signal processing function takes an approximately 1024 x 1024 floating-point 
array as input and is called in a loop about 100~1000 times. 
Here is the core function of the algorithm: 

MATLAB version: https://gist.github.com/moon6pence/3e60772943f206842d31
>0.16 sec per call

Julia version: https://gist.github.com/moon6pence/4b43c63cb4240b31ea10
>1.4 sec per call

Not only is the Julia code unusually slow, but the MATLAB code is also unusually fast. 
(A naive C++ implementation of this code takes 100~200 sec; maybe the MATLAB JIT 
compiler is doing very well with SIMD.)

I will dig through this Julia code line by line to find which lines take the most time. 
But before then, I want to get this code checked for any performance mistakes 
(or get tips for vectorizing it).

Thanks. 


Re: [julia-users] way around passing functions as arguments

2015-02-11 Thread Kevin Squire
Just curious--in this case how slow?

Anyway, you should check out the FastAnonymous.jl package--it should help
in your case.
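
Another common workaround (a sketch, not from this thread; all names are made up) is to wrap each coefficient in its own type so dispatch is resolved statically instead of passing untyped ::Function arguments:

immutable LeftCoeff end          # a constant coefficient
immutable UpCoeff
    α::Float64
end

coeff(::LeftCoeff, m, n) = 1.0 + 0.0im
coeff(u::UpCoeff, m, n)  = exp(-im*2π*u.α*m)

# A builder written against these types sees concrete methods, so the calls
# can be inlined instead of going through a dynamically-typed Function field.
addval(l, u, m, n) = coeff(l, m, n) + coeff(u, m, n)
addval(LeftCoeff(), UpCoeff(0.1), 2, 3)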

Cheers,
   Kevin

On Wednesday, February 11, 2015, Andrei Berceanu 
wrote:

> I have written the following function for generating a sparse matrix:
>
> function genspmat(ω0::Float64; N=pm["N"], α=pm["α"], γ=pm["γ"], κ=pm["κ"])
>     # Determine memory usage
>     nz = countnonzeros(; N=N)
>     # Preallocate
>     I = Array(Int64,nz)
>     J = Array(Int64,nz)
>     V = Array(Complex{Float64},nz)
>     function setnzelem(i,n,m; pos="self")
>         if pos=="left"
>             k += 1; J[k] = i-N; I[k] = i; V[k] = 1
>         elseif pos=="right"
>             k += 1; J[k] = i+N; I[k] = i; V[k] = 1
>         elseif pos=="up"
>             k += 1; J[k] = i-1; I[k] = i; V[k] = exp(-im*2π*α*m)
>         elseif pos=="down"
>             k += 1; J[k] = i+1; I[k] = i; V[k] = exp(im*2π*α*m)
>         elseif pos=="self"
>             k += 1; J[k] = i; I[k] = i; V[k] = ω0 + im*γ - 1/2*κ*(m^2+n^2)
>         end
>     end
>     # maximum value of m or n indices
>     maxm = div(N-1,2)
>     k = 0
>     for i in 1:N^2
>         m = getm(i; N=N)
>         n = getn(i; N=N)
>         # self interaction is always present
>         setnzelem(i,n,m)
>         # corners
>         if n==maxm && m==-maxm        # top left
>             setnzelem(i,n,m; pos="right"); setnzelem(i,n,m; pos="down")
>         elseif n==maxm && m==maxm     # top right
>             setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="down")
>         elseif n==-maxm && m==maxm    # bottom right
>             setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="up")
>         elseif n==-maxm && m==-maxm   # bottom left
>             setnzelem(i,n,m; pos="right"); setnzelem(i,n,m; pos="up")
>         # edges
>         elseif n == maxm              # top
>             setnzelem(i,n,m; pos="right"); setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="down")
>         elseif m == maxm              # right
>             setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="up"); setnzelem(i,n,m; pos="down")
>         elseif n == -maxm             # bottom
>             setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="up"); setnzelem(i,n,m; pos="right")
>         elseif m == -maxm             # left
>             setnzelem(i,n,m; pos="down"); setnzelem(i,n,m; pos="up"); setnzelem(i,n,m; pos="right")
>         else                          # bulk
>             setnzelem(i,n,m; pos="down"); setnzelem(i,n,m; pos="up"); setnzelem(i,n,m; pos="right"); setnzelem(i,n,m; pos="left")
>         end
>     end
>     return sparse(I,J,V)
> end
> Notice that inside this function I have defined setnzelem(i,n,m; pos="self"),
> which is where all the action really takes place :)
> Now I would like to generalize genspmat to values V[k] which can be
> arbitrary functions of m and n. One way of doing so is to define
>
> function genspmat(l::Function,r::Function,u::Function,d::Function,s::Function, N::Int, nz::Int)
>
> and then inside setnzelem do
>
>  function setnzelem(i::Int,n::Int,m::Int; pos="self")
> if pos=="left"
> k += 1
> J[k] = i-N; I[k] = i; V[k] = l(m,n)
> elseif pos=="right"
> k += 1
> J[k] = i+N; I[k] = i; V[k] = r(m,n)
> elseif pos=="up"
> k += 1
> J[k] = i-1; I[k] = i; V[k] = u(m,n)
> elseif pos=="down"
> k += 1
> J[k] = i+1; I[k] = i; V[k] = d(m,n)
> elseif pos=="self"
> k += 1
> J[k] = i; I[k] = i; V[k] = s(m,n)
> end
>  end
>
> The problem with this is that passing functions as arguments tends to be
> slow in Julia.
> So what is your advice on accomplishing what I want?
>
> Tnx!
>
>
>


Re: [julia-users] Re: Quantitative Economics with Julia (great PDF doc)

2015-02-11 Thread Jiahao Chen
I do not see how this comment is appropriate for this list.

Thanks,

Jiahao Chen
Staff Research Scientist
MIT Computer Science and Artificial Intelligence Laboratory

On Wed, Feb 11, 2015 at 7:06 AM, Andreas Lobinger 
wrote:

> this falls under SCNR and is some kind of insider joke.
>
> Economics was for long time (and successfully) a social sciences. Economic
> problems were e.g. Unemployment or Market Failure.
> Nowadays this Quantitative Economics view drives topics like HFT where
> they intentionally create market failure ...
>
>
>
>
>
>
>


[julia-users] way around passing functions as arguments

2015-02-11 Thread Andrei Berceanu
I have written the following function for generating a sparse matrix:

function genspmat(ω0::Float64; N=pm["N"], α=pm["α"], γ=pm["γ"], κ=pm["κ"])
    # Determine memory usage
    nz = countnonzeros(; N=N)
    # Preallocate
    I = Array(Int64,nz)
    J = Array(Int64,nz)
    V = Array(Complex{Float64},nz)
    function setnzelem(i,n,m; pos="self")
        if pos=="left"
            k += 1; J[k] = i-N; I[k] = i; V[k] = 1
        elseif pos=="right"
            k += 1; J[k] = i+N; I[k] = i; V[k] = 1
        elseif pos=="up"
            k += 1; J[k] = i-1; I[k] = i; V[k] = exp(-im*2π*α*m)
        elseif pos=="down"
            k += 1; J[k] = i+1; I[k] = i; V[k] = exp(im*2π*α*m)
        elseif pos=="self"
            k += 1; J[k] = i; I[k] = i; V[k] = ω0 + im*γ - 1/2*κ*(m^2+n^2)
        end
    end
    # maximum value of m or n indices
    maxm = div(N-1,2)
    k = 0
    for i in 1:N^2
        m = getm(i; N=N)
        n = getn(i; N=N)
        # self interaction is always present
        setnzelem(i,n,m)
        # corners
        if n==maxm && m==-maxm        # top left
            setnzelem(i,n,m; pos="right"); setnzelem(i,n,m; pos="down")
        elseif n==maxm && m==maxm     # top right
            setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="down")
        elseif n==-maxm && m==maxm    # bottom right
            setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="up")
        elseif n==-maxm && m==-maxm   # bottom left
            setnzelem(i,n,m; pos="right"); setnzelem(i,n,m; pos="up")
        # edges
        elseif n == maxm              # top
            setnzelem(i,n,m; pos="right"); setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="down")
        elseif m == maxm              # right
            setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="up"); setnzelem(i,n,m; pos="down")
        elseif n == -maxm             # bottom
            setnzelem(i,n,m; pos="left"); setnzelem(i,n,m; pos="up"); setnzelem(i,n,m; pos="right")
        elseif m == -maxm             # left
            setnzelem(i,n,m; pos="down"); setnzelem(i,n,m; pos="up"); setnzelem(i,n,m; pos="right")
        else                          # bulk
            setnzelem(i,n,m; pos="down"); setnzelem(i,n,m; pos="up"); setnzelem(i,n,m; pos="right"); setnzelem(i,n,m; pos="left")
        end
    end
    return sparse(I,J,V)
end
Notice that inside this function I have defined setnzelem(i,n,m; pos="self"), 
which is where all the action really takes place :)
Now I would like to generalize genspmat to values V[k] which can be 
arbitrary functions of m and n. One way of doing so is to define 

function genspmat(l::Function,r::Function,u::Function,d::Function,s::Function, N::Int, nz::Int)

and then inside setnzelem do 

 function setnzelem(i::Int,n::Int,m::Int; pos="self")
if pos=="left"
k += 1
J[k] = i-N; I[k] = i; V[k] = l(m,n)
elseif pos=="right"
k += 1
J[k] = i+N; I[k] = i; V[k] = r(m,n)
elseif pos=="up"
k += 1
J[k] = i-1; I[k] = i; V[k] = u(m,n)
elseif pos=="down"
k += 1
J[k] = i+1; I[k] = i; V[k] = d(m,n)
elseif pos=="self"
k += 1
J[k] = i; I[k] = i; V[k] = s(m,n)
end
 end

The problem with this is that passing functions as arguments tends to be 
slow in Julia.
So what is your advice on accomplishing what I want?

Tnx!




[julia-users] Ruby on Rails with Julia through ZMQ, a proof-of-concept

2015-02-11 Thread Eric Forgy
Hi Ken,

That looks awesome. You're much further along than I am, so I can learn a lot 
from what you've done. Thank you for sharing!



Re: [julia-users] Various questions about Int(), int(), and 0.4

2015-02-11 Thread Tony Fong
Agreed and done.

On Tuesday, February 10, 2015 at 7:24:57 AM UTC+7, Ivar Nesje wrote:
>
> I definitely agree that the info message has some confusing aspects. 
> Please open an issue (or a PR) with Lint.jl so that the info reads 
> something like: 
>
> INFO: In 0.4, replace int() with Int(), or some of the other explicit 
> rounding functions. (round, trunc, etc...) 
>
>
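
For reference, a minimal sketch of the replacements being suggested (values arbitrary):

x = 3.7
int(x)           # deprecated in 0.4; rounds to 4
Int(3)           # exact conversion; Int(3.7) would throw an InexactError
round(Int, x)    # 4
trunc(Int, x)    # 3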

Re: [julia-users] Re: Quantitative Economics with Julia (great PDF doc)

2015-02-11 Thread Andreas Lobinger
this falls under SCNR and is some kind of insider joke.

Economics was for a long time (and successfully) a social science. Economic 
problems were e.g. unemployment or market failure. 
Nowadays this Quantitative Economics view drives topics like HFT, where they 
intentionally create market failure ...




 



Re: [julia-users] Re: Quantitative Economics with Julia (great PDF doc)

2015-02-11 Thread Tamas Papp
Hmmm, did you read the whole file? After getting through the
preliminaries, it starts with dynamic programming, and discusses
economics problems. It looks like it follows the spirit of the
Sargent-Ljungqvist book quite closely.

On Wed, Feb 11 2015, Andreas Lobinger  wrote:

> 1) is it julia 0.3 or 0.4 centric?
> 2) ... it's a nice description of statistics and statistical models, but
> why do they call this economics?


[julia-users] Re: Quantitative Economics with Julia (great PDF doc)

2015-02-11 Thread Andreas Lobinger
1) is it julia 0.3 or 0.4 centric?
2) ... it's a nice description of statistics and statistical models, but 
why do they call this economics?




Re: [julia-users] Re: moving Julia/Ijulia around my Windows Disk

2015-02-11 Thread Big Stone
ok,

I understood my error: I forgot to copy the hidden .git files when doing the moves.
Now I have the versions of each package, and I guess Pkg.build("ZMQ") would 
"burn and redo everything from git".

Why can't these absolute paths in deps.jl files be relative paths by default? 
(at least when they are nicely under the .julia directory)





[julia-users] Re: Cartesian @nref Usage

2015-02-11 Thread Michael Hatherly


You need to wrap the @nref macro in parentheses to avoid it consuming the = 
t:

(@nref $N A i) = t

or

@nref($N, A, i) = t

Note that in your example $N and A should be swapped around.
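
A minimal worked example (N fixed to 2; the function name is made up, and Base.Cartesian provides the same macros as the package):

using Base.Cartesian

function fill_sum!(A::Matrix{Int})
    @nloops 2 i A begin
        (@nref 2 A i) = i_1 + i_2   # parentheses let @nref appear on the left of =
    end
    A
end

fill_sum!(zeros(Int, 3, 3))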

— Mike
​


On Wednesday, 11 February 2015 11:24:31 UTC+2, Christoph Ortner wrote:
>
> I just discovered the Cartesian package; what a nice set of tools! I have 
> a question though and couldn't find the answer anywhere (apologies if I've 
> missed it):
>
> within an `@nloops`  construct I can write, say
>
>  t = @nref A $N i
>
> but I cannot write
>
>   @nref A $N i = t
>
>
> and instead have to write
>
>  setindex!(A, t, (@ntuple $N j)...)
>
> Am I just using this incorrectly, or is this indeed a restriction? Either 
> way, it might be useful to add something to the documentation?
>
> Thanks,
> Christoph
> 
>   
>


Re: [julia-users] Re: request for feature: modify field in immutable object in a container

2015-02-11 Thread Mauro
> I'm not sure what you are saying with that code. There is no possible way
> to define the function mutate_immutable!! such that it modified a or c.

This does it:

immutable A; a::Int end
a = A(1)
c = a
A.mutable = true
a.a = 99
A.mutable = false
@show a, c #(A(99),A(99))
a===c===A(99) # true

I don't think a trick like this can be pulled with an Int.  Although
putting it into a function errors.  I also tried the pointer trick Simon
was using but couldn't get it to work.

> You example works the same with Ints, but might be easier to see:
>
> a=1
> c=a
> mutate_immutable!(a,5)
> c === a === 1 # true
>
> function mutate_immutable(x,y) ... end
> On Tue, Feb 10, 2015 at 5:22 PM Mauro  wrote:
>
>> @Stefan, thanks that was a good read. I liked the example.
>>
>> > The take away should be that you can't mutate an immutable any more than
>> > you can mutate an integer. If you have an array [1,2,3] and you assign 4
>> to
>> > the 2, you don't change the value of 2, you change what value exists in
>> the
>> > second position of the array.
>>
>> This applies to isbits immutables, right? Then the only surprise could
>> be when two bindings point to the same immutable:
>>
>> julia> immutable A; a::Int end
>>
>> julia> a = A(1)
>> A(1)
>>
>> julia> c = a
>> A(1)
>>
>> julia> mutate_immutable!!(a, 5)
>>
>> julia> c
>> A(5)
>>
>> Which couldn't happen with an Int.
>>
>> > As Stefan just mentioned, we do need to add some mechanisms for more
>> easily
>> > creating them from existing ones.
>> > On Tue, Feb 10, 2015 at 9:41 AM Mauro  wrote:
>> >
>> >> Thanks Lex, Jameson and Michael for this interesting discussion.  I read
>> >> it a few times but still cannot quite follow:
>> >>
>> >> Is the take-home that it is ok to mutate immutables?  No repercussions
>> >> from Julia itself, just confused library users?  This is not what the
>> >> manual suggests:
>> >> http://docs.julialang.org/en/latest/manual/types/#
>> >> immutable-composite-types
>> >>
>> >> Or is it just ok to mutate immutables stored in a mutable container?
>> >>
>> >> Or is the take-home that for library interface code it's ok to mutate
>> >> immutables because there is no other way to mirror C-structs?
>> >>
>>
>>



[julia-users] Cartesian @nref Usage

2015-02-11 Thread Christoph Ortner
I just discovered the Cartesian package; what a nice set of tools! I have a 
question though and couldn't find the answer anywhere (apologies if I've 
missed it):

within an `@nloops`  construct I can write, say

 t = @nref A $N i

but I cannot write

  @nref A $N i = t


and instead have to write

 setindex!(A, t, (@ntuple $N j)...)

Am I just using this incorrectly, or is this indeed a restriction? Either 
way, it might be useful to add something to the documentation?

Thanks,
Christoph