Re: [julia-users] Linked list. Julia <-> C

2015-02-16 Thread Jameson Nash
Gtk.jl has a rather substantial example (I think it's in
src/GLib/glists.jl, but I'm on an iPhone)
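
For the archives, here is a minimal sketch of the general idea (an assumed node
layout, not the actual Gtk.jl code): mirror the C struct with an isbits Julia type
whose `next` field is a `Ptr` to the same type, and walk the list with `unsafe_load`.
C code that modifies the list only ever sees the raw pointers.

# Assumed C layout: struct GSList { void *data; struct GSList *next; };
immutable GSList
    data::Ptr{Void}
    next::Ptr{GSList}
end

# Collect the `data` pointers from a list handed back by a C function.
function list_data(head::Ptr{GSList})
    out = Ptr{Void}[]
    p = head
    while p != C_NULL
        node = unsafe_load(p)   # read the node the pointer refers to
        push!(out, node.data)
        p = node.next
    end
    return out
end
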
On Mon, Feb 16, 2015 at 9:14 AM J Luis jmfl...@gmail.com wrote:

 Hi,

 Does anyone know of an example showing how to make linked lists in Julia
 and interface that with C (that will modify them)?

 Thanks



Re: [julia-users] Linked list. Julia <-> C

2015-02-16 Thread Douglas Bates
On Monday, February 16, 2015 at 10:30:36 AM UTC-6, Jameson wrote:

 Gtk.jl has a rather substantial example (I think it's in 
 src/GLib/glists.jl, but I'm on an iPhone)


If Fermat were alive today his famous enigmatic note would be "I have 
discovered a wonderful proof of this but I'm on an iPhone."
 

 On Mon, Feb 16, 2015 at 9:14 AM J Luis jmf...@gmail.com wrote:

 Hi,

 Does anyone know of an example showing how to make linked lists in Julia 
 and interface that with C (that will modify them)?

 Thanks 



Re: [julia-users] Re: Values from Dict assigned to variables (symbols?) named as keys?

2015-02-16 Thread Tim Holy
Composite types give much higher performance but no flexibility (you can't 
add new fields later), so which you choose depends on your use case.
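
A tiny illustration of that tradeoff (hypothetical names, just to make the point
concrete):

type Params
    a::Float64
    b::Int
end

p = Params(1.0, 2)
p.a                      # fast, statically typed field access
# p.c = 3.0              # error: Params has no field c

d = Dict{Symbol,Any}()
d[:a] = 1.0
d[:b] = 2
d[:c] = 3.0              # flexible, but lookups are slower and values are Any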

--Tim

On Monday, February 16, 2015 09:38:13 AM Martin Johansson wrote:
 Yes, composite types could well be the answer! I'm still trying to find out
 which choices would make the transition from Matlab easiest. This (old)
 discussion https://github.com/JuliaLang/julia/issues/1263 didn't seem to
 recommend one over the other (of Dicts and composite types).
 
 Regards, m
 
 On Monday, February 16, 2015 at 4:30:31 PM UTC+1, Johan Sigfrids wrote:
  Could you not use a composite type for this? It would seem more Julian to
  me.



Re: [julia-users] Julia on Raspberry Pi 2

2015-02-16 Thread Sto Forest
Thank you very much for your help, Viral.


*GCC* is set to the latest version

*root@pithree:/opt/julia/deps/openlibm/src# gcc --version*
*gcc (Raspbian 4.8.2-21~rpi3rpi1) 4.8.2*
*Copyright (C) 2013 Free Software Foundation, Inc.*


I've added the override to Make.user.

I pulled the latest version and ran make again with compiler version checks 
disabled.

*ARM.inc*
*...*
override LLVM_ASSERTIONS=1
LLVM_FLAGS+=--with-cpu=cortex-a9 --disable-compiler-version-checks 
--with-float=hard --with-abi=aapcs-vfp --with-fpu=neon --enable-targets=arm 
--enable-optimized --enable-assertions
...



There still seems to be a "compiler is not new enough" error for LLVM:

checking build system type... arm-unknown-linux-gnueabihf
checking host system type... arm-unknown-linux-gnueabihf
checking target system type... arm-unknown-linux-gnueabihf
checking type of operating system we're going to host on... Linux
checking type of operating system we're going to target... Linux
checking target architecture... ARM
checking whether GCC is new enough... no
configure: error:
The selected GCC C++ compiler is not new enough to build LLVM. Please 
upgrade
to GCC 4.7. You may pass --disable-compiler-version-checks to configure to
bypass these sanity checks.
Makefile:528: recipe for target 
'llvm-3.5.1/build_Release+Asserts/config.status' failed
make[1]: *** [llvm-3.5.1/build_Release+Asserts/config.status] Error 1
Makefile:64: recipe for target 'julia-deps' failed
make: *** [julia-deps] Error 2
root@pithree:/opt/julia# 

On Monday, 16 February 2015 11:45:23 UTC, Viral Shah wrote:

 You can avoid the openlibm issue for now by adding

 override USE_SYSTEM_LIBM=1

 in your Make.user. Can you also file this issue in the openlibm github 
 repo, so that it can be fixed. Should be easy.

 I wonder why your build is picking up llvm 3.5.0, since ARM.inc uses 3.5.1 
 now. I don't know if that will fix the build problem. Make sure that your 
 new gcc is the default (with gcc -v), and if still trouble, add 
 --disable-compiler-version-checks to LLVM_FLAGS in the ARM.inc.

 -viral

 On Monday, February 16, 2015 at 4:12:06 PM UTC+5:30, Sto Forest wrote:

 After adding in a couple of dependencies, gfortran, cmake, compilation is 
 getting further.

 There are two current problems:

 *openlibm*

 s_creall.c:(.text+0x0): multiple definition of `creal'
 src/s_creal.c.o:s_creal.c:(.text+0x0): first defined here
 collect2: error: ld returned 1 exit status
 Makefile:35: recipe for target 'libopenlibm.so' failed
 make[3]: *** [libopenlibm.so] Error 1
 Makefile:686: recipe for target 'openlibm/libopenlibm.so' failed
 make[2]: *** [openlibm/libopenlibm.so] Error 2
 make[2]: *** Waiting for unfinished jobs
 ar: creating libcerbla.a


 *LLVM*
 I upgraded gcc to version *gcc (Raspbian 4.8.2-21~rpi3rpi1) 4.8.2.*

 However, when running *make* in /opt/julia it errors over the compiler 
 version not being new enough:

 checking whether GCC is new enough... no
 configure: error:
 The selected GCC C++ compiler is not new enough to build LLVM. Please 
 upgrade
 to GCC 4.7. You may pass --disable-compiler-version-checks to configure to
 bypass these sanity checks.
 Makefile:508: recipe for target 
 'llvm-3.5.0/build_Release+Asserts/config.status' failed
 make[2]: *** [llvm-3.5.0/build_Release+Asserts/config.status] Error 1
 Making all in UTIL


 Any suggestions from more knowledgeable people would be welcomed :)

 On Sunday, 15 February 2015 19:04:02 UTC, Sto Forest wrote:

 Thanks Steve I'll give that a try and see how far I get. :)


 On Sunday, 15 February 2015 01:06:39 UTC, Steve Kelly wrote:

 Sto, 

 I got Julia running on a BeagleBone Black running Debian Jessie a 
 couple months back using this process: 
 https://github.com/JuliaLang/julia/blob/master/README.arm.md. It 
 depends on a few system libraries to run, so I needed to update from 
 Wheezy 
 to Jessie so it would work. I think some improvements have been made since 
 then so the build is more self contained. I am pretty sure Raspbian is 
 based on Wheezy, but it might be worth a shot with the latest master.

 Best,
 Steve

 On Sat, Feb 14, 2015 at 3:11 PM, Sto Forest stochast...@gmail.com 
 wrote:

 Is there a way to get Julia running on the new Raspberry Pi 2, perhaps 
 under raspbian ? 





[julia-users] Re: Values from Dict assigned to variables (symbols?) named as keys?

2015-02-16 Thread Martin Johansson
Yes, composite types could well be the answer! I'm still trying to find out 
which choices would make the transition from Matlab easiest. This (old) 
discussion 
https://github.com/JuliaLang/julia/issues/1263 didn't seem to recommend 
one over the other (of Dicts and composite types). 

Regards, m


On Monday, February 16, 2015 at 4:30:31 PM UTC+1, Johan Sigfrids wrote:

 Could you not use a composite type for this? It would seem more Julian to 
 me. 



[julia-users] Array of functions - gets overwritten

2015-02-16 Thread Stepa Solntsev
Hi, I'd like the first call to return 1 and the second to return 2, but 
both return 2 !!! What could be the problem? 

function nice()
    i = 1
    while true
        f() = i
        produce(f)
        i += 1
    end
end

ye = Task(() -> nice())
funcs = Function[]
for i in [1:2]
    push!(funcs, consume(ye))
end

println(funcs[1]())
println(funcs[2]())




Re: [julia-users] Array of functions - gets overwritten

2015-02-16 Thread Jameson Nash
Both function closures created refer to the same variable i. You need a
let block to create a new local variable for each iteration of the whole
loop.
On Mon, Feb 16, 2015 at 12:57 PM Stepa Solntsev solnt...@gmail.com wrote:

 Hi, I'd like the first call to return 1 and the second to return 2, but
 both return 2 !!! What could be the problem?

 function nice()
   i=1
   while true
   f() = i
   produce(f)
   i+=1
   end
 end
  ye = Task(() -> nice())
 funcs = Function[]
  for i in [1:2]
   push!(funcs,consume(ye))
 end

 println(funcs[1]())
 println(funcs[2]())





Re: [julia-users] Array of functions - gets overwritten

2015-02-16 Thread Jameson Nash
And the function itself needs to either not have a name or be local.
Otherwise, you are creating one generic function and just changing the
dispatch for ()
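Putting both points together, a sketch of a version that should return 1 and then 2
(same produce/consume setup as in the original post, but with an anonymous closure
created inside a let block):

function nice()
    i = 1
    while true
        let j = i
            produce(() -> j)   # anonymous closure; each let body gets its own j
        end
        i += 1
    end
end

ye = Task(() -> nice())
funcs = Function[]
for k in 1:2
    push!(funcs, consume(ye))
end

println(funcs[1]())   # 1
println(funcs[2]())   # 2
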
On Mon, Feb 16, 2015 at 1:27 PM Stepa Solntsev solnt...@gmail.com wrote:

 I tried, it still doesn't work.

 function nice()
 i=1
 while true
 let j=i
 f() = j
 produce(f)
 end

 i+=1
 end
 end
 ye = Task(() -> nice())
 funcs = Function[]
 for k in [1:2]
 push!(funcs,consume(ye))
 end

 println(funcs[1]())
 println(funcs[2]())



[julia-users] Re: Introducing Julia wikibook

2015-02-16 Thread Paulo Castro
It's not bad, but it could be much better. We currently need contributors, 
since some of them went to work on other manuals/books some time ago. If anyone 
here could help, we would be glad.

Em quarta-feira, 11 de fevereiro de 2015 18:18:39 UTC-2, David P. Sanders 
escreveu:

 Just stumbled on this, which seems not bad (though I haven't looked in 
 detail):

 http://en.wikibooks.org/wiki/Introducing_Julia



[julia-users] Re: HttpServer seg fault

2015-02-16 Thread Joel Nelson
The issue was found to be with HttpParser and more specifically the shared 
library libhttp-parser. In the deps.jl file under HttpParser the 
@checked_lib was pointing to /usr/lib64/libhttp_parser.so. However, our 
libhttp_parser.so in lib64 had a symbolic link to libhttp_parser.so.2.0. We 
figured we were running into an issue with the versions of libhttp_parser, 
so we decided to manually build libhttp_parser following the steps in 
build.jl and update the @checked_lib within deps.jl to point to the 
libhttp_parser.so under HttpParser/deps/usr/lib. After making those updates 
we were able to run as expected.

On Friday, February 13, 2015 at 9:39:18 AM UTC-5, Joel Nelson wrote:

 I'm using RHEL 6.5 and running into an issue using the very basic example 
 for HttpServer. When using the example for HttpServer, I see that it 
 listens on port 8000.

 using HttpServer

 http = HttpHandler() do req::Request, res::Response
     Response( ismatch(r"^/hello/", req.resource) ? string("Hello ", split(req.resource, '/')[3], "!") : 404 )
 end

 http.events["error"]  = ( client, err ) -> println( err )
 http.events["listen"] = ( port ) -> println("Listening on $port...")

 server = Server( http )
 run( server, 8000 )


 However, when I simply do a curl command to localhost:8000/hello/name I 
 get a seg fault error.


 signal (11): Segmentation fault
 memcpy at /lib64/libc.so.6 (unknown line)
 write at /usr/bin/../lib64/julia/sys.so (unknown line)
 unknown function (ip: 936587280)
 Memory fault(coredump)


 I installed Julia last week using the RPM and nalimilan-julia-epel-6.repo 
 https://copr.fedoraproject.org/coprs/nalimilan/julia/repo/epel-6/nalimilan-julia-epel-6.repo
  from 
 the downloads section. I also downloaded the tarball and did a 'make debug' 
 to attempt to get more information on the seg fault. However, when running 
 the REPL of the locally built Julia I do not get a seg fault. 
 Instead the curl command just hangs and nothing is ever returned.

 I originally thought this was an issue with Morsel and I posted an issue 
 here: https://github.com/JuliaWeb/Morsel.jl/issues/33



 Thanks,
 Joel



Re: [julia-users] Array of functions - gets overwritten

2015-02-16 Thread Stepa Solntsev
I tried, it still doesn't work.

function nice()
i=1
while true
let j=i
f() = j
produce(f)
end
i+=1
end
end
 ye = Task(() -> nice())
funcs = Function[]
for k in [1:2]
push!(funcs,consume(ye))
end

println(funcs[1]())
println(funcs[2]())



[julia-users] ANN: Meetup for the Bay area Julia users on Feb 21 evening

2015-02-16 Thread Xiaodong Qi
Dear Julia users in the Bay area,

I am glad to announce a meetup session in Berkeley, California, USA on Feb 
21. There are three invited speakers talking about their experiences on 
using Julia for optimization, statistics, parallel computing and quantum 
science applications. People from SQuInT (southwest quantum information 
network) workshop, developers and researchers from Stanford and universities 
nearby are also invited for discussions of developing related Julia 
packages during the free interaction session starting from 9:25pm. 

Time: 7:30pm-10:00pm. 
Place: Room Berkeley, DoubleTree Hilton Hotel, 200 Marina Blvd. Berkeley, 
California 94710 USA
Register: http://goo.gl/forms/T5qnGPndSE

Talks: 

Talk 1: Predictive Analysis in Julia - An overview of the
JuMP package for optimization
Speaker: Philip Thomas from StaffJoy
Content: This talk focuses on expressing problems including linear
programming and integer programming in the JuMP metalanguage. Possibly
with some introduction to general optimization problems.

Talk 2: Convex.jl: Optimization for Everyone
Speakers: David Deng and Karanveer, possibly also with Jenny Hong
and Madeleine Udell from Stanford.
Content: This talk will start with a brief overview of how the
Convex.jl package works and the types of problems it can solve, and
really showcase how convenient it is to use. It will be clear that
Convex.jl is easily usable by just about anyone for their basic
optimization needs. One or two more involved applications of using
Convex.jl to solve real world problems will be demonstrated from a good
pool of examples. Hopefully there will be an example on quantum tomography.

Talk 3: Quantum Statistical Simulations with Julia
Speaker: Katharine Hyatt from UCSB
Content: Using computers to probe quantum systems is becoming more
and more common in condensed matter physics research. Many of the
commonly used languages and techniques in this space are either
difficult to learn or not performant. Julia has allowed us to quickly
develop and test codes for a variety of commonly used algorithms,
including exact diagonalization and quantum Monte Carlo. Its parallel
features, including some MPI and GPGPU integration, make it particularly
attractive for many quantum simulation problems. I'll discuss what
features of Julia have been most useful for us when working on these
simulations and the developments we're most excited about.

More details can be found here: 
https://github.com/JuliaQuantum/JuliaQuantum.github.io/issues/15

Feel free to forward this message to anyone who might be interested. Thanks.

Contact: JuliaQuantum organization via  quantumjulia AT gmail DoT com


[julia-users] Re: geting the pointer of composite type to send to C

2015-02-16 Thread J Luis
OK, I found it (actually, I remembered it). The trick is simply to do 

ptrptr = pointer([M])
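
For completeness, a sketch of how that might be used in the call from the question
below; "libgmt" is only a placeholder for whatever library name the wrapper actually
uses, and the array wrapper is kept in a variable so the pointer stays valid for the
duration of the call:

Mref = [M]                       # box M so we can take a pointer to it
ptrptr = pointer(Mref)           # Ptr{Ptr{GMT.GMT_MATRIX}}, a void** on the C side
status = ccall((:GMT_Destroy_Data, "libgmt"), Cint,
               (Ptr{Void}, Ptr{Void}), API, ptrptr)
status == GMT_NOERROR || error("GMT_Destroy_Data failed")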



segunda-feira, 16 de Fevereiro de 2015 às 19:31:53 UTC, J Luis escreveu:

 Hi,

  Sorry, but I'm in trouble again and need help with pointers and 
  conversing with C

 On C, I have

   if (GMT_Destroy_Data (API, M) != GMT_NOERROR)

 and the prototype of GMT_Destroy_Data() is (void *, void *)

 in Julia I need to get the pointer of that M

 typeof(M) = Ptr{GMT.GMT_MATRIX}

 so I could do

 if (GMT_Destroy_Data(API, ptr) != GMT_NOERROR)

 but I'm not able to get that 'ptr'.

 (M is a C struct that I'm able to see correctly in the Julia side)



Re: [julia-users] Re: Distance computation between two vectors versus a column in a matrix and a vector.

2015-02-16 Thread Andreas Noack
You could use `AbstractVecOrMat` and `idx=1` as default value, but you
could also use `sub` or `slice` to avoid copying the matrix columns since

julia> A = randn(2,2)
2x2 Array{Float64,2}:
 -0.124018   1.3846
 -0.259116  -1.19279

julia> isa(sub(A, :,1), AbstractVector{Float64})
true
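
For what it's worth, a sketch of the first suggestion, a single method over
`AbstractVecOrMat` with a defaulted column index (untested; `zero(T)` replaces the
literal `0.0` so the accumulator matches the element type):

function euclidean_distance_red{T <: FloatingPoint}(data::AbstractVecOrMat{T},
                                                    point::AbstractVector{T},
                                                    idx::Int = 1)
    dist = zero(T)
    @inbounds for i = 1:length(point)
        dist += abs2(data[i, idx] - point[i])
    end
    return dist
end

# euclidean_distance_red(v, point)       # vector-vector (idx defaults to 1)
# euclidean_distance_red(A, point, 3)    # column 3 of A versus point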

2015-02-16 8:11 GMT-05:00 Kristoffer Carlsson kcarlsso...@gmail.com:

 Actually the best thing would be if I could call a function with the

 dist(point_1::AbstractVector{T}, point_2::AbstractVector{T})

 signature because then I could just use all the stuff in the Distances.jl
 package



 On Monday, February 16, 2015 at 2:05:17 PM UTC+1, Kristoffer Carlsson
 wrote:

 I am writing some code where I sometimes want to calculate the distance
 between two vectors and sometimes the distance between a column in a matrix
 and another vector.

 This is in a very hot spot of my program so any unnecessary copying is
 unacceptable.

 What I am currently doing is basically having two versions of the
 distance function, one for vector-vector and one for column-vector:

 # Reduced euclidian distances

 # Vector-Vector
 @inline function euclidean_distance_red{T <: FloatingPoint}(point_1::
 AbstractVector{T},
 point_2::
 AbstractVector{T})
 dist = 0.0
 for i = 1:length(point_1)
 @inbounds dist += abs2(point_1[i] - point_2[i])
 end
 return dist
 end

 # Column-Vector
 @inline function euclidean_distance_red{T <: FloatingPoint}(data::
 AbstractMatrix{T},
 point::
 AbstractVector{T},
 idx::Int)
 dist = 0.0
 for i = 1:length(point)
 @inbounds dist += abs2(data[i, idx] - point[i])
 end
 return dist
 end


 This feels unnecessary since the transformation from the second to the
 first function is obvious. Is there anyway I can collapse these two
 functions into one that will come with zero performance loss?

 Best regards,
 Kristoffer Carlsson




[julia-users] Calling functions that expect a Function type argument with a FastAnonymous function instead

2015-02-16 Thread Yakir Gagnon


Tried to make the title as explanatory as possible...

Say I have an anonymous function and I @anonize it with FastAnonymous. I 
then want to pass that anonized function to another function from *some 
other module*. That other function is expecting a Function type as one of 
its arguments, not the ##integer type from FastAnonymous. 
Even if I declare that other module's function as:

function myfunction{f}(::Type{f}, args...)
    # Do stuff using f just like an ordinary function
end

f's type is still ##9215 (or something) and I can't really pass it to the 
original myfunction without rewriting myfunction, which really isn't *my 
function* but comes from another module.

I must have misunderstood something. 

Thanks in advance!

P.S.
I'm trying to anonize a function that I want to pass to optimize (with 
Optim.jl).


Re: [julia-users] Calling functions that expect a Function type argument with a FastAnonymous function instead

2015-02-16 Thread Yakir Gagnon
I'll do both. So basically I need to (manually) add an optimize method with 
an argument DataType instead of the normal Function. Not too crazy.

On Tuesday, February 17, 2015 at 1:31:28 PM UTC+10, Tim Holy wrote:

 The ##9215 is just a name (a gensym(), see the Metaprogramming section of 
 the 
 manual) for a hidden type that gets created. 

 Unfortunately, currently there isn't a good alternative to modifying the 
 call 
 syntax of the function in Optim. You could open an issue over there to 
 discuss 
 it. (Of course, you can always make such changes on your own machine 
 without 
 asking for permission :-). ) 

 --Tim 

 On Monday, February 16, 2015 07:21:45 PM Yakir Gagnon wrote: 
  Tried to make the title as explanatory as possible... 
  
  Say I have an anonymous function and I @anonize it with FastAnonymous. I 
  then want to pass that anonized function to another function from *some 
  other module*. That other function is expecting a Function type as one 
 of 
  its arguments, not the ##integer type from FastAnonymous. 
  Even if I declare that other module's function as: 
  
  function myfunction{f}(::Type{f}, args...) 
  # Do stuff using f just like an ordinary function
  end 
  
  f's type is still ##9215 (or something) and I can't really pass it to 
 the 
  original myfunction without rewriting myfunction, which really isn't *my 
  function* but comes from another module. 
  
  I must have misunderstood something. 
  
  Thanks in advance! 
  
  P.S. 
  I'm trying to anonize a function that I want to pass to optimize (with 
  Optim.jl). 



Re: [julia-users] Calling functions that expect a Function type argument with a FastAnonymous function instead

2015-02-16 Thread Tim Holy
The ##9215 is just a name (a gensym(), see the Metaprogramming section of the 
manual) for a hidden type that gets created.

Unfortunately, currently there isn't a good alternative to modifying the call 
syntax of the function in Optim. You could open an issue over there to discuss 
it. (Of course, you can always make such changes on your own machine without 
asking for permission :-). )

--Tim

On Monday, February 16, 2015 07:21:45 PM Yakir Gagnon wrote:
 Tried to make the title as explanatory as possible...
 
 Say I have an anonymous function and I @anonize it with FastAnonymous. I
 then want to pass that anonized function to another function from *some
 other module*. That other function is expecting a Function type as one of
 its arguments, not the ##integer type from FastAnonymous.
 Even if I declare that other module's function as:
 
 function myfunction{f}(::Type{f}, args...)
 # Do stuff using f just like an ordinary function
 end
 
 f's type is still ##9215 (or something) and I can't really pass it to the
 original myfunction without rewriting myfunction, which really isn't *my
 function* but comes from another module.
 
 I must have misunderstood something.
 
 Thanks in advance!
 
 P.S.
 I'm trying to anonize a function that I want to pass to optimize (with
 Optim.jl).



[julia-users] Distance computation between two vectors versus a column in a matrix and a vector.

2015-02-16 Thread Kristoffer Carlsson
I am writing some code where I sometimes want to calculate the distance 
between two vectors and sometimes the distance between a column in a matrix 
and another vector.

This is in a very hot spot of my program so any unnecessary copying is 
unacceptable.

What I am currently doing is basically having two versions of the distance 
function, one for vector-vector and one for column-vector:

# Reduced euclidian distances

# Vector-Vector
@inline function euclidean_distance_red{T <: FloatingPoint}(point_1::
AbstractVector{T},
point_2::
AbstractVector{T})
dist = 0.0
for i = 1:length(point_1)
@inbounds dist += abs2(point_1[i] - point_2[i])
end
return dist
end

# Column-Vector
@inline function euclidean_distance_red{T <: FloatingPoint}(data::
AbstractMatrix{T},
point::
AbstractVector{T},
idx::Int)
dist = 0.0
for i = 1:length(point)
@inbounds dist += abs2(data[i, idx] - point[i])
end
return dist
end


This feels unnecessary since the transformation from the second to the 
first function is obvious. Is there anyway I can collapse these two 
functions into one that will come with zero performance loss?

Best regards,
Kristoffer Carlsson


[julia-users] Re: Distance computation between two vectors versus a column in a matrix and a vector.

2015-02-16 Thread Kristoffer Carlsson
Actually the best thing would be if I could call a function with the

dist(point_1::AbstractVector{T}, point_2::AbstractVector{T})

signature because then I could just use all the stuff in the Distances.jl 
package


On Monday, February 16, 2015 at 2:05:17 PM UTC+1, Kristoffer Carlsson wrote:

 I am writing some code where I sometimes want to calculate the distance 
 between two vectors and sometimes the distance between a column in a matrix 
 and another vector.

 This is in a very hot spot of my program so any unnecessary copying is 
 unacceptable.

 What I am currently doing is basically having two versions of the distance 
 function, one for vector-vector and one for column-vector:

 # Reduced euclidian distances

 # Vector-Vector
 @inline function euclidean_distance_red{T <: FloatingPoint}(point_1::
 AbstractVector{T},
 point_2::
 AbstractVector{T})
 dist = 0.0
 for i = 1:length(point_1)
 @inbounds dist += abs2(point_1[i] - point_2[i])
 end
 return dist
 end

 # Column-Vector
 @inline function euclidean_distance_red{T <: FloatingPoint}(data::
 AbstractMatrix{T},
 point::
 AbstractVector{T},
 idx::Int)
 dist = 0.0
 for i = 1:length(point)
 @inbounds dist += abs2(data[i, idx] - point[i])
 end
 return dist
 end


 This feels unnecessary since the transformation from the second to the 
 first function is obvious. Is there anyway I can collapse these two 
 functions into one that will come with zero performance loss?

 Best regards,
 Kristoffer Carlsson



[julia-users] Re: [Winston] Can marker size be changed?

2015-02-16 Thread Ariel Keselman
scatter does the job. Just answered my own question :)



[julia-users] [Winston] Can marker size be changed?

2015-02-16 Thread Ariel Keselman
in Matlab for e.g.:

plot( , 'markersize',8)

Is this feature present in Winston?

Thanks!




[julia-users] Re: Values from Dict assigned to variables (symbols?) named as keys?

2015-02-16 Thread Martin Johansson
Thanks, that's a good point regarding the scope and not a side effect I 
would want.

If I understand the keyword part correctly, I would have to explicitly 
state all the variables in the function definition to be able to use them 
locally which, as you write, can become verbose (very verbose for many of 
my functions, actually). In addition, the Dict that I call the function 
with is longer preserved (correct?) as a variable. I want to supply that 
Dict to many other functions at various levels below the function I first 
call. So that will not work. I guess explicitly using the Dict is the best 
way forward. That's the way I do with structs in Matlab, but with a 
somewhat more compact notation, so I can live with that.

Regards, m

On Monday, February 16, 2015 at 12:01:24 PM UTC+1, Michael Hatherly wrote:

 One thing to note about using @eval as mentioned in Sean’s reply is that 
 eval (and by extension @eval) does not evaluate expressions in a 
 function’s scope, but rather in the global scope of the module.
 This might have unexpected results:

 julia> function f(d)
            for (k, v) in d
                @eval $k = $v
            end
        end
 f (generic function with 1 method)

 julia> x
 ERROR: UndefVarError: x not defined

 julia> f(Dict(:x => 1))

 julia> x
 1

 You could instead use a function with keyword arguments and pass it a 
 dictionary:

 julia> g(; x = error("provide x"), args...) = nothing
 g (generic function with 1 method)

 where args... discards any variables you don’t want to handle. x could 
 also be given a default value rather than throwing an error when not 
 provided.
 You’d then call it like so:

 julia> g(; Dict(:x => 1, :y => 2)...)

 Although it’s slightly more verbose, this way won’t result in side-effects 
 caused by using eval.

 — Mike
 On Monday, 16 February 2015 00:17:54 UTC+2, Martin Johansson wrote:

 Is there a way to assign the values of a Dict to the corresponding keys 
 such that I get variables (symbols?) with names given by the keys? Let's 
 assume the keys are for example strings which would work as variables.

 The reason for why I would want to do this is to have functions where 
 explicit variables are used rather than Dicts, primarily for readability. 
 I'm suspecting that this is not a recommended way of doing things (since I 
 couldn't find any info along these lines when searching), and if this is 
 the case please set me straight.

 Regards, m




Re: [julia-users] Re: Distance computation between two vectors versus a column in a matrix and a vector.

2015-02-16 Thread Kristoffer Carlsson
I tried with sub/slice a while ago and I think it gave me a very large 
performance decrease. I will try again when I get to a good computer to 
test it.

The `AbstractVecOrMat` with a default value for idx-method sounds 
interesting though. I will try it out.

Thank you.

On Monday, February 16, 2015 at 2:34:23 PM UTC+1, Andreas Noack wrote:

 You could use `AbstractVecOrMat` and `idx=1` as default value, but you 
 could also use `sub` or `slice` to avoid copying the matrix columns since

 julia> A = randn(2,2)
 2x2 Array{Float64,2}:
  -0.124018   1.3846
  -0.259116  -1.19279

 julia> isa(sub(A, :,1), AbstractVector{Float64})
 true

 2015-02-16 8:11 GMT-05:00 Kristoffer Carlsson kcarl...@gmail.com:

 Actually the best thing would be if I could call a function with the

 dist(point_1::AbstractVector{T}, point_2::AbstractVector{T})

 signature because then I could just use all the stuff in the Distances.jl 
 package



 On Monday, February 16, 2015 at 2:05:17 PM UTC+1, Kristoffer Carlsson 
 wrote:

 I am writing some code where I sometimes want to calculate the distance 
 between two vectors and sometimes the distance between a column in a matrix 
 and another vector.

 This is in a very hot spot of my program so any unnecessary copying is 
 unacceptable.

 What I am currently doing is basically having two versions of the 
 distance function, one for vector-vector and one for column-vector:

 # Reduced euclidian distances

 # Vector-Vector
 @inline function euclidean_distance_red{T <: FloatingPoint}(point_1::
 AbstractVector{T},
 point_2::
 AbstractVector{T})
 dist = 0.0
 for i = 1:length(point_1)
 @inbounds dist += abs2(point_1[i] - point_2[i])
 end
 return dist
 end

 # Column-Vector
 @inline function euclidean_distance_red{T <: FloatingPoint}(data::
 AbstractMatrix{T},
 point::
 AbstractVector{T},
 idx::Int)
 dist = 0.0
 for i = 1:length(point)
 @inbounds dist += abs2(data[i, idx] - point[i])
 end
 return dist
 end


 This feels unnecessary since the transformation from the second to the 
 first function is obvious. Is there anyway I can collapse these two 
 functions into one that will come with zero performance loss?

 Best regards,
 Kristoffer Carlsson




Re: [julia-users] Julia on Raspberry Pi 2

2015-02-16 Thread Viral Shah
This is annoying. I have no idea what to do about the gcc version here. 
Perhaps file an issue for now. Maybe it needs gcc 4.7 to get through.

-viral

On Monday, February 16, 2015 at 10:54:14 PM UTC+5:30, Sto Forest wrote:

 Thank you very much for your help, Viral.


 *GCC* is set to the latest version

 *root@pithree:/opt/julia/deps/openlibm/src# gcc --version*
 *gcc (Raspbian 4.8.2-21~rpi3rpi1) 4.8.2*
 *Copyright (C) 2013 Free Software Foundation, Inc.*


 I've added the override to Make.user.

 I pulled the latest version and ran make again with compiler version 
 checks disabled.

 *ARM.inc*
 *...*
 override LLVM_ASSERTIONS=1
 LLVM_FLAGS+=--with-cpu=cortex-a9 --disable-compiler-version-checks 
 --with-float=hard --with-abi=aapcs-vfp --with-fpu=neon --enable-targets=arm 
 --enable-optimized --enable-assertions
 ...



 There still seems to be a "compiler is not new enough" error for LLVM:

 checking build system type... arm-unknown-linux-gnueabihf
 checking host system type... arm-unknown-linux-gnueabihf
 checking target system type... arm-unknown-linux-gnueabihf
 checking type of operating system we're going to host on... Linux
 checking type of operating system we're going to target... Linux
 checking target architecture... ARM
 checking whether GCC is new enough... no
 configure: error:
 The selected GCC C++ compiler is not new enough to build LLVM. Please 
 upgrade
 to GCC 4.7. You may pass --disable-compiler-version-checks to configure to
 bypass these sanity checks.
 Makefile:528: recipe for target 
 'llvm-3.5.1/build_Release+Asserts/config.status' failed
 make[1]: *** [llvm-3.5.1/build_Release+Asserts/config.status] Error 1
 Makefile:64: recipe for target 'julia-deps' failed
 make: *** [julia-deps] Error 2
 root@pithree:/opt/julia# 

 On Monday, 16 February 2015 11:45:23 UTC, Viral Shah wrote:

 You can avoid the openlibm issue for now by adding

 override USE_SYSTEM_LIBM=1

 in your Make.user. Can you also file this issue in the openlibm github 
 repo, so that it can be fixed. Should be easy.

 I wonder why your build is picking up llvm 3.5.0, since ARM.inc uses 
 3.5.1 now. I don't know if that will fix the build problem. Make sure that 
 your new gcc is the default (with gcc -v), and if still trouble, add 
 --disable-compiler-version-checks to LLVM_FLAGS in the ARM.inc.

 -viral

 On Monday, February 16, 2015 at 4:12:06 PM UTC+5:30, Sto Forest wrote:

 After adding in a couple of dependencies, gfortran, cmake, compilation 
 is getting further.

 There are two current problems:

 *openlibm*

 s_creall.c:(.text+0x0): multiple definition of `creal'
 src/s_creal.c.o:s_creal.c:(.text+0x0): first defined here
 collect2: error: ld returned 1 exit status
 Makefile:35: recipe for target 'libopenlibm.so' failed
 make[3]: *** [libopenlibm.so] Error 1
 Makefile:686: recipe for target 'openlibm/libopenlibm.so' failed
 make[2]: *** [openlibm/libopenlibm.so] Error 2
 make[2]: *** Waiting for unfinished jobs
 ar: creating libcerbla.a


 *LLVM*
 I upgraded gcc to version *gcc (Raspbian 4.8.2-21~rpi3rpi1) 4.8.2.*

 However, when running *make* in /opt/julia it errors over the compiler 
 version not being new enough:

 checking whether GCC is new enough... no
 configure: error:
 The selected GCC C++ compiler is not new enough to build LLVM. Please 
 upgrade
 to GCC 4.7. You may pass --disable-compiler-version-checks to configure 
 to
 bypass these sanity checks.
 Makefile:508: recipe for target 
 'llvm-3.5.0/build_Release+Asserts/config.status' failed
 make[2]: *** [llvm-3.5.0/build_Release+Asserts/config.status] Error 1
 Making all in UTIL


 Any suggestions from more knowledgeable people would be welcomed :)

 On Sunday, 15 February 2015 19:04:02 UTC, Sto Forest wrote:

 Thanks Steve I'll give that a try and see how far I get. :)


 On Sunday, 15 February 2015 01:06:39 UTC, Steve Kelly wrote:

 Sto, 

 I got Julia running on a BeagleBone Black running Debian Jessie a 
 couple months back using this process: 
 https://github.com/JuliaLang/julia/blob/master/README.arm.md. It 
 depends on a few system libraries to run, so I needed to update from 
 Wheezy 
 to Jessie so it would work. I think some improvements have been made 
 since 
 then so the build is more self contained. I am pretty sure Raspbian is 
 based on Wheezy, but it might be worth a shot with the latest master.

 Best,
 Steve

 On Sat, Feb 14, 2015 at 3:11 PM, Sto Forest stochast...@gmail.com 
 wrote:

 Is there a way to get Julia running on the new Raspberry Pi 2, 
 perhaps under raspbian ? 





[julia-users] Re: Values from Dict assigned to variables (symbols?) named as keys?

2015-02-16 Thread Michael Hatherly


I would have to explicitly state all the variables

Yes, you would.

the Dict that I call the function with is longer “preserved” (correct?) as 
a variable

Assuming this was meant to read “is no longer”, it would not be possible to 
pass it to additional calls.

You might be able to use a composite type (ie. type) if the number and name 
of fields don’t change arbitrarily. Something like:

type State
    x::Int
    y::Int
    z::Float64
    # ...
end

and then pass the entire state object as one of the arguments to your 
methods.

It should also be possible to “unpack” the fields of the state into local 
variables using something like:

macro vars(t, T)
    out = Expr(:block)
    for field in names(getfield(current_module(), T))
        push!(out.args, :($(field) = $(t).$(field)))
    end
    esc(out)
end

and then unpack the fields in the required methods:

function f(s::State)
    @vars s State
    x + y + z
end

(Disclaimer: I’ve not tried to use that @vars macro yet.)

— Mike

On Monday, 16 February 2015 15:13:25 UTC+2, Martin Johansson wrote:

 Thanks, that's a good point regarding the scope and not a side effect I 
 would want.

 If I understand the keyword part correctly, I would have to explicitly 
 state all the variables in the function definition to be able to use them 
 locally which, as you write, can become verbose (very verbose for many of 
 my functions, actually). In addition, the Dict that I call the function 
 with is longer preserved (correct?) as a variable. I want to supply that 
 Dict to many other functions at various levels below the function I first 
 call. So that will not work. I guess explicitly using the Dict is the best 
 way forward. That's the way I do with structs in Matlab, but with a 
 somewhat more compact notation, so I can live with that.

 Regards, m



[julia-users] Re: Levi-Civita symbol/tensor

2015-02-16 Thread lapeyre . math122a
Maybe it's too late, but here is a fast implementation giving the signature 
of the output of randperm(n)

https://github.com/jlapeyre/PermPlain.jl/blob/master/src/PermPlain.jl#L271
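
For anyone who just wants something inline, here is a small self-contained sketch
(not the PermPlain code) that computes the Levi-Civita symbol by counting
transpositions; cheap for the usual three-index case:

function levicivita(p::AbstractVector{Int})
    n = length(p)
    sort(p) == collect(1:n) || return 0   # repeated or out-of-range index => 0
    sgn = 1
    q = copy(p)
    for i = 1:n
        while q[i] != i                    # put i in place one swap at a time
            j = q[i]
            q[i], q[j] = q[j], q[i]
            sgn = -sgn
        end
    end
    return sgn
end

levicivita([1,2,3])   # 1
levicivita([2,1,3])   # -1
levicivita([1,1,2])   # 0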


On Monday, February 9, 2015 at 8:41:58 PM UTC+1, Blake Johnson wrote:

 I keep writing code that needs the Levi-Civita symbol, i.e.:

 \epsilon_{ijk} = { 1 if (i,j,k) is an even permutation, -1 if (i,j,k) is 
 an odd permutation, otherwise 0 }

 It is used frequently in physics, so I keep expecting it to be in Julia's 
 stdlib, perhaps under a different name (in Mathematica, it is called 
 Signature[]). But, if it is there, I haven't found it yet. Is it there 
 and I just can't find it?



[julia-users] Re: Levi-Civita symbol/tensor

2015-02-16 Thread lapeyre . math122a
I should note that a couple of routines in PermPlain are not MIT license 
compatible.  But all the code to compute
the signature of a permutation most definitely is MIT compatible. I wrote 
it, although the algorithm is well known.

There is also this code which calls PermPlain:
https://github.com/jlapeyre/PermutationsA.jl

On Monday, February 16, 2015 at 4:12:10 PM UTC+1, lapeyre@gmail.com 
wrote:

 Maybe it's too late, but here is a fast implementation giving the signature 
 of the output of randperm(n)

 https://github.com/jlapeyre/PermPlain.jl/blob/master/src/PermPlain.jl#L271


 On Monday, February 9, 2015 at 8:41:58 PM UTC+1, Blake Johnson wrote:

 I keep writing code that needs the Levi-Civita symbol, i.e.:

 \epsilon_{ijk} = { 1 if (i,j,k) is an even permutation, -1 if (i,j,k) is 
 an odd permutation, otherwise 0 }

 It is used frequently in physics, so I keep expecting it to be in Julia's 
 stdlib, perhaps under a different name (in Mathematica, it is called 
 Signature[]). But, if it is there, I haven't found it yet. Is it there 
 and I just can't find it?



[julia-users] Re: Values from Dict assigned to variables (symbols?) named as keys?

2015-02-16 Thread Johan Sigfrids
Could you not use a composite type for this? It would seem more Julian to me. 

[julia-users] Re: Values from Dict assigned to variables (symbols?) named as keys?

2015-02-16 Thread Michael Hatherly


One thing to note about using @eval as mentioned in Sean’s reply is that 
eval (and by extension @eval) does not evaluate expressions in a function’s 
scope, but rather in the global scope of the module.
This might have unexpected results:

julia> function f(d)
           for (k, v) in d
               @eval $k = $v
           end
       end
f (generic function with 1 method)

julia> x
ERROR: UndefVarError: x not defined

julia> f(Dict(:x => 1))

julia> x
1

You could instead use a function with keyword arguments and pass it a 
dictionary:

julia> g(; x = error("provide x"), args...) = nothing
g (generic function with 1 method)

where args... discards any variables you don’t want to handle. x could also 
be given a default value rather than throwing an error when not provided.
You’d then call it like so:

julia> g(; Dict(:x => 1, :y => 2)...)

Although it’s slightly more verbose, this way won’t result in side-effects 
caused by using eval.

— Mike
On Monday, 16 February 2015 00:17:54 UTC+2, Martin Johansson wrote:

 Is there a way to assign the values of a Dict to the corresponding keys 
 such that I get variables (symbols?) with names given by the keys? Let's 
 assume the keys are for example strings which would work as variables.

 The reason for why I would want to do this is to have functions where 
 explicit variables are used rather than Dicts, primarily for readability. 
 I'm suspecting that this is not a recommended way of doing things (since I 
 couldn't find any info along these lines when searching), and if this is 
 the case please set me straight.

 Regards, m




[julia-users] Re: statistical accumulator

2015-02-16 Thread David van Leeuwen
Alternatively, there still is https://github.com/davidavdav/BigData.jl 
which has some infrastructure for storing large data on disc, and there is 
a stats() function that does the accumulation for you. 

---david
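
For reference, a minimal self-contained sketch of the kind of streaming accumulator
being asked about below (Welford's online mean/variance; not taken from BigData.jl
or StreamStats.jl):

import Base: push!

type RunningStats
    n::Int
    mean::Float64
    m2::Float64
end
RunningStats() = RunningStats(0, 0.0, 0.0)

function push!(s::RunningStats, x::Real)
    s.n += 1
    delta = x - s.mean
    s.mean += delta / s.n
    s.m2 += delta * (x - s.mean)   # uses the updated mean
    return s
end

variance(s::RunningStats) = s.n > 1 ? s.m2 / (s.n - 1) : NaN

s = RunningStats()
for x in randn(10^6)
    push!(s, x)
end
println((s.mean, variance(s)))     # roughly (0.0, 1.0)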

On Thursday, February 12, 2015 at 6:26:03 PM UTC+1, Christian Peel wrote:

 Perfect!  And just in time.
 Thanks

 On Wednesday, February 11, 2015 at 9:18:05 PM UTC-8, Iain Dunning wrote:

 JMW just released StreamStats.jl:
 https://github.com/johnmyleswhite/StreamStats.jl

 Which is what you want I think?

 Cheers,
 Iain

 On Wednesday, February 11, 2015 at 10:53:10 PM UTC-5, Christian Peel 
 wrote:

 I'm curious if someone has implemented a statistical accumulator in 
 julia similar to that in boost:
 http://www.boost.org/doc/libs/1_55_0/doc/html/accumulators.html

 I'm aware of the accumulator in DataStructures.jl, but if I read it 
 right it doesn't do statistical accumulation, just a running sum or a 
 running histogram. Looking at accumulator.jl (
 https://github.com/JuliaLang/DataStructures.jl/blob/master/src/accumulator.jl
  
 ) I see a + symbol at the end
 push!{T,V<:Number}(ct::Accumulator{T,V}, x::T, a::V) = (ct.map[x] = 
 ct[x] + a)
 I'm looking for code that can (for example) calculate the variance on 
 the fly using only the second moment and mean as illustrated in eq 1.21 of 
 this page from the boost docs:  
 http://www.boost.org/doc/libs/1_55_0/doc/html/boost/accumulators/impl/lazy_variance_impl.html
 I've done this before in Matlab, just don't want to repeat it in Julia 
 if I don't need to.

 Thanks!



Re: [julia-users] Julia on Raspberry Pi 2

2015-02-16 Thread Viral Shah
You can avoid the openlibm issue for now by adding

override USE_SYSTEM_LIBM=1

in your Make.user. Can you also file this issue in the openlibm github 
repo, so that it can be fixed. Should be easy.

I wonder why your build is picking up llvm 3.5.0, since ARM.inc uses 3.5.1 
now. I don't know if that will fix the build problem. Make sure that your 
new gcc is the default (with gcc -v), and if still trouble, add 
--disable-compiler-version-checks to LLVM_FLAGS in the ARM.inc.

-viral

On Monday, February 16, 2015 at 4:12:06 PM UTC+5:30, Sto Forest wrote:

 After adding in a couple of dependencies, gfortran, cmake, compilation is 
 getting further.

 There are two current problems:

 *openlibm*

 s_creall.c:(.text+0x0): multiple definition of `creal'
 src/s_creal.c.o:s_creal.c:(.text+0x0): first defined here
 collect2: error: ld returned 1 exit status
 Makefile:35: recipe for target 'libopenlibm.so' failed
 make[3]: *** [libopenlibm.so] Error 1
 Makefile:686: recipe for target 'openlibm/libopenlibm.so' failed
 make[2]: *** [openlibm/libopenlibm.so] Error 2
 make[2]: *** Waiting for unfinished jobs
 ar: creating libcerbla.a


 *LLVM*
 I upgraded gcc to version *gcc (Raspbian 4.8.2-21~rpi3rpi1) 4.8.2.*

 However, when running *make* in /opt/julia it errors over the compiler 
 version not being new enough:

 checking whether GCC is new enough... no
 configure: error:
 The selected GCC C++ compiler is not new enough to build LLVM. Please 
 upgrade
 to GCC 4.7. You may pass --disable-compiler-version-checks to configure to
 bypass these sanity checks.
 Makefile:508: recipe for target 
 'llvm-3.5.0/build_Release+Asserts/config.status' failed
 make[2]: *** [llvm-3.5.0/build_Release+Asserts/config.status] Error 1
 Making all in UTIL


 Any suggestions from more knowledgeable people would be welcomed :)

 On Sunday, 15 February 2015 19:04:02 UTC, Sto Forest wrote:

 Thanks Steve I'll give that a try and see how far I get. :)


 On Sunday, 15 February 2015 01:06:39 UTC, Steve Kelly wrote:

 Sto, 

 I got Julia running on a BeagleBone Black running Debian Jessie a couple 
 months back using this process: 
 https://github.com/JuliaLang/julia/blob/master/README.arm.md. It 
 depends on a few system libraries to run, so I needed to update from Wheezy 
 to Jessie so it would work. I think some improvements have been made since 
 then so the build is more self contained. I am pretty sure Raspbian is 
 based on Wheezy, but it might be worth a shot with the latest master.

 Best,
 Steve

 On Sat, Feb 14, 2015 at 3:11 PM, Sto Forest stochast...@gmail.com 
 wrote:

 Is there a way to get Julia running on the new Raspberry Pi 2, perhaps 
 under raspbian ?