[julia-users] Immutable type with a function datatype

2016-01-18 Thread Anonymous
Is the following code considered bad form in Julia?

immutable Foo
func::Function
end

foo = Foo(x->x^2)
foo.func(3)

This mimics the behavior of OOP since, just like in OOP, the internal method 
cannot be changed (since the type is immutable).  Sometimes it really does 
make the most sense to attach a function to an instance of a type. Do I 
take a performance hit doing things this way?


Re: [julia-users] @everywhere using Images gives error

2016-01-18 Thread Tim Holy
Notice all those messages about ImageMagick not being installed? E.g.,

WARNING: FileIO.NotInstalledError(:ImageMagick,"")

You can fix your problem by installing it.

If you try your code first in a single process, FileIO will prompt you to 
install ImageMagick. This doesn't work in a multiprocess situation because 
there is no terminal available on the workers.

Best,
--Tim
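
A minimal non-interactive sketch of that fix, assuming each machine keeps its own package directory (only the package name comes from the warning above; the rest is illustrative):

# Install the missing package on every process, then load it everywhere.
@everywhere Pkg.add("ImageMagick")      # runs on the master and on each worker
@everywhere using Images, ImageMagick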


On Monday, January 18, 2016 05:32:46 AM Abhinanda Ranjit wrote:
> Hi all,
> 
> I set a Julia cluster on Windows 7 machines.
> Julia  Version 0.4.2
> 
> However, using the Images package on all nodes gives an error.
> 
> My code is:
>  addprocs(["user@x.x.x.x"], tunnel = true, dir = "C:\\Julia-0.4.2\\bin",
> exename = "julia")
>  @everywhere using Images
>  @spawnat 2 load("image.bmp")
> 
>  I get the following error :
> julia> @spawnat 2 load("E:\\Cadenza_Files\\Cadenza_Node_Files\\image.bmp")
> RemoteRef{Channel{Any}}(2,1,5)
> 
> julia> WARNING: FileIO.NotInstalledError(:ImageMagick,"")
>  in checked_import at
> C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:12
>  in load at C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:76
>  in load at C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:42
>  in anonymous at multi.jl:1358
>  in anonymous at multi.jl:904
>  in run_work_thunk at multi.jl:645
>  in run_work_thunk at multi.jl:654
>  in anonymous at task.jl:58
> From worker 2:  Library ImageMagick is not installed but can load
> format: FileIO.File{FileIO.DataFormat{:BMP}}("E:\\Cadenz
> a_Files\\Cadenza_Node_Files\\image.bmp")
> julia>
> 
> julia> fetch(ans)
> ERROR: On worker 2:
> ImageMagick is not installed.
> 
>  in checked_import at
> C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:12
>  in load at C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:76
>  in load at C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:42
>  in anonymous at multi.jl:1358
>  in anonymous at multi.jl:904
>  in run_work_thunk at multi.jl:645
>  in run_work_thunk at multi.jl:654
>  in anonymous at task.jl:58
>  in remotecall_fetch at multi.jl:731
>  in call_on_owner at multi.jl:777
>  in fetch at multi.jl:795
> 
> 
> I am able to load Images on each node locally.
> 
> Please help.
> 
> Thanks
> Abhinanda



[julia-users] @everywhere using Images gives error

2016-01-18 Thread Abhinanda Ranjit
Hi all,

I set up a Julia cluster on Windows 7 machines.
Julia  Version 0.4.2

However, using the Images package on all nodes gives an error.

My code is:
 addprocs(["user@x.x.x.x"], tunnel = true, dir = "C:\\Julia-0.4.2\\bin",
 exename = "julia")
 @everywhere using Images
 @spawnat 2 load("image.bmp")

 I get the following error : 
julia> @spawnat 2 load("E:\\Cadenza_Files\\Cadenza_Node_Files\\image.bmp")
RemoteRef{Channel{Any}}(2,1,5)

julia> WARNING: FileIO.NotInstalledError(:ImageMagick,"")
 in checked_import at 
C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:12
 in load at C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:76
 in load at C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:42
 in anonymous at multi.jl:1358
 in anonymous at multi.jl:904
 in run_work_thunk at multi.jl:645
 in run_work_thunk at multi.jl:654
 in anonymous at task.jl:58
From worker 2:  Library ImageMagick is not installed but can load 
format: FileIO.File{FileIO.DataFormat{:BMP}}("E:\\Cadenz
a_Files\\Cadenza_Node_Files\\image.bmp")
julia>

julia> fetch(ans)
ERROR: On worker 2:
ImageMagick is not installed.

 in checked_import at 
C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:12
 in load at C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:76
 in load at C:\Users\abhinanda.ranjit\.julia\v0.4\FileIO\src\loadsave.jl:42
 in anonymous at multi.jl:1358
 in anonymous at multi.jl:904
 in run_work_thunk at multi.jl:645
 in run_work_thunk at multi.jl:654
 in anonymous at task.jl:58
 in remotecall_fetch at multi.jl:731
 in call_on_owner at multi.jl:777
 in fetch at multi.jl:795


I am able to load Images on each node locally.

Please help.

Thanks 
Abhinanda



[julia-users] Re: Expression object for "unquote"

2016-01-18 Thread Ismael Venegas Castelló
That's because in the case of expressions we are interested in the AST; the 
show representation is just an abstraction of that. There are also dump and 
xdump:

julia> ex = :(:($x))
:($(Expr(:quote, :($(Expr(:$, :x))))))

julia> Meta.show_sexpr(ex); println()
(:quote, (:$, :x))

julia> dump(ex)

Expr  
  head: Symbol quote
  args: Array(Any,(1,))
1: Expr  
  head: Symbol $
  args: Array(Any,(1,))
1: Symbol x
  typ: Any
  typ: Any

julia> ex.args[1].args[1] = :y
:y

julia> ex
:($(Expr(:quote, :($(Expr(:$, :y))))))

julia> Meta.show_sexpr(ex); println()
(:quote, (:$, :y))

julia> dump(ex)
Expr  
  head: Symbol quote
  args: Array(Any,(1,))
1: Expr  
  head: Symbol $
  args: Array(Any,(1,))
1: Symbol y
  typ: Any
  typ: Any

You could overwrite those methods or write your own that prints them the 
way you want to.

El domingo, 17 de enero de 2016, 1:50:05 (UTC-6), vis...@stanford.edu 
escribió:
>
> Hi!
>
> Was messing around with exceptions, and trying to see under the hood and 
> construct macro expressions. 
>
> Eg, Meta.show_sexpr(:(:(+ 1 2))) -> (:quote, (:call, :+, 1, 2))
>
> but how do you build an unquote expression?
>
> Meta.show_sexpr(:(:($(x+5) + 1))) -> (:quote, (:call, :+, (:$, (:call, :+, 
> :x, 5)), 1))
>
>
> But Expr(:quote, Expr(:call, :+, Expr(:$, Expr(:call, :+, :x, 5)), 1)) -> 
> :($(Expr(:quote, :($(Expr(:$, :(x + 5))) + 1))))
>
> In other words, it's not processing into the appropriate expression, and 
> there's some weird intermediate syntax going on. 
>
>
> How does one build an unquote expression? I.e, an expression that would 
> eval to unquoting a variable?
>


Re: [julia-users] Immutable type with a function datatype

2016-01-18 Thread Yichao Yu
On Mon, Jan 18, 2016 at 10:08 AM, Anonymous  wrote:
> Is the following code considered bad form in Julia?

Yes.

>
> immutable Foo
> func::Function
> end
>
> foo = Foo(x->x^2)
> foo.func(3)
>
> This mimics the behavior of OOP since, just like in OOP, the internal method
> cannot be changed (since the type is immutable).  Sometimes it really does
> make the most sense to attach a function to an instance of a type. Do I take
> a performance hit doing things this way?

You do, by a huge amount.

What I've tried, and which kind of works, is basically:

immutable FooFunc{T}
    self::T
end

type Foo
    func::FooFunc{Foo}
    Foo() = (self = new(); self.func = FooFunc{Foo}(self); self)
end

call(func::FooFunc{Foo}, x) = x^2

This can be simpler with the closure improvement and getfield overload.
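
A short usage sketch of the pattern above (Julia 0.4). Note that adding methods to call requires `import Base: call` before the definition shown above:

import Base: call      # so that call(::FooFunc{Foo}, x) extends Base.call

foo = Foo()
foo.func(3)            # dispatches to call(::FooFunc{Foo}, 3) and returns 9
foo.func.self === foo  # the "functor" carries a reference back to its owner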


[julia-users] Re: help with a macro

2016-01-18 Thread Jeffrey Sarnoff
If you revise the macro as Stefan suggests, would you post the revision as 
a response here?

On Thursday, January 14, 2016 at 8:09:23 AM UTC-5, 
richard@maths.ox.ac.uk wrote:
>
> This macro:
>
> macro clenshaw(x, c...)
>     bk1, bk2 = :(zero(t)), :(zero(t))
>     N = length(c)
>     for k = N:-1:2
>         bk2, bk1 = bk1, :(muladd(t, $bk1, $(esc(c[k])) - $bk2))
>     end
>     ex = :(muladd(t/2, $bk1, $(esc(c[1])) - $bk2))
>     Expr(:block, :(t = $(esc(2)) * $(esc(x))), ex)
> end
>
> implements Clenshaw's algorithm to sum Chebyshev series. It successfully 
> "unrolls" the loop, but is impractical for more than 24 coefficients. The 
> resulting LLVM code is theoretically only 50% longer than unrolling 
> Horner's rule:
>
> f(x) = 
> @evalpoly(x,1.0,1/2,1/3,1/4,1/5,1/6,1/7,1/8,1/9,1/10,1/11,1/12,1/13,1/14,1/15,1/16,1/17,1/18,1/19,1/20)
>
> @code_llvm f(1.0)
>
> g(x) = 
> @clenshaw(x,1.0,1/2,1/3,1/4,1/5,1/6,1/7,1/8,1/9,1/10,1/11,1/12,1/13,1/14,1/15,1/16,1/17,1/18,1/19,1/20)
>
> @code_llvm g(1.0)
>
> How could I write the macro differently? How else could I end up with the 
> same efficient LLVM code? using a staged function?
>


Re: [julia-users] VirtualArrays.jl

2016-01-18 Thread Eric Davies


On Friday, 15 January 2016 17:14:26 UTC-6, Yichao Yu wrote:
>
> FYI, don't call eval in functions 
>

What's wrong with calling eval in functions when you're evaluating 
expressions without side-effects? 


[julia-users] Re: Project ideas in julia

2016-01-18 Thread Jeffrey Sarnoff
And where do your interests lie?

On Sunday, January 17, 2016 at 4:11:37 PM UTC-5, Patrick Kofod Mogensen 
wrote:
>
> What do you study?
>
> On Sunday, January 17, 2016 at 8:47:05 PM UTC+1, noufal n wrote:
>>
>> I'm a student and I wish to study and contribute to the Julia community. As 
>> part of my professional degree I would like to do a project in Julia. Need 
>> suggestions. 
>>
>

[julia-users] Re: Immutable type with a function datatype

2016-01-18 Thread Anonymous
Wow, that is crazy complicated.


Re: [julia-users] VirtualArrays.jl

2016-01-18 Thread Yichao Yu
On Mon, Jan 18, 2016 at 10:42 AM, Eric Davies  wrote:
>
>
> On Friday, 15 January 2016 17:14:26 UTC-6, Yichao Yu wrote:
>>
>> FYI, don't call eval in functions
>
>
> What's wrong with calling eval in functions when you're evaluating
> expressions without side-effects?

Orders of magnitude slower
Not necessary
Breaks type inference
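
A rough illustration of the first point; the functions are made up for the comparison and the timings are machine-dependent:

f_direct(x) = x + 1
f_eval(x)   = eval(:($x + 1))            # builds and evaluates an expression on every call

f_direct(1); f_eval(1)                   # warm up / JIT-compile both
@time for i in 1:10^4; f_direct(i); end  # fast and type-stable
@time for i in 1:10^4; f_eval(i); end    # orders of magnitude slower; the result is typed Any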


[julia-users] Importing Package and not running external code.

2016-01-18 Thread Lutfullah Tomak
Maybe pass a specific argument to the file and, instead of checking the argument 
count, use the actual argument values to decide what code to run. Or define a global 
variable before including the file, and have the included file check whether it is 
defined and set to the desired value before running the extra code.
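
A sketch of the second suggestion, with RUN_EXTRAS as a hypothetical flag name:

# in the including script
const RUN_EXTRAS = false
include("thefile.jl")

# inside thefile.jl
if isdefined(:RUN_EXTRAS) && RUN_EXTRAS
    # extra code that should only run when the includer asks for it
end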

[julia-users] Re: Make Julia package requirements depends on Julia version?

2016-01-18 Thread Oliver Schulz
Hi Ian,

> It's definitely safe to have it on 0.4. Are you putting it in /REQUIRE or 
> /test/REQUIRE? You probably want to put it in the latter. (Apologies if I'm 
> telling you something you already know!)
>

In "/REQUIRE" - this is for EasyPkg 
(https://github.com/oschulz/EasyPkg.jl), so it's a special
case: EasyPkg provides a function "runalltests()" that uses functionality 
from BaseTestNext.

I does mean that packages using EasyPkg in "/REQUIRE" (EasyPkg also 
provides things
not related to testing) will indirectly require BaseTestNext as well - but 
as it is part of
Julia in v0.5 anyhow, I guess that's Ok, right?

Otherwise, I'd have to split EasyPkg in two packages.

Cheers,

Oliver



Re: [julia-users] Re: Immutable type with a function datatype

2016-01-18 Thread Erik Schnetter
I think in this case it does make sense to attach a function to an
object. (This is different from the OO discussion above, which is
about attaching a function to a type.)

immutable Sphere <: Manifold
dim::Int
metric::Function
end

sphere::Sphere
sphere.metric = ... implementation ...
sphere.metric(... arguments ...)

Nevertheless, this is probably not efficient at the moment, since
there is a performance penalty for anonymous functions. You might try
an approach like this instead, which is as flexible as the one above,
but more efficient:

immutable Sphere <: Manifold
dim::Int
metric
end

# Introduce an empty new type for the metric function
immutable Metric1 end
call(::Metric1, ... other arguments ...) = ... implementation ...

This is equivalent to using an anonymous function, but should be
currently more efficient.

Then you can write e.g.

sphere::Sphere
sphere.metric(... arguments ...)

-erik
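
A complete, runnable version of the sketch above (Julia 0.4; the metric implementation is only a placeholder):

import Base: call                              # needed to add methods to call() in 0.4

abstract Manifold

immutable Metric1 end
call(::Metric1, x1, x2) = sum(abs(x1 - x2))    # placeholder implementation

immutable Sphere <: Manifold
    dim::Int
    metric
end

sphere = Sphere(2, Metric1())
sphere.metric([0.0, 0.0], [1.0, 1.0])          # calls the Metric1 "functor"; returns 2.0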

On Mon, Jan 18, 2016 at 11:54 AM, Anonymous  wrote:
> This came up as I was trying to define a sphere manifold type which can have
> distinct metric structures put on it, here is how I have to do it in Julia:
>
> abstract Manifold
> abstract Metric
>
> immutable Metric1 <: Metric
> end
>
> immutable Metric2 <: Metric
> end
>
> immutable Sphere{T<:Metric} <: Manifold
> dim::Int
> end
>
> metric(M::Sphere{Metric1}, x1, x2) = ...
> metric(M::Sphere{Metric2}, x1, x2) = ...
>
> As you can see, I have to define a whole other type tree just so my metric
> function can distinguish between sphere manifolds depending on what metric
> structure I want it to have.  It would be both simpler and make more sense
> conceptually to attach the metric to the manifold itself when I instantiate
> it.  Of course you might say I should just define two functions metric1 and
> metric2, but this doesn't really make sense either because these metrics do
> not cut across multiple manifolds, and thus there would be no way to take
> advantage of the multiple dispatch functionality.
>
>



-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


Re: [julia-users] Access to Fortran global array with cglobal

2016-01-18 Thread Erik Schnetter
Chris

This array does not use Fortran 90 features; you're fine. Note that
the array indices will be different in Julia -- -n:n will be 1:(2*n+1)
instead.

What is "n" in your setup? You should declare the array size as
3*(2*n+1) in Julia, or as 2d-array via 3, 2*n+1.

-erik
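
A minimal sketch along those lines, reusing the symbol and library names from the thread and assuming n is known on the Julia side:

n = 0                                                     # whatever value the Fortran code was compiled with
p = cglobal((:__libkl_mod_MOD_v1, "libkl.so"), Float64)   # element type Float64, not Ptr{Float64}
v1 = pointer_to_array(p, (3, 2n + 1))                     # a 3 x (2n+1) view of the Fortran array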

On Mon, Jan 18, 2016 at 12:05 PM, Chris <7hunderstr...@gmail.com> wrote:
> Erik,
>
> The array declaration line in the Fortran code is something like
>
> integer,parameter:: fp_kind = kind(0.d0)
> real(fp_kind):: v1(3,-n:n)
>
> Does this appear to use any Fortran 90-specific features?
>
> I also tried declaring the type as Float64, then doing
> pointer_to_array(v1,3) gives me a 3x1 array of zeros. This is not the
> expected result of the code, but again, it might be an issue within the
> Fortran code itself.
>
>
> Adrian,
>
> Thank you for the link, there's a lot of useful information in there,
> although, unfortunately, nothing that I saw relating to my specific problem.
> In any case, do you know if the example is waiting in a PR somewhere to get
> into the docs? I agree that it would be very useful.
>
> Thanks,
> Chris
>
> On Saturday, January 16, 2016 at 10:59:44 AM UTC-5, Adrian Cuthbertson
> wrote:
>>
>> I happened to make a note of a post some time ago about getting fortran
>> and julia working together. Searching for that again returned this link:
>>
>>
>> http://julia-programming-language.2336112.n4.nabble.com/example-for-ccall-use-and-fortran-td7737.html
>>
>> Hth, Adrian.
>>
>>
>> On Sat, Jan 16, 2016 at 3:32 PM, Chris <7hunde...@gmail.com> wrote:
>>>
>>> The Fortran code I'm working with assigns results to a number of global
>>> variables. One of those results is a 3x1 real array - let's call it v1.
>>>
>>> I'm trying to understand how to access this. Here's what I have:
>>>
>>> v1 = cglobal((:__libkl_mod_MOD_v1,"libkl.so"),Ptr{Float64})
>>>
>>> This gives me a Ptr{Ptr{Float64}}. I'm not sure what to do from here --
>>> using pointer_to_array just gives
>>>
>>> julia> pointer_to_array(v1,3)
>>> 3-element Array{Ptr{Float64},1}:
>>>  Ptr{Float64} @0x
>>>  Ptr{Float64} @0x
>>>  Ptr{Float64} @0x
>>>
>>> And then doing an unsafe_load on any of those elements gives me a
>>> segfault.
>>>
>>> Am I taking the right approach here? The Fortran code is not my own, so
>>> it's possible this is due to an error in that code, but I'm trying to rule
>>> out Julia interface issues first.
>>>
>>> Thanks in advance,
>>> Chris
>>
>>
>



-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


[julia-users] Re: Immutable type with a function datatype

2016-01-18 Thread Matt Bauman
On Monday, January 18, 2016 at 11:54:49 AM UTC-5, Anonymous wrote:
>
> As you can see, I have to define a whole other type tree just so my metric 
> function can distinguish between sphere manifolds depending on what metric 
> structure I want it to have.  It would be both simpler and make more sense 
> conceptually to attach the metric to the manifold itself when I instantiate 
> it.
>

Here's another alternative:

@enum Metric RIEMANNIAN LORENTZIAN # ...

immutable Sphere
    dim::Int
    metric::Metric
end

function metric(s::Sphere)
    if s.metric == RIEMANNIAN
        # …
    elseif s.metric == LORENTZIAN
        # …
    end
end

The key point is that you don't always need to attach a function — you can 
attach data that describes the object's properties, and then use that in 
your external functions.

(This is crossposted 
at 
http://stackoverflow.com/questions/34841635/immutable-type-with-function-fields-in-julia)


Re: [julia-users] Using Dot Syntax in Julia

2016-01-18 Thread Stefan Karpinski
This is usually not what you want to do.

On Sunday, January 17, 2016, Steve Kelly  wrote:

> It has always been this way because of multiple dispatch. However you can
> do something like:
>
> type Wallet
>   dotTest::Function
> end
>
> Which might have ambiguous performance impact.
> On Jan 17, 2016 12:45 PM, "Bryan Rivera"  > wrote:
>
>> I have seen some code out in the wild that allows us to use dot syntax
>> like so:
>>
>> function dotTest!(wallet::Wallet, valueToAdd::Int):
>>
>> ...
>>
>> end
>>
>> wallet = Wallet(100)
>>
>> wallet.dotTest!(5)  # Does not work
>> dotTest!(wallet, 5)  # Works
>>
>> However I cannot get it to work, the method is not found because I am not
>> passing wallet as the arg.
>>
>> So did the language change, or am I doing it wrong?
>>
>


Re: [julia-users] VirtualArrays.jl

2016-01-18 Thread Eric Davies
It's gone now 
:) https://github.com/invenia/VirtualArrays.jl/pull/2#event-518201512

On Monday, 18 January 2016 09:47:15 UTC-6, Yichao Yu wrote:
>
> On Mon, Jan 18, 2016 at 10:42 AM, Eric Davies  > wrote: 
> > 
> > 
> > On Friday, 15 January 2016 17:14:26 UTC-6, Yichao Yu wrote: 
> >> 
> >> FYI, don't call eval in functions 
> > 
> > 
> > What's wrong with calling eval in functions when you're evaluating 
> > expressions without side-effects? 
>
> Orders of magnitude slower 
> Not necessary 
> Breaks type inference 
>


[julia-users] Re: Immutable type with a function datatype

2016-01-18 Thread Anonymous
This is a good solution, although it prevents the user from defining their 
own custom metric.

On Monday, January 18, 2016 at 9:33:05 AM UTC-8, Matt Bauman wrote:
>
> On Monday, January 18, 2016 at 11:54:49 AM UTC-5, Anonymous wrote:
>>
>> As you can see, I have to define a whole other type tree just so my 
>> metric function can distinguish between sphere manifolds depending on what 
>> metric structure I want it to have.  It would be both simpler and make more 
>> sense conceptually to attach the metric to the manifold itself when I 
>> instantiate it.
>>
>
> Here's another alternative:
>
> @enum Metric RIEMANNIAN LORENTZIAN # ...
> immutable Sphere
> dim::Int
> metric::Metric
> end
> function metric(s::Sphere)
> if s.metric == RIEMANNIAN
> # …
> elseif s.metric == LORENTZIAN #…
> end
> end 
>
> The key point is that you don't always need to attach a function — you can 
> attach data that describes the object's properties, and then use that in 
> your external functions.
>
> (This is crossposted at 
> http://stackoverflow.com/questions/34841635/immutable-type-with-function-fields-in-julia
> )
>


[julia-users] Re: Dictionary lookup using only the hash value

2016-01-18 Thread Jeffrey Sarnoff
Steve, 
afaik, depending upon the hash implementation, the only advantage might be 
moderately faster lookup but the cost in generality often would outweigh 
that.

On Thursday, January 14, 2016 at 10:44:01 PM UTC-5, vav...@uwaterloo.ca 
wrote:
>
> Could I ask what would be an application of this capability (if it were 
> possible)?  -- Steve Vavasis
>
> On Wednesday, January 13, 2016 at 3:59:57 PM UTC-5, Ritchie Lee wrote:
>>
>> As I understand it, Julia dicts use Base.hash when storing custom types 
>> as keys.  Is there a way to look up based on the hash alone?
>>
>> e.g.,
>>
>> type MyType
>> a::Int
>> end
>>
>> Base.hash(x::MyType) = hash(x.a, hash(MyType))
>>
>> x = MyType(1)
>> D = Dict{MyType, Bool}()
>> D[x] = true
>>
>> Now, is there a way to lookup using h only? 
>>
>> h = hash(x)
>> D[h]
>>
>> Is this be safe to do?
>>
>> Thanks!
>>
>

[julia-users] Matrix multiplication - A' x B' much slower than A x B, A' x B, or A x B'

2016-01-18 Thread Franco Venturi
I wrote the Julia scripts below that just test the performance of the 
matrix multiplication in four cases (A' stands for A transpose, and B' 
stands for B transpose):

   1. A x B
   2. A' x B
   3. A x B'
   4. A' x B'

I ran these scripts multiplying two 1,000 x 1,000 matrices 20 times, i.e.:


   1. axb.jl -n 1000 -k 1000 -m 1000 -t 20
   2. atxb.jl -n 1000 -k 1000 -m 1000 -t 20
   3. axbt.jl -n 1000 -k 1000 -m 1000 -t 20
   4. atxbt.jl -n 1000 -k 1000 -m 1000 -t 20

Cases 1, 2, 3 run in about the same time with a performance of about 70 
GFLOPS on my desktop, however case 4 is about 10 times slower.
I run Linux Fedora 23, julia version 0.4.3, and openblas version 0.2.15.
I also compiled Julia directly from the latest version on GitHub, used 
Intel MKL, and the results are similar (the absolute times are different, 
but the A' x B' case is still about 10 times slower than the other three 
cases).

Since Julia uses openblas/MKL function 'dgemm' (or 'sgemm' in the single 
precision case), I also wrote the same examples in C directly invoking 
openblas/MKL 'cblas_dgemm' function and there's no difference in these four 
cases using the native calls (the arguments for the function 'cblas_dgemm' 
have transpose flags for both A and B).

I am not sure why Julia is so much slower multiplying A' x B', so I thought 
I would bring it to your attention.

Regards,
Franco Venturi

#!/bin/julia -q

using ArgParse

s = ArgParseSettings()

@add_arg_table s begin
"-m"
"-k"
"-n"
"-t"
end

parsed_args = parse_args(s)
m = parse(Int, parsed_args["m"])
k = parse(Int, parsed_args["k"])
n = parse(Int, parsed_args["n"])
t = parse(Int, parsed_args["t"])

a = 2 * rand(m, k) - 1
b = 2 * rand(k, n) - 1

tstart = time()
for i = 1:t
c = a * b
end
tend = time()
duration = (tend - tstart) / t
mflops = (2 * m * n * k) / duration * 1.0e-6
@printf("%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)


#!/bin/julia -q

using ArgParse

s = ArgParseSettings()

@add_arg_table s begin
"-m"
"-k"
"-n"
"-t"
end

parsed_args = parse_args(s)
m = parse(Int, parsed_args["m"])
k = parse(Int, parsed_args["k"])
n = parse(Int, parsed_args["n"])
t = parse(Int, parsed_args["t"])

a = (2 * rand(m, k) - 1)'
b = 2 * rand(k, n) - 1

tstart = time()
for i = 1:t
c = a' * b
end
tend = time()
duration = (tend - tstart) / t
mflops = (2 * m * n * k) / duration * 1.0e-6
@printf("%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)



#!/bin/julia -q

using ArgParse

s = ArgParseSettings()

@add_arg_table s begin
"-m"
"-k"
"-n"
"-t"
end

parsed_args = parse_args(s)
m = parse(Int, parsed_args["m"])
k = parse(Int, parsed_args["k"])
n = parse(Int, parsed_args["n"])
t = parse(Int, parsed_args["t"])

a = 2 * rand(m, k) - 1
b = (2 * rand(k, n) - 1)'

tstart = time()
for i = 1:t
c = a * b'
end
tend = time()
duration = (tend - tstart) / t
mflops = (2 * m * n * k) / duration * 1.0e-6
@printf("%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)



#!/bin/julia -q

using ArgParse

s = ArgParseSettings()

@add_arg_table s begin
"-m"
"-k"
"-n"
"-t"
end

parsed_args = parse_args(s)
m = parse(Int, parsed_args["m"])
k = parse(Int, parsed_args["k"])
n = parse(Int, parsed_args["n"])
t = parse(Int, parsed_args["t"])

a = (2 * rand(m, k) - 1)'
b = (2 * rand(k, n) - 1)'

tstart = time()
for i = 1:t
c = a' * b'
end
tend = time()
duration = (tend - tstart) / t
mflops = (2 * m * n * k) / duration * 1.0e-6
@printf("%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)






[julia-users] Running a julia script on the web

2016-01-18 Thread parth patel
Hey guys, I am new to Julia. I was wondering if there is a way to run a 
Julia script on the backend of a web page? My goal is for the users of the 
web page to pass in parameters as inputs to the Julia script and then run 
the script on the web and display the results. I am totally new to Julia 
and any help would be appreciated.


[julia-users] Running julia script on the web

2016-01-18 Thread parth patel
I am a new user of Julia. I was wondering if you guys know of a mechanism 
where I can run a Julia script from the web. My goal is for the user to go 
to a website and pass in inputs; based on those inputs I run the script on 
the web and display the result.


[julia-users] strange error

2016-01-18 Thread Fabrizio Lacalandra
Does anyone know what the message below can mean? Apparently the same 
code runs with a previous version of Julia, 0.4.0 I think. Now I am running 
the latest, 0.4.3.
The line number reported for MyNetworkcns.jl is misleading, as that is simply 
the last line of the file itself.


Thanks 

Fabrizio

ERROR: LoadError: LoadError: AssertionError: x.head == :escape
 in include at boot.jl:261
 in include_from_node1 at loading.jl:304
 in include at boot.jl:261
 in include_from_node1 at loading.jl:304
while loading E:\CodiciSorgente\ProgrammiJulia\UCOTS\MyNetworkcns.jl, in 
expression starting on line 159


[julia-users] Re: Setting up cluster on Windows(Win7,64bit) with Julia0.4.2

2016-01-18 Thread Abhinanda Ranjit
Thanks for replying.
It was solved by installing bash through Cygwin. It is working now.

On Thursday, December 17, 2015 at 11:02:52 PM UTC+5:30, Tony Kelman wrote:
>
> I suspect the remote ssh workers are not set up to work on Windows. It's 
> trying to spawn `sh` which won't exist on Windows (though you will have one 
> from cygwin, that's not guaranteed to be there).
>
> Maybe if the `sh` part of the command were taken out it could be made to 
> work.
>


Re: [julia-users] Immutable type with a function datatype

2016-01-18 Thread Joshua Ballanco
On January 18, 2016 at 17:08:46, Anonymous (espr...@gmail.com) wrote:

This mimics the behavior of OOP since just like in OOP the internal method 
cannot be changed (since the type is immutable).  Sometimes it really does make 
the most sense to attach a function to an instance of a type...
I don’t believe you.

Not trying to be snide, but after spending ~10 years as an “OOP programmer” 
(mostly Java & Ruby) and ~3 as a “FP programmer” (mostly Clojure and now 
Julia), I’ve come to realize that the difference between:

    foo.bar(baz)

and:

    bar(foo, baz)

is little more than a case of what you’re comfortable with. I could *almost* 
see a case for the former over the latter if you were dynamically changing the 
definition of `bar` (which has its own problems), but the example you gave has 
the type being immutable.

I’m curious what scenario you’re picturing where having a method attached to an 
instance makes more sense than the other way around?



Re: [julia-users] Immutable type with a function datatype

2016-01-18 Thread Yichao Yu
On Mon, Jan 18, 2016 at 11:01 AM, Joshua Ballanco  wrote:
> On January 18, 2016 at 17:08:46, Anonymous (espr...@gmail.com) wrote:
>
>
> This mimics the behavior of OOP since just like in OOP the internal method
> cannot be changed (since the type is immutable).  Sometimes it really does
> make the most sense to attach a function to an instance of a type...
>
> I don’t believe you.
>
> Not trying to be snide, but after spending ~10 years as an “OOP programmer”
> (mostly Java & Ruby) and ~3 as a “FP programmer” (mostly Clojure and now
> Julia), I’ve come to realize that the difference between:
>
> foo.bar(baz)
>
> and:
>
> bar(foo, baz)
>
> is little more than a case of what you’re comfortable with. I could *almost*
> see a case for the former over the latter if you were dynamically changing
> the definition of `bar` (which has its own problems), but the example you
> gave has the type being immutable.
>
> I’m curious what scenario you’re picturing where having a method attached to
> an instance makes more sense than the other way around?

I think this has already been brought up elsewhere, but IMHO the
advantage of the first is namespacing: you can define independent
`bar`s for different types without collisions between them or with
local variables.
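
A small illustration of the namespacing point (the types are made up):

immutable A; bar::Function; end
immutable B; bar::Function; end

a = A(x -> x + 1)
b = B(x -> 2x)
a.bar(1), b.bar(1)   # (2, 2) -- each instance carries its own bar, so the names never collide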


[julia-users] Re: Error running examples of parallel computing with Julia

2016-01-18 Thread Daniel Arndt
This was cross-posted on Stack Overflow and answered 
there: 
http://stackoverflow.com/questions/34755454/how-to-run-a-function-in-parallel-with-julia-language/34772735

Charles, I believe the community preference is that you choose one medium 
or the other, and do not cross post questions like this.

Cheers,
Dan

On Tuesday, 12 January 2016 18:03:18 UTC-4, Charles Santana wrote:
>
> Hi,
>
> I am trying to figure out how to work with parallel computing with Julia. 
> The documentation looks great, even for someone that has never worked with 
> Parallel Computing (and that does not understand most of the concepts 
> behind the documentation ;)).
>
> Just to mention: I am working in a PC with Ubuntu. It has a 4-core 
> processor. 
>
> To run the code I describe below I am calling the julia terminal as:
>
> $ julia -p 4
>
> I am following the documentation here: 
> http://docs.julialang.org/en/latest/manual/parallel-computing/
>
> For one example, specifically, I am facing some problems. 
>
> In this section: 
> http://docs.julialang.org/en/latest/manual/parallel-computing/#id2
>
> I am trying to run the following piece of code:
>
> @everywhere advection_shared_chunk!(q, u) = advection_chunk!(q, u, 
> myrange(q)..., 1:size(q,3)-1)
>
> function advection_shared!(q, u)
> @sync begin
> for p in procs(q)
> @async remotecall_wait(advection_shared_chunk!, p, q, u)
> end
> end
> q
> end
>
> q = SharedArray(Float64, (500,500,500))
> u = SharedArray(Float64, (500,500,500))
>
> #Run once to JIT-compile
> advection_shared!(q,u)
>
> But I am facing the following error:
>
> ERROR: MethodError: `remotecall_wait` has no method matching 
> remotecall_wait(::Function, ::Int64, ::SharedArray{Float64,3}, 
> ::SharedArray{Float64,3})
> Closest candidates are:
>   remotecall_wait(::LocalProcess, ::Any, ::Any...)
>   remotecall_wait(::Base.Worker, ::Any, ::Any...)
>   remotecall_wait(::Integer, ::Any, ::Any...)
>  in anonymous at task.jl:447
>
> ...and 3 other exceptions.
>
>  in sync_end at ./task.jl:413
>  [inlined code] from task.jl:422
>  in advection_shared! at none:2
>
> What am I doing wrong here? As far as I know I am just reproducing the 
> example in the docs... or not?
>
> Thanks for any help,
>
> Charles
>
>
> -- 
> Um axé! :)
>
> --
> Charles Novaes de Santana, PhD
> https://github.com/cndesantana
>


[julia-users] Re: Immutable type with a function datatype

2016-01-18 Thread Anonymous
This came up as I was trying to define a sphere manifold type which can 
have distinct metric structures put on it; here is how I have to do it in 
Julia:

abstract Manifold
abstract Metric

immutable Metric1 <: Metric
end

immutable Metric2 <: Metric
end 

immutable Sphere{T<:Metric} <: Manifold
dim::Int
end

metric(M::Sphere{Metric1}, x1, x2) = ...
metric(M::Sphere{Metric2}, x1, x2) = ...

As you can see, I have to define a whole other type tree just so my metric 
function can distinguish between sphere manifolds depending on what metric 
structure I want it to have.  It would be both simpler and make more sense 
conceptually to attach the metric to the manifold itself when I instantiate 
it.  Of course you might say I should just define two functions metric1 and 
metric2, but this doesn't really make sense either because these metrics do 
not cut across multiple manifolds, and thus there would be no way to take 
advantage of the multiple dispatch functionality.




Re: [julia-users] Access to Fortran global array with cglobal

2016-01-18 Thread Chris
Erik,

The array declaration line in the Fortran code is something like 

integer,parameter:: fp_kind = kind(0.d0)
real(fp_kind):: v1(3,-n:n)

Does this appear to use any Fortran 90-specific features?

I also tried declaring the type as Float64; then pointer_to_array(v1,3) 
gives me a 3x1 array of zeros. This is not the expected result of the code, 
but again, it might be an issue within the Fortran code itself.


Adrian,

Thank you for the link, there's a lot of useful information in there, 
although, unfortunately, nothing that I saw relating to my specific 
problem. In any case, do you know if the example is waiting in a PR 
somewhere to get into the docs? I agree that it would be very useful.

Thanks,
Chris

On Saturday, January 16, 2016 at 10:59:44 AM UTC-5, Adrian Cuthbertson 
wrote:
>
> I happened to make a note of a post some time ago about getting fortran 
> and julia working together. Searching for that again returned this link:
>
>
> http://julia-programming-language.2336112.n4.nabble.com/example-for-ccall-use-and-fortran-td7737.html
>
> Hth, Adrian.
>
>
> On Sat, Jan 16, 2016 at 3:32 PM, Chris <7hunde...@gmail.com > 
> wrote:
>
>> The Fortran code I'm working with assigns results to a number of global 
>> variables. One of those results is a 3x1 real array - let's call it v1.
>>
>> I'm trying to understand how to access this. Here's what I have:
>>
>> v1 = cglobal((:__libkl_mod_MOD_v1,"libkl.so"),Ptr{Float64})
>>
>> This gives me a Ptr{Ptr{Float64}}. I'm not sure what to do from here -- 
>> using pointer_to_array just gives
>>
>> julia> pointer_to_array(v1,3)
>> 3-element Array{Ptr{Float64},1}:
>>  Ptr{Float64} @0x
>>  Ptr{Float64} @0x
>>  Ptr{Float64} @0x
>>
>> And then doing an unsafe_load on any of those elements gives me a 
>> segfault.
>>
>> Am I taking the right approach here? The Fortran code is not my own, so 
>> it's possible this is due to an error in that code, but I'm trying to rule 
>> out Julia interface issues first.
>>
>> Thanks in advance,
>> Chris
>>
>
>

Re: [julia-users] Access to Fortran global array with cglobal

2016-01-18 Thread Chris
Thanks Erik. n, in this case, is zero, so I'm expecting a 3x1 array.

On Monday, January 18, 2016 at 12:26:33 PM UTC-5, Erik Schnetter wrote:
>
> Chris 
>
> This array does not use Fortran 90 features; you're fine. Note that 
> the array indices will be different in Julia -- -n:n will be 1:(2*n+1) 
> instead. 
>
> What is "n" in your setup? You should declare the array size as 
> 3*(2*n+1) in Julia, or as 2d-array via 3, 2*n+1. 
>
> -erik 
>
> On Mon, Jan 18, 2016 at 12:05 PM, Chris <7hunde...@gmail.com > 
> wrote: 
> > Erik, 
> > 
> > The array declaration line in the Fortran code is something like 
> > 
> > integer,parameter:: fp_kind = kind(0.d0) 
> > real(fp_kind):: v1(3,-n:n) 
> > 
> > Does this appear to use any Fortran 90-specific features? 
> > 
> > I also tried declaring the type as Float64, then doing 
> > pointer_to_array(v1,3) gives me a 3x1 array of zeros. This is not the 
> > expected result of the code, but again, it might be an issue within the 
> > Fortran code itself. 
> > 
> > 
> > Adrian, 
> > 
> > Thank you for the link, there's a lot of useful information in there, 
> > although, unfortunately, nothing that I saw relating to my specific 
> problem. 
> > In any case, do you know if the example is waiting in a PR somewhere to 
> get 
> > into the docs? I agree that it would be very useful. 
> > 
> > Thanks, 
> > Chris 
> > 
> > On Saturday, January 16, 2016 at 10:59:44 AM UTC-5, Adrian Cuthbertson 
> > wrote: 
> >> 
> >> I happened to make a note of a post some time ago about getting fortran 
> >> and julia working together. Searching for that again returned this 
> link: 
> >> 
> >> 
> >> 
> http://julia-programming-language.2336112.n4.nabble.com/example-for-ccall-use-and-fortran-td7737.html
>  
> >> 
> >> Hth, Adrian. 
> >> 
> >> 
> >> On Sat, Jan 16, 2016 at 3:32 PM, Chris <7hunde...@gmail.com> wrote: 
> >>> 
> >>> The Fortran code I'm working with assigns results to a number of 
> global 
> >>> variables. One of those results is a 3x1 real array - let's call it 
> v1. 
> >>> 
> >>> I'm trying to understand how to access this. Here's what I have: 
> >>> 
> >>> v1 = cglobal((:__libkl_mod_MOD_v1,"libkl.so"),Ptr{Float64}) 
> >>> 
> >>> This gives me a Ptr{Ptr{Float64}}. I'm not sure what to do from here 
> -- 
> >>> using pointer_to_array just gives 
> >>> 
> >>> julia> pointer_to_array(v1,3) 
> >>> 3-element Array{Ptr{Float64},1}: 
> >>>  Ptr{Float64} @0x 
> >>>  Ptr{Float64} @0x 
> >>>  Ptr{Float64} @0x 
> >>> 
> >>> And then doing an unsafe_load on any of those elements gives me a 
> >>> segfault. 
> >>> 
> >>> Am I taking the right approach here? The Fortran code is not my own, 
> so 
> >>> it's possible this is due to an error in that code, but I'm trying to 
> rule 
> >>> out Julia interface issues first. 
> >>> 
> >>> Thanks in advance, 
> >>> Chris 
> >> 
> >> 
> > 
>
>
>
> -- 
> Erik Schnetter  
> http://www.perimeterinstitute.ca/personal/eschnetter/ 
>


[julia-users] Problem with types used in macro from a different module!!

2016-01-18 Thread Julia Tylors
#
#IMPLEMENTATION CODE:
#

module Y
export modify

"""
A LOT OF CODE HERE, REMOVED FOR SIMPLICITY
"""

function _modify(expr)
is_lambda,f,args,body = decompose_function(expr)
if is_lambda
quote
($(args...)) -> $(transform(body))
end
else
quote
function $(esc(f))($(args...))
  $(transform(body))
end
end
end
end

macro modify(func)
_modify(func)
end
end

#
#USAGE CODE:
#

module X
using Y
type Arg{T}
x::T
end
println(macroexpand( quote 
@modify function f(x::Arg{Bool})
 y = 12
 if x
   y+1
 else
   y-1
end
end
end))
end

#
#GENERATED CODE:
#

begin
begin 
function f(#22#x::Y.Arg{Y.Bool}) 
begin  
#21#y = 12 # 
begin 
"LOTS OF CODE HERE"
end
end
end
end
end

#
#QUESTION
#

In the generated code, the argument type Arg{Bool} is treated as if it were 
defined in module Y, but it is actually defined in module X. So when the code 
executes, the interpreter complains and says:
ERROR: LoadError: UndefVarError: Arg not defined

Why doesn't $(args...) in the implementation code work?

Thanks




Re: [julia-users] Multiple Dispatch Question

2016-01-18 Thread Erik Schnetter
Chris

In this case, you could write an auxiliary third function that takes
an additional Bool parameter. Both your functions call the third
function with this Bool parameter.

An alternative solution is to make this a Val{Bool} parameter, which
would let the compiler specialize the functions. This might
improve performance if the functions are called in a
performance-critical region.

-erik
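
A minimal sketch of the Val idea (the function names are hypothetical):

# The shared work function dispatches on a Val type, so each branch gets its own specialization.
do_work(x, ::Type{Val{true}})  = x + 1     # the "fcn2 was passed" branch
do_work(x, ::Type{Val{false}}) = x - 1     # the default branch

my_func(x, flag::Bool) = do_work(x, Val{flag})

my_func(2.0, true), my_func(2.0, false)    # (3.0, 1.0)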


On Mon, Jan 18, 2016 at 4:36 PM, Cedric St-Jean  wrote:
> my_func(fcn1::Function, passedIn::Float64) =
> my_func(fcn1, (y, z, passedin)->default_fcn(0.0, y, z, passedin),
> passedIn)
>
> You could achieve the same effect in one definition if you put fcn2 as a
> keyword argument. Also check out FastAnonymous.jl if performance matters.
>
>
> On Monday, January 18, 2016 at 4:25:20 PM UTC-5, Christopher Alexander
> wrote:
>>
>> Thanks for the response!  As a follow-up, what would I do in a situation
>> where the passed-in second function (fcn2) and the default function take a
>> different number of arguments?
>>
>> Thanks!
>>
>> Chris
>>
>> On Monday, January 18, 2016 at 4:06:41 PM UTC-5, Erik Schnetter wrote:
>>>
>>> Define the second function like this:
>>> ```
>>> my_func(fcn1::Function, passedIn::Float64) = my_func(fcn1,
>>> default_fcn, passedIn)
>>> ```
>>>
>>> -erik
>>>
>>>
>>> On Mon, Jan 18, 2016 at 4:02 PM, Christopher Alexander
>>>  wrote:
>>> > Hello all, I had a question concerning a best practice in a particular
>>> > case
>>> > of multiple dispatch which is as follows.
>>> >
>>> > Let's say I have a function with two different methods.
>>> >
>>> > function my_func(fcn1::Function,fcn2::Function, passedIn::Float64)
>>> >  x = 0.0
>>> >  y = 1.0
>>> >  z = 2.0
>>> >  val1 = fcn(x, y, passedIn)
>>> >  val2 = fcn2(y, z, passedIn)
>>> >  return val1, val2
>>> > end
>>> >
>>> > function my_func(fcn1::Function, passedIn::Float64)
>>> >  x = 0.0
>>> >  y = 1.0
>>> >  z = 2.0
>>> >  val1 = fcn(x, y, passedIn)
>>> >  val2 = default_fcn(x, y, z, passedIn)
>>> >  return val1, val2
>>> > end
>>> >
>>> > My question is basically, what would be the best way to do this without
>>> > massive code duplication?  The actual situation I am working with has
>>> > much
>>> > more going on in the function, so it's not like I could create some
>>> > init
>>> > function to set up x, y, & z.  But literally the only different
>>> > behavior
>>> > between the two methods is whether or not a second function is passed
>>> > in.
>>> >
>>> > Thanks!
>>> >
>>> > Chris
>>> >
>>>
>>>
>>>
>>> --
>>> Erik Schnetter 
>>> http://www.perimeterinstitute.ca/personal/eschnetter/



-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


[julia-users] Re: Immutable type with a function datatype

2016-01-18 Thread Anonymous
The issue is that there is no cross-section of metrics, one from each 
manifold, which can all be grouped under the heading of RIEMANNIAN, and 
likewise for LORENTZIAN, etc.  Each manifold will have its own set of 
idiosyncratic metrics, and the user might need to define their own custom 
metric for a manifold.  That's why it makes so much sense to attach the 
metric to the manifold when you instantiate it.  Manifolds are prototypical 
objects with lots of internal structure, and thus the code gets messy when 
you try to shoehorn them into the role of a datatype.

On Monday, January 18, 2016 at 2:33:10 PM UTC-8, Simon Danisch wrote:
>
> Am I missing something, or why isn't there this solution:
>
> @enum Metric RIEMANNIAN LORENTZIAN # ...
> immutable Sphere{Matric}
> dim::Int
> end
> function metric(s::Sphere{RIEMANNIAN})
>
> end
> function metric(s::Sphere{LORENTZIAN})
>
> end
>


Re: [julia-users] Re: Immutable type with a function datatype

2016-01-18 Thread Erik Schnetter
In this case you can try:

abstract Metric

immutable Sphere{M <: Metric}
    dim::Int
end

immutable Riemannian <: Metric end
metric(s::Sphere{Riemannian}) = ...

immutable Lorentzian <: Metric end
metric(s::Sphere{Lorentzian}) = ...

In fact, you don't need to declare "Metric"; you can just use "Any"
instead. I like it because it adds documentation to the code.

-erik
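
A brief usage note: with this pattern a user can plug in their own metric from outside the package (MyMetric is a hypothetical name):

immutable MyMetric <: Metric end
metric(s::Sphere{MyMetric}) = s.dim    # placeholder implementation

metric(Sphere{MyMetric}(2))            # dispatches on the user-defined metric type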



On Mon, Jan 18, 2016 at 7:58 PM, Anonymous  wrote:
> The issue is that there is no cross section of metrics, one from each
> manifold, which can all be grouped under the heading of RIEMANNIAN, same
> with LORENTZIAN, etc.  Each manifold will have its own set of idiosyncratic
> metrics, and the user might require the functionality of being able to
> customize their own metric for a manifold.  That's why it makes so much
> sense to attach the metric to the manifold when you instantiate it.
> Manifolds are prototypical objects with lots of internal structure, and thus
> the code gets messy when you try to shoehorn them into the role of a
> datatype.
>
> On Monday, January 18, 2016 at 2:33:10 PM UTC-8, Simon Danisch wrote:
>>
>> Am I missing something, or why isn't there this solution:
>>
>> @enum Metric RIEMANNIAN LORENTZIAN # ...
>> immutable Sphere{Matric}
>> dim::Int
>> end
>> function metric(s::Sphere{RIEMANNIAN})
>>
>> end
>> function metric(s::Sphere{LORENTZIAN})
>>
>> end



-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


Re: [julia-users] Matrix multiplication - A' x B' much slower than A x B, A' x B, or A x B'

2016-01-18 Thread Franco Venturi
Thanks Tim for the quick reply.
I rewrote the same four cases as functions and ran them once for JIT 
compilation, and now the results are consistent (and pretty good). I also 
kept the code for the old a' x b' case at the end, and you can see how much 
slower it is on my computer (but now I understand why).
This is the output, and below is my code (again two 1000x1000 matrices and 
20 iterations):

JIT Compilation:
  0.225234 seconds (584.87 k allocations: 27.644 MB)
  0.043811 seconds (64.50 k allocations: 3.020 MB)
  0.003492 seconds (4.09 k allocations: 219.771 KB)
  0.002444 seconds (3.30 k allocations: 173.763 KB)
Full run:
  0.363003 seconds (104 allocations: 152.590 MB, 1.78% gc time)
a x b 1000x1000x1000 0.383239 s 104373.502648 MFLOPS
  0.355820 seconds (18.49 k allocations: 161.055 MB, 2.97% gc time)
a' x b 1000x1000x1000 0.358574 s 111553.011766 MFLOPS
  0.333026 seconds (109 allocations: 160.220 MB, 3.34% gc time)
a x b' 1000x1000x1000 0.336589 s 118839.346176 MFLOPS
  0.371250 seconds (114 allocations: 167.850 MB, 13.26% gc time)
a' x b' 1000x1000x1000 0.373455 s 107107.937773 MFLOPS
old a' x b' 1000x1000x1000 0.308466 s 6483.691371 MFLOPS



Thanks again.
Franco

#!/bin/julia -q

using ArgParse

s = ArgParseSettings()

@add_arg_table s begin
"-m"
"-k"
"-n"
"-t"
end

parsed_args = parse_args(s)
m = parse(Int, parsed_args["m"])
k = parse(Int, parsed_args["k"])
n = parse(Int, parsed_args["n"])
t = parse(Int, parsed_args["t"])

a = 2 * rand(m, k) - 1
b = 2 * rand(k, n) - 1

function axb(a::Array{Float64,2}, b::Array{Float64,2}, t::Int64)
for i = 1:t
c = a * b
end
end

function atxb(a::Array{Float64,2}, b::Array{Float64,2}, t::Int64)
for i = 1:t
c = a' * b
end
end

function axbt(a::Array{Float64,2}, b::Array{Float64,2}, t::Int64)
for i = 1:t
c = a * b'
end
end

function atxbt(a::Array{Float64,2}, b::Array{Float64,2}, t::Int64)
for i = 1:t
c = a' * b'
end
end


println("JIT Compilation:")
@time axb(reshape([1.0], 1, 1), reshape([1.0], 1, 1), 1);
@time atxb(reshape([1.0], 1, 1), reshape([1.0], 1, 1), 1);
@time axbt(reshape([1.0], 1, 1), reshape([1.0], 1, 1), 1);
@time atxbt(reshape([1.0], 1, 1), reshape([1.0], 1, 1), 1);

println("Full run:")

tstart = time()
@time axb(a, b, t)
tend = time()
duration = tend - tstart
mflops = (2 * m * n * k) / (duration / t) * 1.0e-6
@printf("a x b\t%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)

tstart = time()
@time atxb(a', b, t)
tend = time()
duration = tend - tstart
mflops = (2 * m * n * k) / (duration / t) * 1.0e-6
@printf("a' x b\t%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)

tstart = time()
@time axbt(a, b', t)
tend = time()
duration = tend - tstart
mflops = (2 * m * n * k) / (duration / t) * 1.0e-6
@printf("a x b'\t%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)

tstart = time()
@time atxbt(a', b', t)
tend = time()
duration = tend - tstart
mflops = (2 * m * n * k) / (duration / t) * 1.0e-6
@printf("a' x b'\t%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)

# old version of a' x b'
tstart = time()
for i = 1:t
c = a' * b'
end
tend = time()
duration = (tend - tstart) / t
mflops = (2 * m * n * k) / duration * 1.0e-6
@printf("old a' x b'\t%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, 
mflops)





Re: [julia-users] Simultaneous audio playback / recording.

2016-01-18 Thread Spencer Russell
AudioIO is going to be deprecated soon in favor of a family of packages 
that are each a bit more focused and simpler to interface with. They’re not 
quite release-ready but have been making a lot of progress lately, and I wanted 
folks to know what’s coming before you sink a bunch of time into working with 
the AudioIO implementation. Currently there’s a mostly working JACK library, 
and I’ll probably port over the PortAudio support from AudioIO after that.

-s

> On Jan 17, 2016, at 1:30 PM, CrocoDuck O'Ducks  
> wrote:
> 
> Hi there!
> 
> I have a number MATLAB scripts that I use for electro-acoustic measurements 
> (mainly impulse responses) and I would like to port them to JULIA. I also 
> written the data acquisition in MATLAB. It works by streaming the test signal 
> to the soundcard outputs while recording from the soundcard inputs. I would 
> like to implement that as well. I was looking at AudioIO but there are few 
> issues:
> 
> I cannot find documentation/examples on how to record from soundcard input.
> I cannot find documentation/examples on selecting the audio device to use.
> I cannot find documentation/examples on setting sampling variables (it is in 
> my best interest to use the highest sample rate available).
> Are these things possible with JULIA? I think I can use aplayer and arecord 
> through JULIA (I am on Linux), but I was wishing to have all the code 
> contained within JULIA.
> 
> Many thanks, I hope you can point me in the right direction.
> 



[julia-users] Re: Using MNE with Julia using PyCall

2016-01-18 Thread Steven G. Johnson


On Monday, January 18, 2016 at 6:03:15 PM UTC-5, Rob wrote:
>
> I am trying to use the python MNE library in Julia.
>
> When I call the python function it returns a `Dict{Any,Any}` instead of a 
> type `info`, when I pass this variable back to another python function I 
> get the error 
>

You can use the lower-level "pycall" function to have more control over the 
return type and any conversions.

e.g. you can do

 pycall(somepythonfunction, PyObject, args)

to return a "raw" Python object with no conversions. 


Re: [julia-users] Bug in eigfact and svd for large odd-size matrices

2016-01-18 Thread Steven White
Yes, it looks like the same thing.  If I do norm(zeros(129,129)) on my 
machine, I also get an Abort.  Also eigfact(zeros(129,129)).

My machine is a Mac:
versioninfo()
Julia Version 0.4.2
Commit bb73f34 (2015-12-06 21:47 UTC)
Platform Info:
  System: Darwin (x86_64-apple-darwin13.4.0)
  CPU: Intel(R) Core(TM) i7-2600 CPU @ 3.40GHz
  WORD_SIZE: 64
  BLAS: libopenblas (USE64BITINT DYNAMIC_ARCH NO_AFFINITY Sandybridge)
  LAPACK: libopenblas64_
  LIBM: libopenlibm
  LLVM: libLLVM-3.3


On Monday, January 18, 2016 at 3:13:52 PM UTC-8, Kevin Squire wrote:
>
> Hi Steven,
>
> This seems to be a problem with specific machines--I don't run into it, 
> for example.
>
> Are you on a Mac?  If so, check out 
> https://github.com/JuliaLang/julia/issues/14507 and 
> https://github.com/staticfloat/homebrew-julia/issues/194 and see if 
> they're the same issue.  If they are, can you comment there?  (Additional 
> comments on open issues sometimes helps them get resolved faster)
>
> Cheers,
>Kevin
>
> On Mon, Jan 18, 2016 at 2:57 PM, Steven White  > wrote:
>
>> I discovered this with matrices that were not random, but this simple test
>> illustrates the problem:   julia aborts when it tries to diagonalize a 
>> 705x705 matrix
>>
>> mat = rand(704,704)
>> mat = mat' + mat
>> e = eigfact(mat)
>> @show "done 704"
>>
>> mat = rand(705,705)
>> mat = mat' + mat
>> e = eigfact(mat)
>> @show "done 705"
>>
>> Results:
>> "done 704" = "done 704"
>> Abort
>>
>> I also get this when I try an svd:
>> mat = rand(705,705)
>> svd(mat)
>>
>> This gives 
>> Abort
>>
>> So, to find what the key problem sizes are:
>> for i=1:705
>>mat = rand(i,i)
>>svd(mat)
>>@show i
>>  end
>>
>> This dies on i=129.   Note that 129 is odd.
>> But replacing the for loop to count by twos is fine:
>> for i=2:2:1705
>>mat = rand(i,i)
>>svd(mat)
>>@show i
>>  end
>> No problem here. 
>>
>>
>

[julia-users] CurveFit Package. Why does the code below not work?

2016-01-18 Thread jmarcellopereira


Code 

x = [0.0 0.2 0.4 1.0 1.6 1.8 2.0 2.6 2.8 3.0 3.8 4.8 5.0 5.2 6.0 6.2 7.4 
7.6 7.8 8.6 8.8 9.0 9.2 9.4 10.0 10.6 10.8 11.2 11.6 11.8 12.2 12.4];
y = [-0.183 -0.131 0.027 0.3 0.579 0.853 0.935 1.133 1.269 1.102 1.092 
1.143 0.811 0.91 0.417 0.46 -0.516 -0.334 -0.504 -0.946 -0.916 -0.975 
-1.099 -1.113 -1.297 -1.234 -0.954 -1.122 -0.609 -0.593 -0.403 -0.51];


x0 = vec(x)
y0 = vec(y)

a = [1.5 1.5 1.0]
eps = 0.0001
maxiter= 200.0

fun(xm,a) = a[1].*sin(a[2].*xm - a[3]);

xy=hcat(x0,y0);

coefs,converged,iter = CurveFit.nonlinear_fit(xy,fun,a,eps,maxiter);



[julia-users] CurveFit Package. Why does the code below not work?

2016-01-18 Thread jmarcellopereira


Code: 

x = [0.0 0.2 0.4 1.0 1.6 1.8 2.0 2.6 2.8 3.0 3.8 4.8 5.0 5.2 6.0 6.2 7.4 
7.6 7.8 8.6 8.8 9.0 9.2 9.4 10.0 10.6 10.8 11.2 11.6 11.8 12.2 12.4];
y = [-0.183 -0.131 0.027 0.3 0.579 0.853 0.935 1.133 1.269 1.102 1.092 
1.143 0.811 0.91 0.417 0.46 -0.516 -0.334 -0.504 -0.946 -0.916 -0.975 
-1.099 -1.113 -1.297 -1.234 -0.954 -1.122 -0.609 -0.593 -0.403 -0.51];


x0 = vec(x)
y0 = vec(y)

a = [1.5 1.5 1.0]
eps = 0.0001
maxiter= 200.0

fun(xm,a) = a[1].*sin(a[2].*xm - a[3]);

xy=hcat(x0,y0);

coefs,converged,iter = CurveFit.nonlinear_fit(xy,fun,a,eps,maxiter);



Re: [julia-users] Using Dot Syntax in Julia

2016-01-18 Thread Bryan Rivera
Yes I am starting to see that - Julia has math-like syntax.

I asked the question on SO and got an answer to use `MacroTools.jl`

So now it's just `@> wallet dotTest!(5)`

I am loving the macro system.  This is what we need to proceed with 
language development.

I am coming from the Scala world, and I see that Julia is very much like 
it, but better in terms of performance and *arguably* worse in terms of 
syntax.

But most syntax issues should be solvable with the use of macros.  Julia is 
young.

In particular, @> clarifies functional operation chains on arrays quite a 
bit. 

Interesting stuff.


On Monday, January 18, 2016 at 12:50:59 PM UTC-5, Stefan Karpinski wrote:
>
> This is usually not what you want to do.
>
> On Sunday, January 17, 2016, Steve Kelly  
> wrote:
>
>> It has always been this way because of multiple dispatch. However you can 
>> do something like:
>>
>> type Wallet
>>   dotTest::Function
>> end
>>
>> Which might have ambiguous performance impact. 
>> On Jan 17, 2016 12:45 PM, "Bryan Rivera"  wrote:
>>
>>> I have seen some code out in the wild that allows us to use dot syntax 
>>> like so:
>>>
>>> function dotTest!(wallet::Wallet, valueToAdd::Int):
>>>
>>> ...  
>>>
>>> end
>>>
>>> wallet = Wallet(100)
>>>
>>> wallet.dotTest!(5)  # Does not work
>>> dotTest!(wallet, 5)  # Works
>>>
>>> However I cannot get it to work, the method is not found because I am 
>>> not passing wallet as the arg.
>>>
>>> So did the language change, or am I doing it wrong?
>>>
>>

Re: [julia-users] Intel Xeon Phi support?

2016-01-18 Thread Chris Rackauckas
Any updates on Julia for the Phi? I know that MKL automatic offload works, 
but am looking for the whole thing. I do research in stochastic dynamical 
systems (easy to parallelize) and have a 5110 working, so I am ready to 
test it once Julia has it!


[julia-users] Re: CurveFit Package. Why does the code below not work?

2016-01-18 Thread Cedric St-Jean
What happens, what error do you get?

On Monday, January 18, 2016 at 9:40:37 PM UTC-5, jmarcell...@ufpi.edu.br 
wrote:
>
>
>
> Code: 
>
> x = [0.0 0.2 0.4 1.0 1.6 1.8 2.0 2.6 2.8 3.0 3.8 4.8 5.0 5.2 6.0 6.2 7.4 
> 7.6 7.8 8.6 8.8 9.0 9.2 9.4 10.0 10.6 10.8 11.2 11.6 11.8 12.2 12.4];
> y = [-0.183 -0.131 0.027 0.3 0.579 0.853 0.935 1.133 1.269 1.102 1.092 
> 1.143 0.811 0.91 0.417 0.46 -0.516 -0.334 -0.504 -0.946 -0.916 -0.975 
> -1.099 -1.113 -1.297 -1.234 -0.954 -1.122 -0.609 -0.593 -0.403 -0.51];
>
>
> x0 = vec(x)
> y0 = vec(y)
>
> a = [1.5 1.5 1.0]
> eps = 0.0001
> maxiter= 200.0
>
> fun(xm,a) = a[1].*sin(a[2].*xm - a[3]);
>
> xy=hcat(x0,y0);
>
> coefs,converged,iter = CurveFit.nonlinear_fit(xy,fun,a,eps,maxiter);
>
>

[julia-users] Re: Dictionary lookup using only the hash value

2016-01-18 Thread vavasis
Jeffrey,

Your interpretation is that the original poster wanted to read or write 
dictionary entries that had already been found via previous hashing and 
searching without again hashing the key (presumably in order to attain 
higher performance).  That was also my guess.  I am interested in this 
issue because Julia currently has a mechanism to refer to a specific object 
inside a container: the Ref{T} type.  My understanding is that this type 
was mainly created in order to build safer Julia/C interfaces, but one 
could imagine using it more generally.  Unfortunately (last time I checked) 
there is a performance problem with Ref{T}: these objects are 
heap-allocated and so shouldn't be created in inner loops.  In addition, as 
far as I know, the implementation has not yet been extended to elements of 
Dict{K,V}.  The sorted containers in DataStructures.jl have a 
token/semitoken type associated with them for the purpose of 
high-performance access and iteration, but the approach taken in 
DataStructures.jl does not generalize to other Julia containers such as the 
standard Dict{K,V}.

The C++ standard containers all have 'iterators' for this purpose.  Maybe 
some thought could be given to extending the Ref{T} framework to 
across-the-board high-performance references to objects inside containers 
in some future version of Julia.

-- Steve

On Monday, January 18, 2016 at 1:22:09 PM UTC-5, Jeffrey Sarnoff wrote:
>
> Steve, 
> afaik, depending upon the hash implementation, the only advantage might be 
> moderately faster lookup but the cost in generality often would outweigh 
> that.
>
> On Thursday, January 14, 2016 at 10:44:01 PM UTC-5, vav...@uwaterloo.ca 
> wrote:
>>
>> Could I ask what would be an application of this capability (if it were 
>> possible)?  -- Steve Vavasis
>>
>> On Wednesday, January 13, 2016 at 3:59:57 PM UTC-5, Ritchie Lee wrote:
>>>
>>> As I understand it, Julia dicts use Base.hash when storing custom types 
>>> as keys.  Is there a way to look up based on the hash alone?
>>>
>>> e.g.,
>>>
>>> type MyType
>>> a::Int
>>> end
>>>
>>> Base.hash(x::MyType) = hash(x.a, hash(MyType))
>>>
>>> x = MyType(1)
>>> D = Dict{MyType, Bool}()
>>> D[x] = true
>>>
>>> Now, is there a way to lookup using h only? 
>>>
>>> h = hash(x)
>>> D[h]
>>>
>>> Is this safe to do?
>>>
>>> Thanks!
>>>
>>
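
A hedged sketch of one workaround for the question quoted just above: if lookups really must be driven by a precomputed hash, the hash itself can be used as the dictionary key. Names follow the quoted example; note this forfeits the usual `isequal` check, so distinct keys whose hashes collide would clash.

```
type MyType
    a::Int
end

Base.hash(x::MyType, h::UInt) = hash(x.a, hash(MyType, h))

D = Dict{UInt, Bool}()      # keyed by the hash, not by MyType
x = MyType(1)
D[hash(x)] = true

h = hash(x)
D[h]                        # true, found via the hash alone
```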

[julia-users] Playing with Cxx.jl

2016-01-18 Thread Ryuichi YAMAMOTO
Hi all,

I've been playing with Cxx.jl (https://github.com/Keno/Cxx.jl) for a couple 
of months and I'd like to share my work with everyone, in particular with 
those who might be interested in developing Julia wrappers for C++ libraries.

Most of my time playing with Cxx has gone into developing Julia packages for 
computer vision:

   - https://github.com/r9y9/OpenCV.jl
   - https://github.com/r9y9/Libfreenect2.jl
   - https://github.com/r9y9/PCL.jl
   
If you are interested in any of those, please check out the repository. I 
think you can easily try these packages once you are able to build Cxx.
I'm working on PCL.jl right now, so you might find the most fun code in it. 

Let me show an example on top of these packages; Real-time point cloud 
visualization using Kinect v2:
https://www.youtube.com/watch?v=rGdsNoK3n9Q
I never thought I was able to do that without Cxx. I think it's cool :)

Fortunately, Julia now officially supports LLVM 3.7.1, so building Cxx is 
getting easier than before (making Cxx work properly used to be a bit hard, 
since it depends on development versions of LLVM, Clang and Julia, and 
upstream changes often broke it). I think it's time to start playing with Cxx. 
It's really fun.
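
As a flavour of what this enables, here is a minimal sketch, assuming a working Cxx.jl build; `cxx"""` and `@cxx` are the package's string and call macros, and `cxx_hypot` is just an illustrative name:

```
using Cxx

# Define a small C++ function inline...
cxx"""
#include <cmath>
double cxx_hypot(double a, double b) { return std::hypot(a, b); }
"""

# ...and call it from Julia.
@cxx cxx_hypot(3.0, 4.0)   # 5.0
```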

Cheers,
Ryuichi


Re: [julia-users] @everywhere using Images gives error

2016-01-18 Thread Abhinanda Ranjit
Hi Tim, 

Thanks for replying. 
All nodes have the 'Images' and 'ImageMagick' packages installed and working. 
I am able to load images on all nodes locally.
This issue occurs only when loading the Images package on a remote system. 
A lot of warnings appear when Images is loaded.

These are the warnings I get after adding the node and loading Images 
everywhere.

julia> @everywhere using Images
WARNING: replacing module Images
WARNING: Method definition width(AbstractArray) in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\core.jl:1190
overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\core.jl:1190.
WARNING: Method definition height(AbstractArray) in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\core.jl:1196
 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\core.jl:1196.
WARNING: Method definition ufixed12(AbstractArray{#C<:ColorTypes.Colorant, 
N<:Any}) in module Images at C:\Users\abhinanda.ranjit\
.julia\v0.4\Images\src\map.jl:684 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\map.jl:684.
WARNING: Method definition ufixed8(AbstractArray{#C<:ColorTypes.Colorant, 
N<:Any}) in module Images at C:\Users\abhinanda.ranjit\.
julia\v0.4\Images\src\map.jl:684 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\map.jl:684.
WARNING: Method definition ufixed14(AbstractArray{#C<:ColorTypes.Colorant, 
N<:Any}) in module Images at C:\Users\abhinanda.ranjit\
.julia\v0.4\Images\src\map.jl:684 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\map.jl:684.
WARNING: Method definition ufixed16(AbstractArray{#C<:ColorTypes.Colorant, 
N<:Any}) in module Images at C:\Users\abhinanda.ranjit\
.julia\v0.4\Images\src\map.jl:684 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\map.jl:684.
WARNING: Method definition ufixed10(AbstractArray{#C<:ColorTypes.Colorant, 
N<:Any}) in module Images at C:\Users\abhinanda.ranjit\
.julia\v0.4\Images\src\map.jl:684 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\map.jl:684.
WARNING: Method definition red(AbstractArray{#CV<:ColorTypes.Color, 
N<:Any}) in module Images at C:\Users\abhinanda.ranjit\.julia\
v0.4\Images\src\algorithms.jl:209 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\algorithms.jl:2
09.
WARNING: Method definition red(AbstractArray) in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\algorithms.jl:2
18 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\algorithms.jl:218.
WARNING: Method definition blue(AbstractArray{#CV<:ColorTypes.Color, 
N<:Any}) in module Images at C:\Users\abhinanda.ranjit\.julia
\v0.4\Images\src\algorithms.jl:209 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\algorithms.jl:
209.
WARNING: Method definition blue(AbstractArray) in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\algorithms.jl:
218 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\algorithms.jl:218.
WARNING: Method definition green(AbstractArray{#CV<:ColorTypes.Color, 
N<:Any}) in module Images at C:\Users\abhinanda.ranjit\.juli
a\v0.4\Images\src\algorithms.jl:209 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\algorithms.jl
:209.
WARNING: Method definition green(AbstractArray) in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\algorithms.jl
:218 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\algorithms.jl:218.
WARNING: Method definition mimewritable(Base.Multimedia.MIME{:image/png}, 
AbstractArray{#C<:ColorTypes.Colorant, N<:Any}) in modul
e Images at C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\writemime.jl:7 
overwritten in module Images at C:\Users\abhinanda.ran
jit\.julia\v0.4\Images\src\writemime.jl:7.
WARNING: Method definition writemime(Base.IO, 
Base.Multimedia.MIME{:image/png}, AbstractArray{#C<:ColorTypes.Colorant, 
N<:Any}) in
 module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\writemime.jl:38 
overwritten in module Images at C:\Users\abhina
nda.ranjit\.julia\v0.4\Images\src\writemime.jl:38.
WARNING: Method definition writemime(Array, Base.IO, 
Base.Multimedia.MIME{:image/png}, AbstractArray{#C<:ColorTypes.Colorant, 
N<:A
ny}) in module Images overwritten in module Images.
WARNING: Method definition zero(Type{Graphics.Vec2}) in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\edge.jl:
356 overwritten in module Images at 
C:\Users\abhinanda.ranjit\.julia\v0.4\Images\src\edge.jl:356.
WARNING: Method definition float64(AbstractArray{#C<:ColorTypes.Colorant, 
N<:Any}) in module Images at C:\Users\abhinanda.ranjit\.
julia\v0.4\Images\src\map.jl:684 overwritten in module Images at 

Re: [julia-users] Multiple Dispatch Question

2016-01-18 Thread Erik Schnetter
Define the second function like this:
```
my_func(fcn1::Function, passedIn::Float64) = my_func(fcn1,
default_fcn, passedIn)
```

-erik


On Mon, Jan 18, 2016 at 4:02 PM, Christopher Alexander
 wrote:
> Hello all, I had a question concerning a best practice in a particular case
> of multiple dispatch which is as follows.
>
> Let's say I have a function with two different methods.
>
> function my_func(fcn1::Function,fcn2::Function, passedIn::Float64)
>  x = 0.0
>  y = 1.0
>  z = 2.0
>  val1 = fcn(x, y, passedIn)
>  val2 = fcn2(y, z, passedIn)
>  return val1, val2
> end
>
> function my_func(fcn1::Function, passedIn::Float64)
>  x = 0.0
>  y = 1.0
>  z = 2.0
>  val1 = fcn(x, y, passedIn)
>  val2 = default_fcn(x, y, z, passedIn)
>  return val1, val2
> end
>
> My question is basically, what would be the best way to do this without
> massive code duplication?  The actual situation I am working with has much
> more going on in the function, so it's not like I could create some init
> function to set up x, y, & z.  But literally the only different behavior
> between the two methods is whether or not a second function is passed in.
>
> Thanks!
>
> Chris
>



-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


[julia-users] Re: Problem with types used in macro from a different module!!

2016-01-18 Thread Cedric St-Jean
We're missing your `decompose_function` so I couldn't run it, but from the 
last time I worked with function-definition macros, I believe the correct 
way to write it should be to add `esc` to the arguments so that it doesn't 
apply hygiene, this way:

function $(esc(f))($(map(esc, args)...))

Unfortunately, last I checked there was a bug in Julia that prevented this, 
and instead I have to cancel hygiene on the whole expression (and use 
`gensym` where appropriate)

esc(:(function $f($(args...))
...)

Does that solve your issue?
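
For concreteness, a minimal self-contained sketch of the whole-expression escape described above; `transform` here is a hypothetical stand-in for whatever rewrite the real `_modify` performs:

```
transform(body) = body    # identity rewrite, purely for illustration

macro modify(expr)
    @assert expr.head == :function
    sig, body = expr.args[1], expr.args[2]
    # Escaping the whole definition makes `f` and the argument types
    # resolve in the calling module, not in the macro's module.
    esc(:(function $sig
        $(transform(body))
    end))
end

@modify function f(x::Int)
    x + 1
end

f(41)   # 42
```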

On Monday, January 18, 2016 at 2:33:00 PM UTC-5, Julia Tylors wrote:
>
> #
> #IMPLEMENTATION CODE:
> #
>
> module Y
> export modify
>
> """
> A LOT OF CODE HERE, REMOVED FOR SIMPLICITY
> """
>
> function _modify(expr)
> is_lambda,f,args,body = decompose_function(expr)
> if is_lambda
> quote
> ($(args...)) -> $(transform(body))
> end
> else
> quote
> function $(esc(f))($(args...))
>   $(transform(body))
> end
> end
> end
> end
>
> macro modify(func)
> _modify(func)
> end
> end
>
> #
> #USAGE CODE:
> #
>
> module X
> using Y
> type Arg{T}
> x::T
> end
> println(macroexpand( quote 
> @modify function f(x::Arg{Bool})
>  y = 12
>  if x
>y+1
>  else
>y-1
> end
> end
> end))
> end
>
> #
> #GENERATED CODE:
> #
>
> begin
> begin 
> function f(#22#x::Y.Arg{Y.Bool}) 
> begin  
> #21#y = 12 # 
> begin 
> "LOTS OF CODE HERE"
> end
> end
> end
> end
> end
>
> #
> #QUESTION
> #
>
> In the generated code, the argument Arg{Bool} is treated as if it were an 
> element of module Y, but it is actually an element of module X. So when the code 
> executes, the interpreter complains and says:
> ERROR: LoadError: UndefVarError: Arg not defined
>
> Why doesn't $(args...) in the implementation code work?
>
> Thanks
>
>
>

Re: [julia-users] Julia way of filling columns of a matrix

2016-01-18 Thread Júlio Hoffimann
That is a good catch Kevin, thanks!

2016-01-18 12:53 GMT-08:00 Kevin Squire :

> As long as each row has a fixed, known size, you could do
>
> a = []
> for i=1:n
> append!(a, [1,2,3])
> end
> A = reshape(a, 3, n)
>
> The 1-D array grows as needed, and reshape still points to the original
> data, so no copying is done.
>
> Cheers,
>Kevin
>
> On Mon, Jan 18, 2016 at 12:48 PM, Júlio Hoffimann <
> julio.hoffim...@gmail.com> wrote:
>
>> Yes, I will rely on the classical hcat() approach...
>>
>> A = []
>> for i=1:n
>>   push!(A, [1,2,3])
>> end
>> A = hcat(A...)
>>
>> Thank you.
>>
>> 2016-01-18 11:42 GMT-08:00 Júlio Hoffimann :
>>
>>> Hi,
>>>
>>> Suppose I want to fill the columns of a matrix which size I don't know
>>> beforehand:
>>>
>>> A = zeros(3,0)
>>> for i=1:n
>>>   A = [A [1,2,3]]
>>> end
>>>
>>> Is there a memory efficient way of doing that in Julia?
>>>
>>> I understand that the above syntax is allocating 3*i entries at
>>> iteration i which gives 3*(1+2+...+n) = 3*(n+1)n/2 allocations as opposed
>>> to 3n.
>>>
>>> -Júlio
>>>
>>
>>
>


[julia-users] Re: Examples of integrating Fortran code in Julia

2016-01-18 Thread Pieterjan Robbe
I think you need to specify the full path that points to the library (the 
.dll), not just its name.
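
A hedged sketch of that suggestion, applied to the gfortran example posted later in this thread; the path is a placeholder, and a `Cint` return type for Fortran's default `integer` is an assumption:

```
# Point ccall at the full path of the DLL instead of relying on the search path.
const gfortlib = "C:\\path\\to\\gfortlib.dll"   # placeholder: adjust to the real location

five = ccall((:__m_MOD_five, gfortlib), Cint, ())
```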

[julia-users] Re: strange error

2016-01-18 Thread Fabrizio Lacalandra
Hi Kevin,

thanks, the code is way too complicated to post, but I think I have isolated the 
issue; it seems to be not pure Julia but JuMP. I'll try to involve the developers.

cheers,
Fabrizio

On Monday, January 18, 2016 at 7:39:39 PM UTC+1, Fabrizio Lacalandra wrote:
>
> Does anyone know what this message below can mean? Apparently the same 
> code runs with a previous version of Julia, 0.4.0 I think. Now I am running 
> the latest 0.4.3
> The line error in MyNetworkcns.jl is fake as that is simply the last line 
> of the file itself
>
>
> Thanks 
>
> Fabrizio
>
> ERROR: LoadError: LoadError: AssertionError: x.head == :escape
>  in include at boot.jl:261
>  in include_from_node1 at loading.jl:304
>  in include at boot.jl:261
>  in include_from_node1 at loading.jl:304
> while loading E:\CodiciSorgente\ProgrammiJulia\UCOTS\MyNetworkcns.jl, in 
> expression starting on line 159
>


[julia-users] Bug in eigfact and svd for large odd-size matrices

2016-01-18 Thread Steven White
I discovered this with matrices that were not random, but this simple test
illustrates the problem:   julia aborts when it tries to diagonalize a 
705x705 matrix

mat = rand(704,704)
mat = mat' + mat
e = eigfact(mat)
@show "done 704"

mat = rand(705,705)
mat = mat' + mat
e = eigfact(mat)
@show "done 705"

Results:
"done 704" = "done 704"
Abort

I also get this when I try an svd:
mat = rand(705,705)
svd(mat)

This gives 
Abort

So, to find what the key problem sizes are:
for i=1:705
   mat = rand(i,i)
   svd(mat)
   @show i
 end

This dies on i=129.   Note that 129 is odd.
But replacing the for loop to count by twos is fine:
for i=2:2:1705
   mat = rand(i,i)
   svd(mat)
   @show i
 end
No problem here. 



Re: [julia-users] Bug in eigfact and svd for large odd-size matrices

2016-01-18 Thread Kevin Squire
Hi Steven,

This seems to be a problem with specific machines--I don't run into it, for
example.

Are you on a Mac?  If so, check out
https://github.com/JuliaLang/julia/issues/14507 and
https://github.com/staticfloat/homebrew-julia/issues/194 and see if they're
the same issue.  If they are, can you comment there?  (Additional comments
on open issues sometimes helps them get resolved faster)

Cheers,
   Kevin

On Mon, Jan 18, 2016 at 2:57 PM, Steven White  wrote:

> I discovered this with matrices that were not random, but this simple test
> illustrates the problem:   julia aborts when it tries to diagonalize a
> 705x705 matrix
>
> mat = rand(704,704)
> mat = mat' + mat
> e = eigfact(mat)
> @show "done 704"
>
> mat = rand(705,705)
> mat = mat' + mat
> e = eigfact(mat)
> @show "done 705"
>
> Results:
> "done 704" = "done 704"
> Abort
>
> I also get this when I try an svd:
> mat = rand(705,705)
> svd(mat)
>
> This gives
> Abort
>
> So, to find what the key problem sizes are:
> for i=1:705
>mat = rand(i,i)
>svd(mat)
>@show i
>  end
>
> This dies on i=129.   Note that 129 is odd.
> But replacing the for loop to count by twos is fine:
> for i=2:2:1705
>mat = rand(i,i)
>svd(mat)
>@show i
>  end
> No problem here.
>
>


[julia-users] Re: Simultaneous audio playback / recording.

2016-01-18 Thread CrocoDuck O'Ducks
Thanks for the tips. I guess this is a sign of destiny: time for me to look 
deep into PortAudio.

On Sunday, 17 January 2016 22:01:04 UTC, STAR0SS wrote:
>
> When dealing with small packages you often need to look at the code, 
> because the documentation is sometimes lacking.
>
> AudioIO.jl uses the C library PortAudio, so in theory anything that can be 
> done with PortAudio can be done in Julia, you need
> to have the right wrappers for the C functions. It seems AudioIO.jl 
> implementation isn't complete, so probably some things cannot
> be done currently without getting your hands dirty (meaning reading the 
> Julia code and the PortAudio doc and trying to understand what's going on).
>
> if you look at the constructor of PortAudioStream you can see you can change 
> the sampling rate there (the PortAudioStream can then be passed to play, it 
> seems); 
> however, it calls the default stream (Pa_OpenDefaultStream), so I'm not sure 
> you can select the audio device.
>
> There's also a get_portaudio_devices function, but it's not used anywhere 
> it seems (you can search the repository to see where things are used)
>
>
> https://github.com/ssfrr/AudioIO.jl/blob/26fd1fdf232fbe8a0115203f8c253c8cff7a0827/src/portaudio.jl#L53
>
> I don't know much about PortAudio, so take that with a grain of salt, I'm 
> just guessing.
>


Re: [julia-users] Matrix multiplication - A' x B' much slower than A x B, or A' x B, or A' x B'

2016-01-18 Thread Tim Holy
I can't reproduce this at the REPL or if I put your loop in a function. (I can 
replicate your result---in my case a factor of 3---if I run your scripts.) But 
since you're also measuring JIT-compiling time, I'm not sure how seriously to 
take that (and it's kinda irrelevant anyway, since you never, ever do anything 
performance sensitive outside a function).

Check out the performance tips page:
http://docs.julialang.org/en/stable/manual/performance-tips/
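
A minimal sketch of the kind of function-wrapped, warmed-up timing suggested here, so compilation is excluded from the measurement; the matrix size and the A' x B' case match the scripts quoted below, with `randn` standing in for the scripts' scaled `rand`:

```
function bench(a, b, t)
    c = a' * b'              # warm-up call so JIT compilation is not timed
    tstart = time()
    for i = 1:t
        c = a' * b'
    end
    (time() - tstart) / t
end

a = randn(1000, 1000)
b = randn(1000, 1000)
@show bench(a, b, 20)
```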

--Tim

On Monday, January 18, 2016 10:17:26 AM Franco Venturi wrote:
> I wrote the Julia scripts below that just test the performance of the
> matrix multiplication in four cases (A' stands for A transpose, and B'
> stands for B transpose):
> 
>1. A x B
>2. A' x B
>3. A x B'
>4. A' x B'
> 
> I ran these scripts multiplying two 1,000 x 1,000 matrices 20 times, i.e.:
> 
> 
>1. axb.jl -n 1000 -k 1000 -m 1000 -t 20
>2. atxb.jl -n 1000 -k 1000 -m 1000 -t 20
>3. axbt.jl -n 1000 -k 1000 -m 1000 -t 20
>4. atxbt.jl -n 1000 -k 1000 -m 1000 -t 20
> 
> Cases 1, 2, 3 run in about the same time with a performance of about 70
> GFLOPS on my desktop, however case 4 is about 10 times slower.
> I run Linux Fedora 23, julia version 0.4.3, and openblas version 0.2.15.
> I also compiled Julia directly from the latest version on GitHub, used
> Intel MKL, and the results are similar (the absolute times are different,
> but the A' x B' case is still about 10 times slower than the other three
> cases).
> 
> Since Julia uses openblas/MKL function 'dgemm' (or 'sgemm' in the single
> precision case), I also wrote the same examples in C directly invoking
> openblas/MKL 'cblas_dgemm' function and there's no difference in these four
> cases using the native calls (the arguments for the function 'cblas_dgemm'
> have transpose flags for both A and B).
> 
> I am not sure why Julia is so much slower multiplying A' x B', so I thought
> I would bring it to your attention.
> 
> Regards,
> Franco Venturi
> 
> #!/bin/julia -q
> 
> using ArgParse
> 
> s = ArgParseSettings()
> 
> @add_arg_table s begin
> "-m"
> "-k"
> "-n"
> "-t"
> end
> 
> parsed_args = parse_args(s)
> m = parse(Int, parsed_args["m"])
> k = parse(Int, parsed_args["k"])
> n = parse(Int, parsed_args["n"])
> t = parse(Int, parsed_args["t"])
> 
> a = 2 * rand(m, k) - 1
> b = 2 * rand(k, n) - 1
> 
> tstart = time()
> for i = 1:t
> c = a * b
> end
> tend = time()
> duration = (tend - tstart) / t
> mflops = (2 * m * n * k) / duration * 1.0e-6
> @printf("%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)
> 
> 
> #!/bin/julia -q
> 
> using ArgParse
> 
> s = ArgParseSettings()
> 
> @add_arg_table s begin
> "-m"
> "-k"
> "-n"
> "-t"
> end
> 
> parsed_args = parse_args(s)
> m = parse(Int, parsed_args["m"])
> k = parse(Int, parsed_args["k"])
> n = parse(Int, parsed_args["n"])
> t = parse(Int, parsed_args["t"])
> 
> a = (2 * rand(m, k) - 1)'
> b = 2 * rand(k, n) - 1
> 
> tstart = time()
> for i = 1:t
> c = a' * b
> end
> tend = time()
> duration = (tend - tstart) / t
> mflops = (2 * m * n * k) / duration * 1.0e-6
> @printf("%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)
> 
> 
> 
> #!/bin/julia -q
> 
> using ArgParse
> 
> s = ArgParseSettings()
> 
> @add_arg_table s begin
> "-m"
> "-k"
> "-n"
> "-t"
> end
> 
> parsed_args = parse_args(s)
> m = parse(Int, parsed_args["m"])
> k = parse(Int, parsed_args["k"])
> n = parse(Int, parsed_args["n"])
> t = parse(Int, parsed_args["t"])
> 
> a = 2 * rand(m, k) - 1
> b = (2 * rand(k, n) - 1)'
> 
> tstart = time()
> for i = 1:t
> c = a * b'
> end
> tend = time()
> duration = (tend - tstart) / t
> mflops = (2 * m * n * k) / duration * 1.0e-6
> @printf("%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)
> 
> 
> 
> #!/bin/julia -q
> 
> using ArgParse
> 
> s = ArgParseSettings()
> 
> @add_arg_table s begin
> "-m"
> "-k"
> "-n"
> "-t"
> end
> 
> parsed_args = parse_args(s)
> m = parse(Int, parsed_args["m"])
> k = parse(Int, parsed_args["k"])
> n = parse(Int, parsed_args["n"])
> t = parse(Int, parsed_args["t"])
> 
> a = (2 * rand(m, k) - 1)'
> b = (2 * rand(k, n) - 1)'
> 
> tstart = time()
> for i = 1:t
> c = a' * b'
> end
> tend = time()
> duration = (tend - tstart) / t
> mflops = (2 * m * n * k) / duration * 1.0e-6
> @printf("%dx%dx%d\t%f s\t%f MFLOPS\n", m, n, k, duration, mflops)



Re: [julia-users] Multiple Dispatch Question

2016-01-18 Thread Christopher Alexander
Thanks for the response!  As a follow-up, what would I do in a situation 
where the passed-in second function (fcn2) and the default function take a 
different number of arguments?

Thanks!

Chris

On Monday, January 18, 2016 at 4:06:41 PM UTC-5, Erik Schnetter wrote:
>
> Define the second function like this: 
> ``` 
> my_func(fcn1::Function, passedIn::Float64) = my_func(fcn1, 
> default_fcn, passedIn) 
> ``` 
>
> -erik 
>
>
> On Mon, Jan 18, 2016 at 4:02 PM, Christopher Alexander 
>  wrote: 
> > Hello all, I had a question concerning a best practice in a particular 
> case 
> > of multiple dispatch which is as follows. 
> > 
> > Let's say I have a function with two different methods. 
> > 
> > function my_func(fcn1::Function,fcn2::Function, passedIn::Float64) 
> >  x = 0.0 
> >  y = 1.0 
> >  z = 2.0 
> >  val1 = fcn(x, y, passedIn) 
> >  val2 = fcn2(y, z, passedIn) 
> >  return val1, val2 
> > end 
> > 
> > function my_func(fcn1::Function, passedIn::Float64) 
> >  x = 0.0 
> >  y = 1.0 
> >  z = 2.0 
> >  val1 = fcn(x, y, passedIn) 
> >  val2 = default_fcn(x, y, z, passedIn) 
> >  return val1, val2 
> > end 
> > 
> > My question is basically, what would be the best way to do this without 
> > massive code duplication?  The actual situation I am working with has 
> much 
> > more going on in the function, so it's not like I could create some init 
> > function to set up x, y, & z.  But literally the only different behavior 
> > between the two methods is whether or not a second function is passed 
> in. 
> > 
> > Thanks! 
> > 
> > Chris 
> > 
>
>
>
> -- 
> Erik Schnetter  
> http://www.perimeterinstitute.ca/personal/eschnetter/ 
>


[julia-users] Executing anonymous function on worker fails when wrapped in a module

2016-01-18 Thread 'Greg Plowman' via julia-users
I'm trying to execute an anonymous function on a worker from within a 
module:

getCores(pid) = remotecall_fetch(pid, ()->CPU_CORES)

module Banana
export getCores2
getCores2(pid) = remotecall_fetch(pid, ()->CPU_CORES)
end



Firstly, is using an anonymous function, ()->CPU_CORES, as above a good way to 
return a global variable from a worker?

I can execute getCores successfully, but getCores2 fails:
Is it possible to "escape" the anonymous function in some way?


julia> getCores(2)
4

julia> using Banana

julia> getCores2(2)
WARNING: Module Banana not defined on process 2
fatal error on 2: ERROR: UndefVarError: Banana not defined
 in deserialize at serialize.jl:504
 in handle_deserialize at serialize.jl:465
 in deserialize at serialize.jl:696
 in deserialize_datatype at serialize.jl:651
 in handle_deserialize at serialize.jl:465
 in deserialize_expr at serialize.jl:627
 in handle_deserialize at serialize.jl:458
 in deserialize_expr at serialize.jl:627
 in handle_deserialize at serialize.jl:458
 in deserialize_expr at serialize.jl:627
 in handle_deserialize at serialize.jl:458
 in deserialize at serialize.jl:556
 in handle_deserialize at serialize.jl:465
 in deserialize at serialize.jl:538
 in handle_deserialize at serialize.jl:465
 in deserialize at serialize.jl:696
 in deserialize_datatype at serialize.jl:651
 in handle_deserialize at serialize.jl:465
 in message_handler_loop at multi.jl:863
 in process_tcp_streams at multi.jl:852
 in anonymous at task.jl:63
Worker 2 terminated.ERROR: ProcessExitedException()
 in yieldto at task.jl:71
 in wait at task.jl:371
 in wait at task.jl:286
 in wait at channels.jl:93
 in take! at channels.jl:82
 in take! at multi.jl:804
 in remotecall_fetch at multi.jl:730
 in getCores2 at none:3

ERROR (unhandled task failure): EOFError: read end of file
Julia>
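
A hedged, untested sketch of one possible workaround: rather than shipping a closure owned by the module, ask the worker to evaluate an expression in its own Main. The 0.4-style remotecall_fetch argument order is assumed, matching the code above; `Banana2` and `getCores3` are illustrative names.

```
module Banana2
export getCores3
# eval and Main already exist on every worker, so nothing from this
# module has to be serialized along with the remote call.
getCores3(pid) = remotecall_fetch(pid, eval, Main, :(CPU_CORES))
end

using Banana2
getCores3(2)    # assumes worker 2 has been added
```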




[julia-users] Using MNE with Julia using PyCall

2016-01-18 Thread Rob
I am trying to use the python MNE library in Julia.

When I call the Python function, it returns a `Dict{Any,Any}` instead of an 
`Info` object; when I pass this variable back to another Python function I 
get the error 

ERROR: PyError (:PyObject_Call) 
> TypeError("info must be an instance of Info, not ",)


How can I keep the PyCall result as the `info` type and not have it 
converted to a Julia dict?
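
A hedged sketch of the usual fix: tell `pycall` the return type you want, and request `PyObject` to suppress the automatic conversion to a Julia `Dict`. The `mne.create_info` call and its arguments are only illustrative assumptions; the point is the explicit `PyObject` return type.

```
using PyCall

mne = pyimport("mne")          # a PyObject wrapping the mne module

# Asking pycall for a PyObject return type keeps the result a Python
# object (an mne Info instance) instead of converting it to Dict{Any,Any}.
info = pycall(mne[:create_info], PyObject, ["ch1", "ch2"], 256.0)
```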


[julia-users] Re: Running a julia script on the web

2016-01-18 Thread hustf
There are many ways to do something like that, and if your users are willing 
to log in with Google accounts, I believe the easiest may be 
running everything on JuliaBox.org.

This snapshot shows how well it can be done: 
 https://jiahao.github.io/julia-blog/2014/06/09/the-colors-of-chemistry.html 


Re: [julia-users] strange error

2016-01-18 Thread Kevin Squire
Hi Fabrizio,

You didn't really provide enough information for anyone to help you here.
Your best bet would be to provide a short snippet of code which has the
problem.

Cheers,
   Kevin

On Mon, Jan 18, 2016 at 10:39 AM, Fabrizio Lacalandra <
fabrizio.lacalan...@gmail.com> wrote:

> Does anyone know what this message below can mean? Apparently the same
> code runs with a previous version of Julia, 0.4.0 I think. Now I am running
> the latest 0.4.3
> The line error in MyNetworkcns.jl is fake as that is simply the last line
> of the file itself
>
>
> Thanks
>
> Fabrizio
>
> ERROR: LoadError: LoadError: AssertionError: x.head == :escape
>  in include at boot.jl:261
>  in include_from_node1 at loading.jl:304
>  in include at boot.jl:261
>  in include_from_node1 at loading.jl:304
> while loading E:\CodiciSorgente\ProgrammiJulia\UCOTS\MyNetworkcns.jl, in
> expression starting on line 159
>


Re: [julia-users] Multiple Dispatch Question

2016-01-18 Thread Cedric St-Jean
my_func(fcn1::Function, passedIn::Float64) =
    my_func(fcn1, (y, z, passedin)->default_fcn(0.0, y, z, passedin), passedIn)

You could achieve the same effect in one definition if you put fcn2 as a 
keyword argument. Also check out FastAnonymous.jl if performance matters.
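
For concreteness, a sketch of that keyword-argument variant; `default_fcn` is assumed to be defined elsewhere, as in the original post:

```
function my_func(fcn1::Function, passedIn::Float64;
                 fcn2::Function = (y, z, p) -> default_fcn(0.0, y, z, p))
    x = 0.0
    y = 1.0
    z = 2.0
    val1 = fcn1(x, y, passedIn)
    val2 = fcn2(y, z, passedIn)
    return val1, val2
end
```

Call sites that don't pass `fcn2` get the default; callers that need something else supply `fcn2 = ...` as a keyword.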

On Monday, January 18, 2016 at 4:25:20 PM UTC-5, Christopher Alexander 
wrote:
>
> Thanks for the response!  As a follow-up, what would I do in a situation 
> where the passed-in second function (fcn2) and the default function take a 
> different number of arguments?
>
> Thanks!
>
> Chris
>
> On Monday, January 18, 2016 at 4:06:41 PM UTC-5, Erik Schnetter wrote:
>>
>> Define the second function like this: 
>> ``` 
>> my_func(fcn1::Function, passedIn::Float64) = my_func(fcn1, 
>> default_fcn, passedIn) 
>> ``` 
>>
>> -erik 
>>
>>
>> On Mon, Jan 18, 2016 at 4:02 PM, Christopher Alexander 
>>  wrote: 
>> > Hello all, I had a question concerning a best practice in a particular 
>> case 
>> > of multiple dispatch which is as follows. 
>> > 
>> > Let's say I have a function with two different methods. 
>> > 
>> > function my_func(fcn1::Function,fcn2::Function, passedIn::Float64) 
>> >  x = 0.0 
>> >  y = 1.0 
>> >  z = 2.0 
>> >  val1 = fcn(x, y, passedIn) 
>> >  val2 = fcn2(y, z, passedIn) 
>> >  return val1, val2 
>> > end 
>> > 
>> > function my_func(fcn1::Function, passedIn::Float64) 
>> >  x = 0.0 
>> >  y = 1.0 
>> >  z = 2.0 
>> >  val1 = fcn(x, y, passedIn) 
>> >  val2 = default_fcn(x, y, z, passedIn) 
>> >  return val1, val2 
>> > end 
>> > 
>> > My question is basically, what would be the best way to do this without 
>> > massive code duplication?  The actual situation I am working with has 
>> much 
>> > more going on in the function, so it's not like I could create some 
>> init 
>> > function to set up x, y, & z.  But literally the only different 
>> behavior 
>> > between the two methods is whether or not a second function is passed 
>> in. 
>> > 
>> > Thanks! 
>> > 
>> > Chris 
>> > 
>>
>>
>>
>> -- 
>> Erik Schnetter  
>> http://www.perimeterinstitute.ca/personal/eschnetter/ 
>>
>

[julia-users] Re: Examples of integrating Fortran code in Julia

2016-01-18 Thread pokerhontas2k8
Hi,

first of all, I am new to Julia (and Fortran). I tried to follow OP's 
example and call Fortran from Julia. First, I was using the Intel Fortran 
compiler and tried to compile the following .f90 file (saved as f90tojl.f90)

module m
contains
   integer function five()
  five = 5
   end function five
end module m

as a shared library, by entering the following in the command line:

ifort f90tojl.f90 -O2 -dll -fPIC -o  ifortlib.dll

the compiler ignores -fPIC, since that's apparently an unknown option, but 
two files are created: an object file and the DLL. I then try to call the 
function from Julia, and that's where the trouble starts. 

The suggested command in this thread doesn't work, because I guess the Intel 
compiler uses different name mangling than the gfortran compiler. So I tried 
to find the name of my function; Wikipedia suggests m_MP_five_ for ifort, 
so I tried 

julia: ccall( (:m_MP_five, "ifortlib.dll"), Int, () )

LoadError: ccall: could not find function m_MP_five in library ifortlib.dll


So my guess is that I am not using the correct name mangling. I couldn't find 
anything online, so I tried to view the function name via the object manager in 
Visual Studio and an external DLL viewer program. 
In Visual Studio I got a meaningless error, and the external viewer just didn't do 
anything (although it worked for other DLL files). When I type 

julia: Libdl.dlopen("ifortlib.dll")
Ptr{Void} @0x2a9d8fa0

I get a valid pointer back, so the library itself does open, fwiw. At this point 
I got so pissed that I decided to install the gfortran 
compiler and just follow this thread step by step. So in the cmd window, I type:

gfortran -shared -O2 f90tojl.f90 -fPIC -o gfortlib.dll

(I get a warning that -fPIC is ignored, as mentioned previously in this thread). 
I use the DLL viewer to determine the name of the function; it's __m_MOD_five 
indeed. Then

julia: ccall( (:__m_MOD_five, "gfortlib.dll"), Int, () )

LoadError: error compiling anonymous: could not load library "gfortlib.dll"
The specified module could not be found.


And 

julia: Libdl.dlopen("gfortlib.dll")
LoadError: could not load library "gfortlib.dll"
The specified module could not be found.


And I have no clue what to do now. 






Re: [julia-users] Julia way of filling columns of a matrix

2016-01-18 Thread Júlio Hoffimann
Yes, I will rely on the classical hcat() approach...

A = []
for i=1:n
  push!(A, [1,2,3])
end
A = hcat(A...)

Thank you.

2016-01-18 11:42 GMT-08:00 Júlio Hoffimann :

> Hi,
>
> Suppose I want to fill the columns of a matrix which size I don't know
> beforehand:
>
> A = zeros(3,0)
> for i=1:n
>   A = [A [1,2,3]]
> end
>
> Is there a memory efficient way of doing that in Julia?
>
> I understand that the above syntax is allocating 3*i entries at iteration
> i which gives 3*(1+2+...+n) = 3*(n+1)n/2 allocations as opposed to 3n.
>
> -Júlio
>


Re: [julia-users] Julia way of filling columns of a matrix

2016-01-18 Thread Kevin Squire
As long as each row has a fixed, known size, you could do

a = []
for i=1:n
append!(a, [1,2,3])
end
A = reshape(a, 3, n)

The 1-D array grows as needed, and reshape still points to the original
data, so no copying is done.
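
A slight variant of the above, as a hedged sketch: giving the accumulator a concrete element type makes the reshaped result a `Matrix{Int}` rather than a `Matrix{Any}`, and `sizehint!` can reserve space up front when `n` is known.

```
n = 4

a = Int[]                 # concretely typed accumulator
sizehint!(a, 3n)          # optional: reserve space up front when n is known
for i = 1:n
    append!(a, [1, 2, 3])
end
A = reshape(a, 3, n)      # 3x4 Matrix{Int}, sharing a's memory
```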

Cheers,
   Kevin

On Mon, Jan 18, 2016 at 12:48 PM, Júlio Hoffimann  wrote:

> Yes, I will rely on the classical hcat() approach...
>
> A = []
> for i=1:n
>   push!(A, [1,2,3])
> end
> A = hcat(A...)
>
> Thank you.
>
> 2016-01-18 11:42 GMT-08:00 Júlio Hoffimann :
>
>> Hi,
>>
>> Suppose I want to fill the columns of a matrix which size I don't know
>> beforehand:
>>
>> A = zeros(3,0)
>> for i=1:n
>>   A = [A [1,2,3]]
>> end
>>
>> Is there a memory efficient way of doing that in Julia?
>>
>> I understand that the above syntax is allocating 3*i entries at iteration
>> i which gives 3*(1+2+...+n) = 3*(n+1)n/2 allocations as opposed to 3n.
>>
>> -Júlio
>>
>
>


[julia-users] Multiple Dispatch Question

2016-01-18 Thread Christopher Alexander
Hello all, I had a question concerning a best practice in a particular case 
of multiple dispatch which is as follows.

Let's say I have a function with two different methods.

function my_func(fcn1::Function,fcn2::Function, passedIn::Float64)
 x = 0.0
 y = 1.0
 z = 2.0
 val1 = fcn1(x, y, passedIn)
 val2 = fcn2(y, z, passedIn)
 return val1, val2
end

function my_func(fcn1::Function, passedIn::Float64)
 x = 0.0
 y = 1.0
 z = 2.0
 val1 = fcn1(x, y, passedIn)
 val2 = default_fcn(x, y, z, passedIn)
 return val1, val2
end

My question is basically, what would be the best way to do this without 
massive code duplication?  The actual situation I am working with has much 
more going on in the function, so it's not like I could create some init 
function to set up x, y, & z.  But literally the only different behavior 
between the two methods is whether or not a second function is passed in.

Thanks!

Chris



Re: [julia-users] Simultaneous audio playback / recording.

2016-01-18 Thread Miguel Bazdresch
An alternative would be to interact with the sound card using sox (
http://sox.sourceforge.net/). In the past, I used sox from Octave to record
and play audio simultaneously. Let me know if you'd like to see the code; I
can probably dig it out of my old backups.
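
A hedged sketch of what driving sox from Julia could look like, assuming sox is installed so its `play` and `rec` front-ends are on the PATH; file names, rate and channel count are placeholders:

```
# Record 5 seconds from the default input while the test signal plays.
recording = `rec -r 48000 -c 2 capture.wav trim 0 5`
playback  = `play sweep.wav`

@async run(recording)   # start capturing in the background...
run(playback)           # ...and stream the sweep to the output
```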

-- mb

On Sun, Jan 17, 2016 at 1:30 PM, CrocoDuck O'Ducks <
crocoduck.odu...@gmail.com> wrote:

> Hi there!
>
> I have a number of MATLAB scripts that I use for electro-acoustic
> measurements (mainly impulse responses) and I would like to port them to
> JULIA. I also wrote the data acquisition in MATLAB. It works by streaming
> the test signal to the soundcard outputs while recording from the soundcard
> inputs. I would like to implement that as well. I was looking at AudioIO
> but there are few issues:
>
>
>- I cannot find documentation/examples on how to record from soundcard
>input.
>- I cannot find documentation/examples on selecting the audio device
>to use.
>- I cannot find documentation/examples on setting sampling variables
>(it is in my best interest to use the highest sample rate available).
>
> Are these things possible with JULIA? I think I can use aplayer and
> arecord through JULIA (I am on Linux), but I was wishing to have all the
> code contained within JULIA.
>
> Many thanks, I hope you can point me in the right direction.
>


[julia-users] Re: Immutable type with a function datatype

2016-01-18 Thread Simon Danisch
Am I missing something, or why isn't this a solution:

@enum Metric RIEMANNIAN LORENTZIAN # ...
immutable Sphere{Metric}
dim::Int
end
function metric(s::Sphere{RIEMANNIAN})

end
function metric(s::Sphere{LORENTZIAN})

end
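
For the original question about a per-instance function field, another commonly suggested option (sketched here, separate from the Sphere example above) is to parameterize the wrapper on the function's concrete type, so the field is not the abstract `Function`:

```
immutable Foo{F}
    func::F
end

foo = Foo(x -> x^2)
foo.func(3)      # 9; func's concrete type is known to the compiler
```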

On Monday, 18 January 2016 16:08:38 UTC+1, Anonymous wrote:
>
> Is the following code considered bad form in Julia?
>
> immutable Foo
> func::Function
> end
>
> foo = Foo(x->x^2)
> foo.func(3)
>
> This mimics the behavior of OOP since just like in OOP the internal method 
> cannot be changed (since the type is immutable).  Sometimes it really does 
> make the most sense to attach a function to an instance of a type, do I 
> take a performance hit doing things this way?
>