Re: [julia-users] Re: Defining a new numeric type with minimal effort

2016-06-10 Thread Kevin Kunzmann
Okay,

so what I want to do is

immutable MyType{T<:Real}
  val::T
  function MyType(x::T)
@assert 0 <= x
@assert x <= 1
new(x)
  end
end
MyType{T<:Real}(x::T) = MyType{T}(x)

So how do I get this thing to "walk", i.e., get "*" functionality? Out of 
the box I would expect a subtype of Real to "inherit" +-*/ in some 
way - what's the point of being a subtype of Real otherwise?

What I would hope for is that I only need to provide a convert method from 
Real back to MyType, so that the operation can be performed on a 
suitable Real and the conversion back then ensures type consistency.

I feel as if I am still missing the central point here...

Best Kevin
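(A sketch of one way the missing pieces could look, in 2016-era Julia syntax; the `convert` methods and the `*` method below are illustrative assumptions, not an established recipe:)

```julia
# Illustrative sketch: give MyType arithmetic by unwrapping to the
# underlying Real, computing there, and re-wrapping; the inner
# constructor re-checks the [0, 1] invariant on every wrap.
immutable MyType{T<:Real}
    val::T
    function MyType(x::T)
        @assert 0 <= x <= 1
        new(x)
    end
end
MyType{T<:Real}(x::T) = MyType{T}(x)

import Base: *, convert

# unwrap: MyType to any Real
convert{T<:Real}(::Type{T}, x::MyType) = convert(T, x.val)
# re-wrap: any Real back to MyType (invariant re-checked by the constructor)
convert{T<:Real}(::Type{MyType{T}}, x::Real) = MyType(convert(T, x))

# one explicit operator; +, -, / would follow the same pattern
(*){T}(a::MyType{T}, b::MyType{T}) = MyType(a.val * b.val)
```

With this, `MyType(0.5) * MyType(0.5)` yields `MyType(0.25)`, and any result outside [0, 1] trips the assertion instead of silently escaping the type.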



Re: [julia-users] Re: Defining a new numeric type with minimal effort

2016-06-10 Thread Kevin Kunzmann
Sorry for spamming,

but let me reduce this to its actual core - forget about probability. What 
I want to do is

immutable MyType{T<:Real}
  val::T
  function MyType(x::T)
    @assert 0 <= x
    @assert x <= 1
    new(x)
  end
end
MyType{T<:Real}(x::T) = MyType{T}(x)



Re: [julia-users] Re: Defining a new numeric type with minimal effort

2016-06-10 Thread Kevin Kunzmann
Hey,

thanks for sticking with me ;)

I am, however, a little bit confused now (seems that oo and parallelism are 
the hardest to grasp in julia).

I see that '+' was a bad example. So my error was that I did not name the 
field correctly, as 'Real' expects it to be named 'val'? Jeez, if this 
is true, then how is one ever going to 'inherit correctly' from a 
complicated abstract type? There is no way of telling which fields are 
accessed, and how, by all the methods operating on that type x)

So, put simply: what is the Julia way of ensuring that I pass a valid 
probability to my function (a figure between 0 and 1)? Please do not tell me 
that I am supposed to use @assert statements x)
I would feel that a new type would be the cleanest way to do so?

Best,

Kevin 


Re: [julia-users] Re: Does it make sense to loop over types?

2016-06-10 Thread Tim Holy
While it seems attractive in principle, in practice this will hurt 
performance. Because of the loop, there is no single call-site with consistent 
types for which the compiler can infer in advance which method you're going to 
call. Consequently, you're forcing julia to do dynamic lookup on each call. If 
you really need to do this, it would be much better if you unrolled that loop 
by hand.

See more detail here:
http://docs.julialang.org/en/latest/manual/performance-tips/#types-with-values-as-parameters
and in the following section:
http://docs.julialang.org/en/latest/manual/performance-tips/#the-dangers-of-abusing-multiple-dispatch-aka-more-on-types-with-values-as-parameters
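A hypothetical illustration of the point, reusing the thread's `foo`/`Val` example (the unrolled variant is one guess at what "unrolling by hand" could look like):

```julia
foo(::Type{Val{1}}) = 1
foo(::Type{Val{2}}) = 2
foo(::Type{Val{3}}) = 3

# The array's element type is the abstract DataType, so each foo(t)
# call site has no single concrete type: dispatch happens at run time.
function bar_dynamic()
    s = 0
    for t in DataType[Val{k} for k in 1:3]
        s += foo(t)
    end
    s
end

# Unrolled by hand: every call site names one concrete type, so the
# compiler can resolve each dispatch in advance.
bar_unrolled() = foo(Val{1}) + foo(Val{2}) + foo(Val{3})
```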

Best,
--Tim

On Friday, June 10, 2016 8:39:35 PM CDT Po Choi wrote:
> Thanks for pointing out the mistakes.
> 
> I want to loop over the data types because I want to loop over functions.
> I need to do something like this:
> function mywork(x, y, z)
>   procedure1(x, y, z)
>   procedure2(x, y, z)
>   procedure3(x, y, z)
>   #...
>   procedure8(x, y, z)
>   procedure9(x, y, z)
>   procedure10(x, y, z)
> end
> 
> I feel it may be silly to write out all the procedurek(x, y, z)
> Then I am thinking about the following:
> function mywork(x, y, z)
>   for k in 1:10
> procedure(::Val{k}, x, y, z)
>   end
> end
> 
> I have not seen anyone doing this before. So I am not sure I am doing
> something right.
> That's why I am asking this question.
> 
> I remember people like to use macro to define functions for different types.
> In my case, should I use macro? Or am I fine to code as above?
> 
> On Friday, June 10, 2016 at 7:34:09 PM UTC-7, Erik Schnetter wrote:
> > Your code won't work. `Val{k}` is a type, which is an object in Julia. You
> > are passing this object to `foo`. Thus in your declaration of `foo`, you
> > need to list the type of `Val`, not just `Val` -- types have types as
> > well:
> > ```
> > foo(::Type{Val{1}}) = 1
> > ```
> > 
> > Of course you know that using `Val` in this way is nonsensical in a real
> > program. I understand that you know this, as you're purposefully
> > experimenting with Julia, but I'd still like to point it out for the
> > casual
> > reader of this example.
> > 
> > Whether you encounter "performance issues" or not depends on what
> > performance you need. If you compare this code to simple arithmetic
> > operations (adding numbers), then it's slower. If you compare it to
> > sending
> > data across the network or accessing the disk, then it's faster.
> > 
> > I assume that calling `foo` in the loop requires a hash table lookup at
> > run time, and likely some memory allocation.
> > 
> > -erik
> > 
> > 
> > On Fri, Jun 10, 2016 at 9:40 PM, Po Choi  > 
> > > wrote:
> >> Oops! I accidentally hit the post button, so the post is not complete!
> >> 
> >> It is an example:
> >> foo(::Val{1}) = 1
> >> foo(::Val{2}) = 2
> >> foo(::Val{2}) = 3
> >> 
> >> function bar()
> >>   s = 0
> >>   for t in Datatype[Val{k} for k in 1:3]
> >>     s += foo(t)
> >>   end
> >> end
> >> 
> >> Will there be any performance issue if I loop over types?
> >> I am still trying to understand how the multiple-dispatch works.
> >> Sometimes I am confused!
> >> 
> >> On Friday, June 10, 2016 at 6:37:31 PM UTC-7, Po Choi wrote:
> >>> foo(::Val{1}) = 1
> >>> foo(::Val{2}) = 2
> >>> foo(::Val{2}) = 3
> >>> 
> >>> function bar()
> >>>   for t in Datatype[Val{k} for k in 1:3]
> >>>   end
> >>> end




Re: [julia-users] Re: testing positive definiteness via Cholesky?

2016-06-10 Thread Tim Holy
Two other advantages of just biting the bullet and implementing this for 
PositiveFactorizations:
- there is no iteration step; the first time you try to factor it, it works.
- when you think about it, the classic approach of adding a diagonal factor is 
(in my humble opinion) wrong, wrong, wrong. See the discussion linked from the 
PositiveFactorizations README.

Best,
--Tim

On Friday, June 10, 2016 5:22:03 PM CDT vava...@uwaterloo.ca wrote:
> Dear Viral and Tim,
> 
> Thanks for your prompt responses to my query.  I should have stated more
> precisely how I am using positive definiteness testing.  The application is
> a classic trust-region method for optimization.  In a trust region method,
> the main operation is as follows.  The input is a symmetric (typically
> sparse) matrix A and a vector b.  The problem is to compute a certain real
> parameter lambda.  One tests if A + lambda*speye(n) is positive definite;
> if so, then one solves a linear system with this coefficient matrix, and if
> not, one increases lambda and tries again.
> 
> So the problem with using isposdef is that, in the case that A is actually
> positive definite, isposdef discards the Cholesky factor, so then my
> application would need to compute it again (redundantly) to solve the
> system.  In the case of the PositiveFactorizations.jl package, it appears
> that the package is aimed at dense rather than sparse matrices.
> 
> So aside from rewriting cholfact, it seems that the only remaining solution
> is Viral's suggestion to catch an exception from cholfact.  This raises
> another question.  Most C++ textbooks advise against using exceptions for
> ordinary control-flow on the grounds that throwing and catching an
> exception is a time-consuming operation.  How about in Julia?  Is it
> reasonable to use try/throw/catch for ordinary control flow in a scientific
> code?  The Julia manual states that exceptions are much slower than if
> statements.  But on the other hand, isposdef in cholmod.jl is written in
> terms of exceptions!
> 
> Thanks,
> Steve
> 
> On Thursday, June 9, 2016 at 10:16:29 PM UTC-4, vav...@uwaterloo.ca wrote:
> > In Matlab to check if a symmetric sparse matrix is positive definite, I
> > can say [R,p]=chol(A) and then if p>0 etc.  Is this functionality
> > available
> > in Julia?  The cholfact standard routine throws an exception if its
> > argument is not positive definite rather than returning any helpful
> > information.
> > 
> > I looked at the code for cholfact in cholmod.jl in Base; it appears that I
> > can write a modified version of cholfact that exposes this functionality.
> > But it would be better if the functionality were available in the library
> > so that my code is not sensitive to changes in undocumented low-level
> > routines.
> > 
> > Thanks,
> > Steve Vavasis
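The try/catch approach under discussion might be sketched like this (illustrative only: the function name, the shift schedule, and the blanket exception handling are assumptions, not anything from the thread):

```julia
# Sketch of a trust-region style shift search: try cholfact, and on
# failure enlarge lambda and retry; on success, reuse the factor for
# the solve so no factorization is discarded.
function shifted_cholfact(A, b; lambda = 0.0, growth = 10.0, maxtries = 50)
    n = size(A, 1)
    for _ in 1:maxtries
        try
            F = cholfact(A + lambda * speye(n))
            return F \ b, lambda
        catch
            # assume any cholfact failure means "not positive definite"
            lambda = max(lambda, 1e-8) * growth
        end
    end
    error("no positive definite shift found after $maxtries tries")
end
```

Whether the throw/catch cost matters here depends on how often the factorization fails; each catch is paired with a full (failed) factorization attempt, which likely dominates the exception overhead.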




Re: [julia-users] Re: testing positive definiteness via Cholesky?

2016-06-10 Thread Tim Holy
On Friday, June 10, 2016 5:22:03 PM CDT vava...@uwaterloo.ca wrote:
> In the case of the PositiveFactorizations.jl package, it appears 
> that the package is aimed at dense rather than sparse matrices.

No, it's just that sparse matrices haven't been implemented yet. Sparse is 
definitely on the horizon; the advantage is that it would not suffer from the 
"discard the factorization" problem you noted.

For sparse, in principle it seems that one could use the analysis 
routines of CHOLMOD to determine a good permutation; computing the actual 
factorization would then be fairly trivial, once one wraps one's head around 
the communication between CHOLMOD and julia.

This is not in my immediate TODO list, but contributions are of course 
welcome!

Best,
--Tim



Re: [julia-users] Re: Does it make sense to loop over types?

2016-06-10 Thread Erik Schnetter
In your case, there's no need to use a macro. There's also no need to use
`Val`; you can just pass `k` directly.

If you want to loop over functions, then you can put the functions into an
array, and loop over it:

```Julia
function mywork(x, y, z)
  for k in 1:10
procedure(k, x, y, z)
  end
end
```

or

```Julia
function mywork(x, y, z)
  for p in [proc1, proc2, proc3, ...]
p(x, y, z)
  end
end
```

-erik
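For the second variant, a self-contained toy (the `proc*` bodies are made up purely for illustration):

```julia
proc1(x, y, z) = x + y + z
proc2(x, y, z) = x * y * z
proc3(x, y, z) = x - y - z

function mywork(x, y, z)
    results = Int[]
    for p in [proc1, proc2, proc3]   # iterate over the functions themselves
        push!(results, p(x, y, z))
    end
    results
end
```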




-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


Re: [julia-users] Re: Does it make sense to loop over types?

2016-06-10 Thread Po Choi
Thanks for pointing out the mistakes.

I want to loop over the data types because I want to loop over functions.
I need to do something like this:
function mywork(x, y, z)
  procedure1(x, y, z)
  procedure2(x, y, z)
  procedure3(x, y, z)
  #...
  procedure8(x, y, z)
  procedure9(x, y, z)
  procedure10(x, y, z)
end

I feel it may be silly to write out all the procedurek(x, y, z)
Then I am thinking about the following:
function mywork(x, y, z)
  for k in 1:10
procedure(::Val{k}, x, y, z)
  end
end

I have not seen anyone doing this before. So I am not sure I am doing 
something right.
That's why I am asking this question.

I remember people like to use macro to define functions for different types.
In my case, should I use macro? Or am I fine to code as above?




Re: [julia-users] why the numerical result is different (RK45 julia and matlab)?

2016-06-10 Thread Erik Schnetter
When you write `(...)^1/8`, you probably mean `(...)^(1/8)` instead.

-erik
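Concretely, `^` binds tighter than `/`, so the two spellings differ:

```julia
x = 16.0
# x^1/8 parses as (x^1)/8, i.e. divide by 8 after raising to the 1st power
@assert x^1/8 == (x^1)/8 == 2.0
# x^(1/8) is the eighth root, a very different number
@assert x^(1/8) == 16.0^0.125
```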

On Fri, Jun 10, 2016 at 10:29 PM,  wrote:

> this is a test of a differential equation solved with Runge-Kutta 45: f(x,y)=
> (-5*x - y/5)^1/8 + 10
>
>
> 
>
> Why is the numerical result different? I used:
>
> function Rk_JL()
>  f(x,y)= (-5*x - y/5)^1/8 + 10
>  tspan = 0:0.001:n
>  y0 = [0.0, 1.0]
>  return ODE.ode45(f, y0,tspan);end
>
>
> and
>
>
> function [X1,Y1] = RK_M()
>  f = @(x,y) (-5*x - y/5)^1/8 + 10;
>  tspan = 0:0.001:n;
>  y0 = 1
>  [X1,Y1]= ode45(f,tspan,1);end
>
>


-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


Re: [julia-users] Re: Defining a new numeric type with minimal effort

2016-06-10 Thread Jeffrey Sarnoff
Hi Kevin,

Right questions, different way.

Julia's type system is organized around shared behavior:
   abstract types exist for sharing behaviors,
   concrete types for specialization as constructive delegation.

Real is an abstract type that is supertype to some other abstract types
   (Integer, Rational, FloatingPoint, FixedPoint, Probability, ...).

Each immediate subtype of Real is an abstract type that is supertype to
some type[s] -- to abstract types and/or to concrete types.
   Bool <: Integer <: Real
   Union{ Int32, Int64 } <: Signed <: Integer <: Real
   Union{ Rational{Int32}, Rational{Int64} } <: Rational <: Real

   *note that ..currently.. there is not*  Integer <: Rational <: Real;
   as of now, type-based inheritance may associate a concrete type with an
abstraction (and that abstraction may be elaborated as a long chain linking
single supertypes), or may associate a concrete type with the concretion of
concrete constituents given as its fields' types, or else carry some of both
kinds of information, an enfolding of the elaborative and the constitutive.

in the early Summer of 2016:
   Single inheritance of abstract types; an inheriting abstract type
may itself be inherited.
   One single inheritance of an abstract type by a concrete type,
and the same, jointly or independently, for each of its concretely
typed fields.
   Any type, abstract or concrete, can be defined with one or more
parameters (a parameterized type);
each distinct value (or tuple of values) of the parameters constitutes a
uniquely defined type;
there is support for specifying the manner of co-action and
interaction for parameterized-type 'siblings',
as there is for specifying the interworking of types and the
intraworkings of values of a single type.

The architects know how to make abstract types functionally dispatchable,
just as sqrt(x::Int64) and sqrt(x::Float64) are dispatched into
specializations of sqrt() for an Int64 or for a Float64 argument.
The mechanism is being rethought so that a better, more widely useful way
obtains (one that does this and also makes it easy to fully support
software interface protocols -- and enforce API constraints).

Meanwhile "inheriting from Real" does give the type fallback processing for
basic mathematical handling,
and also encourages some reimplementation, if only to delegate the
calculation to the type's value field.
Multiplying two probabilities as reals gives a result that is smaller than
either (or equal to the smaller of the two),
which is not what happens to the probability of a win when more skilled
players join in the effort.

Enjoy,

Jeffrey







There is desire and activity intending

   which is not the same as red marbles are a color of Marbleness
sculpted of a Material inheritance.
 types
   Integers are Real, Rationals are Real  Integers are not Rationals

(you are reading how it is

On Fri, Jun 10, 2016 at 6:15 PM, Kevin Kunzmann 
wrote:

> Hey Jeffrey,
>
> it's been a while, thx for the answer. I see that this would be working.
> However, what about min, max, sin, etc.? I do not want to re-implement all
> elementary functions for the Probability type. There must be some way to
> inherit the behaviour of the abstract supertype "Real". I guess I am
> missing something fundamental about the type system here.
>
> I felt that something like
>
> type Probability{T<:Real} <:Real
> p::T
> end
>
> import Base.convert
>
> convert{T<:Real}(::Type{T}, x::Probability) = convert(T, x.p)
> convert{T1<:Real, T2<:Real}(::Type{Probability{T1}}, x::T2) =
> Probability(convert(T1, x))
>
>
> should do the job as now any Probability can be converted to any concrete
> subtype of Real and "+" should be implemented there ;)
> Very strange, how does Julia handle inheritance at all???
>
> Best,
>
> Kevin
>
>
> On Monday, 29 February 2016 23:45:35 UTC+1, Jeffrey Sarnoff wrote:
>>
>> Kevin,
>>
>> If all that you ask of this type is that it does arithmetic, clamps any
>> negative values to zero, and clamps any values greater than one to one,
>> that is easy enough. Just note that arithmetic with probabilities usually
>> is more subtle than that.
>>
>> import Base: +,-,*,/
>>
>> immutable Probability <: Real
>>  val::Float64
>>
>>  Probability(x::Float64) = new(min(1.0, max(0.0, x)))
>> end
>>
>> (+){T<:Probability}(a::T, b::T) = Probability( a.val + b.val )
>> (-){T<:Probability}(a::T, b::T) = Probability( a.val - b.val )
>> (*){T<:Probability}(a::T, b::T) = Probability( a.val * b.val )
>> (/){T<:Probability}(a::T, b::T) = Probability( a.val / b.val )
>>
>> You need conversion and promotion if you want to mix Float64 values and
>> Probability values: 2.0 * Probability(0.25) == Probability(0.5).
>>
>> On Sunday, February 28, 2016 at 1

Re: [julia-users] Re: Does it make sense to loop over types?

2016-06-10 Thread Erik Schnetter
Your code won't work. `Val{k}` is a type, which is an object in Julia. You
are passing this object to `foo`. Thus in your declaration of `foo`, you
need to list the type of `Val`, not just `Val` -- types have types as well:
```
foo(::Type{Val{1}}) = 1
```

Of course you know that using `Val` in this way is nonsensical in a real
program. I understand that you know this, as you're purposefully
experimenting with Julia, but I'd still like to point it out for the casual
reader of this example.

Whether you encounter "performance issues" or not depends on what
performance you need. If you compare this code to simple arithmetic
operations (adding numbers), then it's slower. If you compare it to sending
data across the network or accessing the disk, then it's faster.

I assume that calling `foo` in the loop requires a hash table lookup at run
time, and likely some memory allocation.
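A runnable sketch of Erik's point, dispatching on the type `Type{Val{k}}` rather than on `Val{k}` itself. Two corrections to the original post are assumed here: the repeated `foo(::Val{2})` was presumably meant to be `Val{3}`, and `Datatype` should be `DataType` (capital T):

```julia
# Dispatch on the *type of* Val{k}: Val{k} is itself an object of type DataType.
foo(::Type{Val{1}}) = 1
foo(::Type{Val{2}}) = 2
foo(::Type{Val{3}}) = 3      # assumed typo fix: the post repeated Val{2}

function bar()
    s = 0
    for t in DataType[Val{k} for k in 1:3]   # note DataType, not Datatype
        s += foo(t)   # dispatch target is looked up at run time here
    end
    return s          # the original bar() also forgot to return s
end

bar()   # 6
```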

-erik


On Fri, Jun 10, 2016 at 9:40 PM, Po Choi  wrote:

> Ops! I accidentally hit the post button! So the post is not completed!
>
> It is an example:
> foo(::Val{1}) = 1
> foo(::Val{2}) = 2
> foo(::Val{2}) = 3
>
> function bar()
>   s = 0
>   for t in Datatype[Val{k} for k in 1:3]
> s += foo(t)
>   end
> end
>
> Will there be any performance issue if I loop over types?
> I am still trying to understand how the multiple-dispatch works. Sometimes
> I am confused!
>
>
> On Friday, June 10, 2016 at 6:37:31 PM UTC-7, Po Choi wrote:
>>
>>
>>
>> foo(::Val{1}) = 1
>> foo(::Val{2}) = 2
>> foo(::Val{2}) = 3
>>
>> function bar()
>>
>> for t in Datatype[Val{k} for k in 1:3]
>>
>>   end
>> end


-- 
Erik Schnetter 
http://www.perimeterinstitute.ca/personal/eschnetter/


[julia-users] why the numerical result is different (RK45 julia and matlab)?

2016-06-10 Thread jmarcellopereira


This is a test of solving a differential equation with Runge-Kutta 45, with
right-hand side f(x,y) = (-5*x - y/5)^1/8 + 10.



Why is the numerical result different? I used:

function Rk_JL()
 f(x,y)= (-5*x - y/5)^1/8 + 10
 tspan = 0:0.001:n
 y0 = [0.0, 1.0]
 return ODE.ode45(f, y0,tspan);end


and


function [X1,Y1] = RK_M()
 f = @(x,y) (-5*x - y/5)^1/8 + 10;
 tspan = 0:0.001:n;
 y0 = 1
 [X1,Y1]= ode45(f,tspan,1);end



[julia-users] Re: Does it make sense to loop over types?

2016-06-10 Thread Po Choi
Ops! I accidentally hit the post button! So the post is not completed!

It is an example:
foo(::Val{1}) = 1
foo(::Val{2}) = 2
foo(::Val{2}) = 3

function bar()
  s = 0
  for t in Datatype[Val{k} for k in 1:3]
s += foo(t)
  end
end

Will there be any performance issue if I loop over types?
I am still trying to understand how the multiple-dispatch works. Sometimes 
I am confused!

On Friday, June 10, 2016 at 6:37:31 PM UTC-7, Po Choi wrote:
>
>
>
> foo(::Val{1}) = 1
> foo(::Val{2}) = 2
> foo(::Val{2}) = 3
>
> function bar()
>   
> for t in Datatype[Val{k} for k in 1:3]
>
>   end
> end
>

[julia-users] PyPlot: LineCollection help needed

2016-06-10 Thread David P. Sanders
Does it work with ax=gca() instead? 

[julia-users] Does it make sense to loop over types?

2016-06-10 Thread Po Choi


foo(::Val{1}) = 1
foo(::Val{2}) = 2
foo(::Val{2}) = 3

function bar()
  
for t in Datatype[Val{k} for k in 1:3]
   
  end
end












[julia-users] Re: testing positive definiteness via Cholesky?

2016-06-10 Thread vavasis
Dear Viral and Tim,

Thanks for your prompt responses to my query.  I should have stated more 
precisely how I am using positive definiteness testing.  The application is 
a classic trust-region method for optimization.  In a trust region method, 
the main operation is as follows.  The input is a symmetric (typically 
sparse) matrix A and a vector b.  The problem is to compute a certain real 
parameter lambda.  One tests if A + lambda*speye(n) is positive definite; 
if so, then one solves a linear system with this coefficient matrix, and if 
not, one increases lambda and tries again.

So the problem with using isposdef is that, in the case that A is actually 
positive definite, isposdef discards the Cholesky factor, so then my 
application would need to compute it again (redundantly) to solve the 
system.  In the case of the PostiveFactorizations.jl package, it appears 
that the package is aimed at dense rather than sparse matrices.

So aside from rewriting cholfact, it seems that the only remaining solution 
is Viral's suggestion to catch an exception from cholfact.  This raises 
another question.  Most C++ textbooks advise against using exceptions for 
ordinary control-flow on the grounds that throwing and catching an 
exception is a time-consuming operation.  How about in Julia?  Is it 
reasonable to use try/throw/catch for ordinary control flow in a scientific 
code?  The Julia manual states that exceptions are much slower than if 
statements.  But on the other hand, isposdef in cholmod.jl is written in 
terms of exceptions!
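A sketch of the try/catch pattern under discussion, in current syntax (`cholesky` and `PosDefException` from LinearAlgebra; `cholfact` in the Julia of this thread), shown on a dense matrix for brevity. The helper name and matrices are illustrative assumptions. The exception cost is paid only when the factorization fails, and on success the factor is kept rather than recomputed:

```julia
using LinearAlgebra

function shifted_cholesky(A, lambda)
    try
        return cholesky(A + lambda * I)   # on success, reuse this factor to solve
    catch err
        err isa PosDefException || rethrow()  # only swallow the expected failure
        return nothing                        # caller increases lambda and retries
    end
end

A = [2.0 1.0; 1.0 -3.0]        # symmetric but indefinite
shifted_cholesky(A, 0.0)       # returns nothing: not positive definite
F = shifted_cholesky(A, 4.0)   # returns a Cholesky factorization, ready for F \ b
```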

Thanks,
Steve

On Thursday, June 9, 2016 at 10:16:29 PM UTC-4, vav...@uwaterloo.ca wrote:
>
> In Matlab to check if a symmetric sparse matrix is positive definite, I 
> can say [R,p]=chol(A) and then if p>0 etc.  Is this functionality available 
> in Julia?  The cholfact standard routine throws an exception if its 
> argument is not positive definite rather than returning any helpful 
> information.  
>
> I looked at the code for cholfact in cholmod.jl in Base; it appears that I 
> can write a modified version of cholfact that exposes this functionality.   
> But it would be better if the functionality were available in the library 
> so that my code is not sensitive to changes in undocumented low-level 
> routines.
>
> Thanks,
> Steve Vavasis
>  
>


[julia-users] Re: Defining a new numeric type with minimal effort

2016-06-10 Thread Kevin Kunzmann
Hey Jeffrey,

it's been a while, thx for the answer. I see that this would be working. 
However, what about min, max, sin, etc.? I do not want to re-implement all 
elementary functions for the Probability type. There must be some way to 
inherit the behaviour of the abstract supertype "Real". I guess I am 
missing something fundamental about the type system here. 

I felt that something like

type Probability{T<:Real} <:Real
p::T
end

import Base.convert

convert{T<:Real}(::Type{T}, x::Probability) = convert(T, x.p)
convert{T1<:Real, T2<:Real}(::Type{Probability{T1}}, x::T2) = 
Probability(convert(T1, x))


should do the job as now any Probability can be converted to any concrete 
subtype of Real and "+" should be implemented there ;)
Very strange, how does Julia handle inheritance at all???

Best, 

Kevin


On Monday, 29 February 2016 23:45:35 UTC+1, Jeffrey Sarnoff wrote:
>
> Kevin,
>
> If all that you ask of this type is that it does arithmetic, clamps any 
> negative values to zero, and clamps any values greater than one to one, 
> that is easy enough. Just note that arithmetic with probabilities usually 
> is more subtle than that.
>
> import Base: +,-,*,/
>
> immutable Probability <: Real
>  val::Float64
>
>  Probability(x::Float64) = new(min(1.0, max(0.0, x)))
> end
>
> (+){T<:Probability}(a::T, b::T) = Probability( a.val + b.val )
> (-){T<:Probability}(a::T, b::T) = Probability( a.val - b.val )
> (*){T<:Probability}(a::T, b::T) = Probability( a.val * b.val )
> (/){T<:Probability}(a::T, b::T) = Probability( a.val / b.val )
>
> You need conversion and promotion if you want to mix Float64 values and 
> Probability values: 2.0 * Probability(0.25) == Probability(0.5).
>
> On Sunday, February 28, 2016 at 10:33:07 AM UTC-5, Kevin Kunzmann wrote:
>>
>> Hey,
>>
>> I have a (probably) very simple question. I would like to define 
>> 'Probability' as a new subtype of 'Real', only with the additional 
>> restriction that the value must be between 0 and 1. How would I achieve 
>> that 'Julia-style'? This should be possible without having to rewrite all 
>> these promotion rules and stuff, is it not? 
>>
>> Best Kevin
>>
>

RE: [julia-users] Re: ArrayFire.jl - GPU Programming in Julia

2016-06-10 Thread David Anthoff
https://github.com/JuliaComputing/ArrayFire.jl/issues/40

 

 

From: julia-users@googlegroups.com [mailto:julia-users@googlegroups.com] On 
Behalf Of Gabriel Goh
Sent: Friday, June 10, 2016 3:00 PM
To: julia-users 
Subject: [julia-users] Re: ArrayFire.jl - GPU Programming in Julia

 

is there windows support? I have a pretty beefy gaming PC


On Thursday, June 9, 2016 at 10:08:42 PM UTC-7, ran...@juliacomputing.com 
  wrote:

Hello, 

 

We are pleased to announce ArrayFire.jl, a library for GPU and heterogeneous 
computing in Julia: (https://github.com/JuliaComputing/ArrayFire.jl). We look 
forward to your feedback and your contributions as well! 

 

For more information, check out Julia Computing's latest blog post: 
http://juliacomputing.com/blog/2016/06/09/julia-gpu.html

 

Thanks,

Ranjan

Julia Computing, Inc. 



[julia-users] Re: ArrayFire.jl - GPU Programming in Julia

2016-06-10 Thread Gabriel Goh
is there windows support? I have a pretty beefy gaming PC

On Thursday, June 9, 2016 at 10:08:42 PM UTC-7, ran...@juliacomputing.com 
wrote:
>
> Hello, 
>
> We are pleased to announce ArrayFire.jl, a library for GPU and 
> heterogeneous computing in Julia: (
> https://github.com/JuliaComputing/ArrayFire.jl). We look forward to your 
> feedback and your contributions as well! 
>
> For more information, check out Julia Computing's latest blog post: 
> http://juliacomputing.com/blog/2016/06/09/julia-gpu.html
>
> Thanks,
> Ranjan
> Julia Computing, Inc. 
>


[julia-users] Re: Pkg.update() fails on local package

2016-06-10 Thread John Best
Thanks for the quick reply. Just after I posted this, I re-read the 
Pkg.init help, and realized that I had tried to initialize a new package 
using it, when it actually switches your package library folder to a new 
location. Running `Pkg.init()` fixed my problems. My dumb mistake for the 
day I suppose!

Thanks!

On Thursday, June 9, 2016 at 11:41:02 PM UTC-8, Andreas Lobinger wrote:
>
> Hello colleague,
>
> i ran (twice) into a problem that looked similar by the symptoms. In both
> cases the git configuration of METADATA were corrupted and in the recent 
> case the remote setting was somehow routed to the Package name. -> Look in 
> .julia/v0.4/METADATA/.git/config. If you see your Package name showing up, 
> correct it (in my case i just copied the correct github URL).
>
> If not, post your git config -l here.
>
>

[julia-users] zero divisor

2016-06-10 Thread digxx
Is there an algorithm to determine a zero divisor of a matrix?


[julia-users] Pivoting when inverting a sparse matrix

2016-06-10 Thread Gabriel Goh
 

The following code doesn't make a lot of sense, is there a reason why the 
backslash doesn't pivot for sparse matrices?


*julia> *A =[1.0 0.0 1.0

   0.0 0.0 1.0

   1.0 1.0 0.0]

3x3 Array{Float64,2}:

 1.0  0.0  1.0

 0.0  0.0  1.0

 1.0  1.0  0.0


*julia>* inv(A)

3x3 Array{Float64,2}:

  1.0  -1.0  -0.0

 -1.0   1.0   1.0

  0.0   1.0   0.0


*julia>* A\ones(3)

3-element Array{Float64,1}:

 0.0

 1.0

 1.0


*julia>* sparse(A)\ones(3)

ERROR: ArgumentError: matrix has one or more zero pivots

 in ldltfact at sparse/cholmod.jl:1246

 in ldltfact at sparse/cholmod.jl:1253

 in factorize at sparse/linalg.jl:849

 in \ at linalg/generic.jl:326
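One hedged workaround (an assumption about intent, not a reply from the thread): the trace shows that for a symmetric sparse matrix `\` routes to `ldltfact`, which is where the zero-pivot error arises. Requesting an LU factorization explicitly brings in pivoting via UMFPACK (`lufact` in the Julia of this thread, `lu` in current versions), assuming the matrix is nonsingular:

```julia
using SparseArrays, LinearAlgebra

A = sparse([1.0 0.0 1.0
            0.0 0.0 1.0
            1.0 1.0 0.0])

x = lu(A) \ ones(3)   # UMFPACK LU pivots; `lufact(A) \ ones(3)` in 0.4-era Julia
# x == [0.0, 1.0, 1.0], matching the dense A \ ones(3) result above
```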



Re: [julia-users] one lvl return kw

2016-06-10 Thread Stefan Karpinski
On Fri, Jun 10, 2016 at 3:52 PM, Ford O.  wrote:

>
>
>> Aside from goto, is there any language that has this?
>>>
>>
> None that I know of, but that doesn't mean it's a bad feature on its own.
> Since every kind of block can return a value in julia, stronger control
> over the returned value could be quite handy...
>

Innovation in control flow is not a goal of Julia.

P.S. Why loops don't return the last line executed value like all other
>>> blocks do?
>>>
>>
>> What value should they evaluate to if they execute zero times?
>>
>
> `nothing`
>

That makes every single loop a type unstable expression.
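A hedged illustration of the type-stability point: if a zero-trip loop evaluated to `nothing`, a loop's value would have type `Union{Nothing, T}`. The value-producing loop forms in Julia are comprehensions (and generators), which handle zero iterations with a typed empty result instead:

```julia
# A comprehension is an expression, unlike a `for` statement:
squares(n) = [i^2 for i in 1:n]

squares(3)   # [1, 4, 9]
squares(0)   # zero trips: an empty Vector{Int}, still concretely typed
```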


Re: [julia-users] ORB code in Julia

2016-06-10 Thread Kevin Squire
Hello Ira,

I suggest copy and pasting the code itself (and the error), instead of
posting them as images.  That makes it much easier for others to see what's
happening, try things themselves, and suggest fixes.

Cheers,
   Kevin

On Fri, Jun 10, 2016 at 8:32 AM, Ira  wrote:

> I am trying to convert this simple program that makes use of OpenCV's ORB
> into Julia code
>
> Any hints as to how to fix the issue?
>
> Many thanks,
> Ira
>


Re: [julia-users] one lvl return kw

2016-06-10 Thread Ford O.


On Friday, June 10, 2016 at 9:41:20 PM UTC+2, Stefan Karpinski wrote:
>
> On Fri, Jun 10, 2016 at 3:26 PM, Ford O. > 
> wrote:
>
>> Why is there no keyword that would allow programmer to do
>>
>> var = begin # any block, including : let, if, function, macro, for, 
>> while 
>>  in = user_input()
>>  if in == something
>>`kw` true # immediately jumps out of this block
>>  #lots of code
>>  false
>> end
>>
>> Note that I wanna jump only one level above, not like break or return 
>> keywords.
>>
>
> Aside from goto, is there any language that has this?
>

None that I know of, but that doesn't mean it's a bad feature on its own.
Since every kind of block can return a value in julia, stronger control
over the returned value could be quite handy...
 

>  
>
>> P.S. Why loops don't return the last line executed value like all other 
>> blocks do?
>>
>
> What value should they evaluate to if they execute zero times?
>

`nothing` 


Re: [julia-users] one lvl return kw

2016-06-10 Thread Stefan Karpinski
On Fri, Jun 10, 2016 at 3:26 PM, Ford O.  wrote:

> Why is there no keyword that would allow programmer to do
>
> var = begin # any block, including : let, if, function, macro, for, while
> 
>  in = user_input()
>  if in == something
>`kw` true # immediately jumps out of this block
>  #lots of code
>  false
> end
>
> Note that I wanna jump only one level above, not like break or return
> keywords.
>

Aside from goto, is there any language that has this?


> P.S. Why loops don't return the last line executed value like all other
> blocks do?
>

What value should they evaluate to if they execute zero times?


Re: [julia-users] one lvl return kw

2016-06-10 Thread Mauro
You're probably looking for @goto ;-)

Although break may be what you want:

julia> for i=1:10
   for j=1:10
   if j==3
   break
   end
   end
   println(i,j)
   end
13
23
33
43
53
63
73
83
93
103


On Fri, 2016-06-10 at 21:26, Ford O.  wrote:
> Why is there no keyword that would allow programmer to do
>
> var = begin # any block, including : let, if, function, macro, for, while 
> in = user_input()
> if in == something
>  `kw` true # immediately jumps out of this block
> #lots of code
> false
> end
>
>
> Note that I wanna jump only one level above, not like break or return 
> keywords.
>
> P.S. Why loops don't return the last line executed value like all other blocks
> do?


[julia-users] one lvl return kw

2016-06-10 Thread Ford O.
Why is there no keyword that would allow programmer to do

var = begin # any block, including : let, if, function, macro, for, while 

 in = user_input()
 if in == something
   `kw` true # immediately jumps out of this block
 #lots of code
 false
end


Note that I wanna jump only one level above, not like break or return 
keywords.

P.S. Why loops don't return the last line executed value like all other 
blocks do?
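One hedged workaround, not proposed in the thread: wrap the block in an immediately invoked anonymous function, so a plain `return` exits exactly one level. The names are stand-ins for the post's hypothetical `user_input()` and `something` (renamed here because `in` and `something` clash with names in Base):

```julia
user_input() = 42    # hypothetical stand-in
expected = 42        # hypothetical stand-in for `something`

var = (() -> begin
    inp = user_input()
    inp == expected && return true   # `return` leaves only this closure
    # lots of code
    false
end)()

var   # true
```

Mauro's `@goto`/`@label` suggestion is the other option when a closure is not wanted.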


[julia-users] Re: Parametric constructor of concrete type accepting all subtypes of an(other) abstract type?

2016-06-10 Thread Adrian Salceanu
Never mind, after a re-re-read of the docs and a few other resources, it 
finally clicked :) 

type SQLFoo{T<:AbstractModel}
  model_name::Type{T}
  bar::Bool


  SQLFoo(model_name, bar) = new(model_name, bar)
end
SQLFoo{T<:AbstractModel}(model_name::Type{T}; bar::Bool = true) = SQLFoo{T}(
model_name, bar)
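The same pattern restated in current syntax as a hedged sketch (`struct` and `where`; `AbstractModel` and `Repo` are declared here only so the example is self-contained). The keyword argument lives on an outer convenience constructor, since inner constructors of that era could not take keywords:

```julia
abstract type AbstractModel end     # `abstract AbstractModel` in 0.4
struct Repo <: AbstractModel end    # hypothetical model type, for illustration

struct SQLFoo{T<:AbstractModel}
    model_name::Type{T}
    bar::Bool
end

# Keyword arguments go on an outer constructor:
SQLFoo(model_name::Type{T}; bar::Bool = true) where {T<:AbstractModel} =
    SQLFoo{T}(model_name, bar)

SQLFoo(Repo)               # SQLFoo{Repo}(Repo, true)
SQLFoo(Repo; bar = false)  # keyword overrides the default
```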


On Friday, 10 June 2016 at 20:21:18 UTC+2, Adrian Salceanu wrote:
>
> Hi,
>
> I don't know how to code this, any help appreciated. 
>
> I want to have a concrete type that has a field referencing any of the 
> subtypes of an(other) abstract type. How would I write that? 
>
> This is one of my many attempts: 
>
> type SQLFoo{T <: AbstractModel}
>   model_name::Type{T}
>
>
>   SQLFoo(model_name) = new(model_name)
> end
>
>
> julia> issubtype(Repo, AbstractModel)
> true
>
> julia> SQLFoo(Repo)
> ERROR: MethodError: `convert` has no method matching convert(::Type{SQLFoo
> {T<:AbstractModel}}, ::Type{Repo})
> This may have arisen from a call to the constructor SQLFoo{T<:
> AbstractModel}(...),
> since type constructors fall back to convert methods.
> Closest candidates are:
>   call{T}(::Type{T}, ::Any)
>   convert{T}(::Type{T}, ::T)
>  in call at essentials.jl:56
>
>
> Thanks,
> -- Adrian
>


[julia-users] Re: Parametric constructor of concrete type accepting all subtypes of an(other) abstract type?

2016-06-10 Thread Adrian Salceanu
A small update - I tried to simplify my example, but I removed too much. 

This version is easy, it's just: 

type SQLFoo{T <: AbstractModel}
  model_name::Type{T}
end

What I'm looking for is having optional args, such as: 

type SQLFoo{T <: AbstractModel}
  model_name::Type{T}
  bar::Bool

  SQLFoo(T; bar = true) = new(model_name, bar)
end

Is this possible? 


On Friday, 10 June 2016 at 20:21:18 UTC+2, Adrian Salceanu wrote:
>
> Hi,
>
> I don't know how to code this, any help appreciated. 
>
> I want to have a concrete type that has a field referencing any of the 
> subtypes of an(other) abstract type. How would I write that? 
>
> This is one of my many attempts: 
>
> type SQLFoo{T <: AbstractModel}
>   model_name::Type{T}
>
>
>   SQLFoo(model_name) = new(model_name)
> end
>
>
> julia> issubtype(Repo, AbstractModel)
> true
>
> julia> SQLFoo(Repo)
> ERROR: MethodError: `convert` has no method matching convert(::Type{SQLFoo
> {T<:AbstractModel}}, ::Type{Repo})
> This may have arisen from a call to the constructor SQLFoo{T<:
> AbstractModel}(...),
> since type constructors fall back to convert methods.
> Closest candidates are:
>   call{T}(::Type{T}, ::Any)
>   convert{T}(::Type{T}, ::T)
>  in call at essentials.jl:56
>
>
> Thanks,
> -- Adrian
>


[julia-users] Parametric constructor of concrete type accepting all subtypes of an(other) abstract type?

2016-06-10 Thread Adrian Salceanu
Hi,

I don't know how to code this, any help appreciated. 

I want to have a concrete type that has a field referencing any of the 
subtypes of an(other) abstract type. How would I write that? 

This is one of my many attempts: 

type SQLFoo{T <: AbstractModel}
  model_name::Type{T}


  SQLFoo(model_name) = new(model_name)
end


julia> issubtype(Repo, AbstractModel)
true

julia> SQLFoo(Repo)
ERROR: MethodError: `convert` has no method matching convert(::Type{SQLFoo{T
<:AbstractModel}}, ::Type{Repo})
This may have arisen from a call to the constructor SQLFoo{T<:AbstractModel
}(...),
since type constructors fall back to convert methods.
Closest candidates are:
  call{T}(::Type{T}, ::Any)
  convert{T}(::Type{T}, ::T)
 in call at essentials.jl:56


Thanks,
-- Adrian


Re: [julia-users] Re: array of arrays to multi-dimensional array

2016-06-10 Thread Lewis Lehe
thanks Islam! I had tried to use vcat and didn't get the result I
wanted, but I didn't think of using the splats.

On Fri, Jun 10, 2016 at 11:01 AM, Islam Badreldin  wrote:

> Hi Lewis
>
> Please see below
>
>
> On Friday, June 10, 2016 at 12:48:57 PM UTC-4, Lewis Lehe wrote:
>
>> Hi I I wondered if there is a neat in Julia way to create a
>> multi-dimensional array from an array of arrays. This would be useful for
>> creating data structures programatically.
>>
>> For example...
>>
>> arr = map(x->[0 1 2 3],1:8)
>>
>> now i have this array of arrays. but what I would really like is to make
>> the entries into the rows of an 8x4 matrix.
>>
>> Is there a simple way to do this?
>>
>
> As Tim pointed out, cat and friends are the answer. For your
> specific examples, this works for me (using ... for splat):
>
> > vcat(arr...)
> 8x4 Array{Int64,2}:
>  0  1  2  3
>  0  1  2  3
>  0  1  2  3
>  0  1  2  3
>  0  1  2  3
>  0  1  2  3
>  0  1  2  3
>  0  1  2  3
>
>
> Cheers,
> Islam
>


[julia-users] Re: array of arrays to multi-dimensional array

2016-06-10 Thread Islam Badreldin
Hi Lewis

Please see below

On Friday, June 10, 2016 at 12:48:57 PM UTC-4, Lewis Lehe wrote:

> Hi I I wondered if there is a neat in Julia way to create a 
> multi-dimensional array from an array of arrays. This would be useful for 
> creating data structures programatically.
>
> For example...
>
> arr = map(x->[0 1 2 3],1:8)
>
> now i have this array of arrays. but what I would really like is to make 
> the entries into the rows of an 8x4 matrix.
>
> Is there a simple way to do this?
>

As Tim pointed out, cat and friends are the answer. For your 
specific examples, this works for me (using ... for splat):

> vcat(arr...)
8x4 Array{Int64,2}:
 0  1  2  3
 0  1  2  3
 0  1  2  3
 0  1  2  3
 0  1  2  3
 0  1  2  3
 0  1  2  3
 0  1  2  3


Cheers,
Islam 
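One hedged addition beyond the splat answer: for a long array of arrays, `reduce(vcat, arr)` builds the same matrix without materializing a huge argument list, which tends to scale better than splatting:

```julia
arr = map(x -> [0 1 2 3], 1:8)   # 8 row vectors (1x4 matrices)

M = reduce(vcat, arr)            # same 8x4 result as vcat(arr...)
size(M)                          # (8, 4)
```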


Re: [julia-users] Differential Equations Package

2016-06-10 Thread Mauro
On Fri, 2016-06-10 at 17:13, Chris Rackauckas  wrote:
...
> Thanks for helping me get more focus! Do you plan on moving to ODE.jl to
> JuliaMath? When I hit the major release, I think we should be considering
> bringing us together under some organization.

Thanks for the good discussion!  I don't know where ODE.jl goes, but it
may well join JuliaMath.  I think there was/is a push to lighten the
JuliaLang org because of CI time constraints.


Re: [julia-users] Re: Status of FEM packages

2016-06-10 Thread Mauro
Thanks for the update, looks good!

On Thu, 2016-06-09 at 23:18, Jukka Aho  wrote:
> Hi,
>
> JuliaFEM developer here.
>
> About performance. I recently measured FEM problem assembly time vs. solver
> time to find out the performance of JuliaFEM. Point was to find out when
> assembly time is less than time used to solve Ax=b which is done in FORTRAN
> and nothing can be done for that by me. I solved a displacement for 3d
> piston model[1] with different meshes up to size 282035 nodes,
> approximately 1 million dofs. Here's the results:
>
>   nnodes     nels  cholmod    total    aster
>     4801     4503     0.37     2.00     1.83
>     9602     8801     0.75     3.85     3.08
>    21752    19408     2.02     9.30     8.67
>    29904    26237     2.79    13.09    12.03
>    53007    45060     6.23    25.69    28.26
>    65622    55429     8.29    32.80    42.15
>    74547    62876     8.81    36.69    48.92
>    89630    74777    13.50    50.49    71.83
>   110515    91068    21.01    75.01   107.85
>   145521   118138    41.58   125.31   191.23
>   178261   143849    56.19   166.12   283.90
>   212365   170801    76.75   206.91   481.30
>   244032   196436   112.12   270.06   557.24
>   282035   227333   150.56   332.14   811.62
>
> Assembly isn't optimized anyway but even with this performance it can be
> seen that solving Ax=b takes about same amount of time than assembly when
> there's about 1 million dofs in model. I hope that bigger models can be
> solved effectively using PETSc.jl in future. For now I use simply x = A \ b
> but it gets impractical with bigger models because of it's memory usage.
> These models are solved using my 5 years old PC so it's already possible to
> solve 1M dof problems with JuliaFEM in a reasonable time.
>
> Currently it's possible to solve heat/Poisson problems and linear/nonlinear
> elasticity problems in 2d and 3d. I've started to write blog posts about
> developing and using JuliaFEM under category juliafem[2], so the current
> status of the project can be followed from there and also from project web
> site[7], but it's quite unfinished at the moment. I have not yet written
> about developing own "problems" (expect that to be next post about
> developing JuliaFEM) but it has been made very simple. A standard poisson
> assembly procedure is only 30 lines of code and it's quite self
> explaining[3]. An example how to build FEM models from scratch can be found
> from tests[4]. I have written a script to read mesh from Code Aster med
> file to construct bigger and more realistic models using SALOME. A better
> example of solving Poisson problem in 3d can be found from [5]. Results can
> be saved to Xdmf file format and visualized using Paraview [6].
>
> Br,
> Jukka
>
> [1] http://ahojukka5.github.io/images/piston_displacement_results.png
> [2] http://ahojukka5.github.io/categories/juliafem/
> [3] https://github.com/JuliaFEM/JuliaFEM.jl/blob/master/src/heat.jl
> [4] https://github.com/JuliaFEM/JuliaFEM.jl/blob/master/test/test_heat.jl
> [5] http://ahojukka5.github.io/files/juliafem_examples/heat.jl
> [6] http://ahojukka5.github.io/images/juliafem_logo_heat_results.png
> [7] http://juliafem.org/
>
> On Wednesday, 8 June 2016 at 11:11:04 UTC+3, CrocoDuck O'Ducks wrote:
>>
>> Hi There!
>>
>> I am involved into some multiphysics FEM problems I solve with ElmerFEM
>> . Seems to me that everything being
>> maintained is pretty much gravitating around JuliaFEM
>>  and ElipticFEM
>> . Any of you guys doing
>> some FEM with Julia? If so, what do you use?
>>


Re: [julia-users] array of arrays to multi-dimensional array

2016-06-10 Thread Tim Holy
cat, hcat, vcat, and hvcat are your friends here.

--Tim

On Friday, June 10, 2016 9:48:57 AM CDT Lewis Lehe wrote:
> Hi I I wondered if there is a neat in Julia way to create a
> multi-dimensional array from an array of arrays. This would be useful for
> creating data structures programatically.
> 
> For example...
> 
> arr = map(x->[0 1 2 3],1:8)
> 
> now i have this array of arrays. but what I would really like is to make
> the entries into the rows of an 8x4 matrix.
> 
> Is there a simple way to do this?




[julia-users] array of arrays to multi-dimensional array

2016-06-10 Thread Lewis Lehe
Hi, I wondered if there is a neat way in Julia to create a 
multi-dimensional array from an array of arrays. This would be useful for 
creating data structures programmatically.

For example...

arr = map(x->[0 1 2 3],1:8)

now i have this array of arrays. but what I would really like is to make 
the entries into the rows of an 8x4 matrix.

Is there a simple way to do this?


Re: [julia-users] Re: ArrayFire.jl - GPU Programming in Julia

2016-06-10 Thread Keno Fischer
Some computers have more than one graphics card and the more powerful one
needs to be activated manually. I know that's the case on my macbook.

On Fri, Jun 10, 2016 at 10:20 AM, Robert Feldt 
wrote:

> Great thanks. I suggest a comment about this or changing the example in
> the README so it's more clear to people like me... ;)
>
> /Robert
>


[julia-users] ORB code in Julia

2016-06-10 Thread Ira
I am trying to convert this simple program that makes use of OpenCV's ORB 
into Julia code

Any hints as to how to fix the issue?

Many thanks,
Ira


Re: [julia-users] Differential Equations Package

2016-06-10 Thread Chris Rackauckas

>
> I agree, keyword-args are a maintenance headache (and also potentially 
> bad for performance).  That was indeed one of the reason to make 
> Parameters.jl to generate the keyword constructors for the types 
> automatically.  Then use the types instead of keyword functions. 
>
> Concerning the implementation I was not very clear: I don't mean to 
> argue that you should not use any fancy type, etc., in your solvers 
> (maybe call them integrators?) but that you should also provide a dumb 
> interface for them (eventually) so they can be used outside of 
> DifferentialEquations.  (And yes, the @def does feel like unnecessary 
> metaprogramming). 
>

Yeah, in its current use it's unnecessary. I am doing a bit of a turn 
around on it now that I am implementing external integrators (nice word 
choice. That's what I've been doing in research papers, I'll carry that 
over to the package). Indeed, for adding external integrators to be the 
same as adding new solvers internally, calling internal integrators should 
be calling a function. That is pretty much in line with how I am using 
integrators which just make a few matrices and call IterativeSolvers.jl or 
NLSolve.jl (which are some of the FEM methods), and so it will also 
generalize to using ODE.jl, ODEInterface.jl, PETSc.jl, etc. Also, all of my 
GPU / Xeon Phi integrators for SDEs simply call C/Cuda functions (currently 
not released), and so those will be easier to make into function calls. So 
while @def was really quick and easy for building geometric multigrid 
solvers (for an example, see /src/fdm/stokesSolvers.jl), in the end I took 
the idea too far. It's looking like most of it should be all switched to 
function calls.
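For readers unfamiliar with it, the `@def` idiom mentioned above can be sketched roughly like this (a common Julia snippet, assumed rather than copied from DifferentialEquations.jl): it generates a macro that pastes a fixed block of code at each call site, e.g. a shared loop header:

```julia
macro def(name, definition)
    # Define a new macro `name` that splices `definition` in, unhygienically,
    # wherever it is invoked.
    quote
        macro $(esc(name))()
            esc($(Expr(:quote, definition)))
        end
    end
end

@def iter_header begin
    iter += 1        # shared loop-header bookkeeping, pasted verbatim
end

function count_to(n)
    iter = 0
    while iter < n
        @iter_header # expands to `iter += 1` in this function's scope
    end
    iter
end

count_to(5)   # 5
```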

One thing that will be nice though is to use @def for loop header and 
footers. For example, there are many different rejection sampling adaptive 
algorithms which do the same routine for determining rejections and 
adjusting step size, and at the top of every loop I update the iteration 
(and hit the RNG for SDE methods), so I think the most maintainable way to 
write the integrators will be to make them their own function which is just 
a loop, which does:

while t ...

> Yes, those and Sundials are what I have in mind as methods the user can
> > choose. I don't really know how to handle the dependency issue though: 
> do 
> > you require these packages? Right now I require Plots and PyPlot. Is 
> that 
> > necessary? Is requiring NLSolve necessary when it's only used for the 
> > implicit methods? ForwardDiff is only used in the Rosenbrock methods 
> > (currently), should it be a dependency? Or should I detail a list of 
> what 
> > methods need what dependencies? I haven't settled on this, and may open 
> an 
> > issue. 
>
> The issue is https://github.com/JuliaLang/julia/issues/6195 
>
> There is https://github.com/MikeInnes/Requires.jl (but it does not work 
> 100%). 
>
> And here is how Plots.jl does it: 
>
> https://github.com/tbreloff/Plots.jl/blob/cf4d78c87c773453945f181cce2f1fe495c94798/src/backends/pyplot.jl#L59


Tom Breloff's recent change to Plots will make it so that I only
require RecipesBase.jl, and then have it in documentation that to use the 
plotting functionality, you have to install Plots.jl and a backend 
(preferably PyPlot). So I think the first thing in the documentation will 
be a dependency chart/explanation, i.e. to use ___ methods, you 
need __. A nice little graphic will solve it.



I think I will tag the a new minor version to make the latest tagged 
version compatible with the new Plots and have all the latest integrators. 
I will cram to make the internal change to functions, add in calls to the 
other libraries (at least for ODEs), and change the dependency setup before 
the major release. With that I will have a blog post on the vision, and 
make it easier for others to start contributing. That will happen when my 
paper on the SDE methods is published (resubmitting by the end of the week, 
so at least by the end of summer, and I hope by midsummer) which will allow 
me to pull in a private branch where a lot of my development has been. When 
I get there, we should talk about whether some of the parts should be 
broken out to their own packages and stitched together via 
DifferentialEquations.jl (that would definitely make the tests quicker!)

Thanks for helping me get more focus! Do you plan on moving ODE.jl to 
JuliaMath? When I hit the major release, I think we should consider 
bringing us together under some organization. 


[julia-users] PyPlot: LineCollection help needed

2016-06-10 Thread NotSoRecentConvert
I'm trying to do a plot using LineCollection in PyPlot, but I am having a 
hard time converting an example (1 or 2).

lines = Any[Any[[1.0 2.0]];Any[[3.0 4.0]];Any[[5.0 .06]];Any[[0.0 0.0]]] # 
Points
c = Any[Any[1 0 0];Any[0 1 0];Any[0 0 1]] # Color

line_segments = matplotlib[:collections][:LineCollection](lines,colors=c)

fig = figure("Line Collection Example")
ax = axes()
ax[:add_collection](line_segments)
axis("tight")

It doesn't return any errors, but I don't see any lines. Any idea what could 
be wrong?
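
A sketch of a form matplotlib does accept, under two assumptions: each line must be a sequence of `(x, y)` pairs (the snippet above passes 1x2 matrices, which LineCollection cannot interpret as segments), and `add_collection` does not rescale the axes by itself, so the lines can be drawn outside the visible region.

```julia
using PyPlot

# Each element of `lines` is one polyline: a vector of (x, y) tuples.
lines = Any[[(0.0, 0.0), (1.0, 2.0)],
            [(0.0, 1.0), (3.0, 4.0)],
            [(0.0, 2.0), (5.0, 0.06)]]
c = Any[(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]  # RGB per line

line_segments = matplotlib[:collections][:LineCollection](lines, colors=c)
fig = figure("Line Collection Example")
ax = gca()
ax[:add_collection](line_segments)
ax[:autoscale]()   # add_collection does not update the data limits itself
```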


Re: [julia-users] Re: ArrayFire.jl - GPU Programming in Julia

2016-06-10 Thread Robert Feldt
Great thanks. I suggest a comment about this or changing the example in the 
README so it's more clear to people like me... ;)

/Robert


Re: [julia-users] Re: ArrayFire.jl - GPU Programming in Julia

2016-06-10 Thread Mauro


On Fri, 2016-06-10 at 16:14, Robert Feldt  wrote:
> Thanks, this looks really nice.
>
> I seem to be able to install it ok but when trying to run on a fairly
> recent MacBook Pro (Retina, 13-inch, Early 2015) I get an error when trying
> this on julia Version 0.4.5 (2016-03-18 00:58 UTC)
> x86_64-apple-darwin15.4.0:
>
> julia> using ArrayFire
>
> julia> #Random number generation
>a = rand(AFArray{Float64}, 100, 100)

a = rand(AFArray{Float32}, 100, 100)

Most graphic cards only support 32 bit floats.

> ERROR: "ArrayFire Error (401) : Double precision not supported for this
> device"
>  in af_randu at /Users/feldt/.julia/v0.4/ArrayFire/src/wrap.jl:1033
>  in rand at /Users/feldt/.julia/v0.4/ArrayFire/src/create.jl:7
>
> Maybe update the docs to make the requirements and limitations more clear?
> Or is there a way around this?
>
> My computer/version details:
>
> Processor: 3,1 GHz Intel Core i7
>
> Memory: 16 GB 1867 MHz DDR3
>
> Graphics: Intel Iris Graphics 6100 1536 MB
> OS: El Capitan, 10.11.3 (15D21)
>
> Thanks,
>
> Robert Feldt
>
> Den fredag 10 juni 2016 kl. 12:19:16 UTC+2 skrev Mauro:
>>
>> You need to install the arrayfire library by hand:
>> https://github.com/JuliaComputing/ArrayFire.jl#installation
>>
>> If done already, check the trouble shooting section.
>>
>> On Fri, 2016-06-10 at 12:10, Fred >
>> wrote:
>> > Hi !
>> >
>> > Thank you for this great package ! I tried to install it on Julia 0.4.5
>> but I
>> > obtained :
>> >
>> > julia
>> >_
>> >  __ _(_)_   | A fresh approach to technical computing
>> >  (_)   | (_) (_)  | Documentation: http://docs.julialang.org
>> >  _ _  _| |_ __ _  | Type "?help" for help.
>> >  | | | | | | |/ _` | |
>> >  | | |_| | | | (_| | | Version 0.4.5 (2016-03-18 00:58 UTC)
>> > _/ |\__'_|_|_|\__'_| | Official http://julialang.org release
>> > |__/  | x86_64-linux-gnu
>> >
>> >
>> > julia> using ArrayFire
>> > ERROR: LoadError: LoadError: could not load library "libaf"
>> > libaf: Ne peut ouvrir le fichier d'objet partagé: Aucun fichier ou
>> dossier de
>> > ce type
>> > in dlopen at ./libdl.jl:36
>> > in dlopen at libdl.jl:36
>> > in include at ./boot.jl:261
>> > in include_from_node1 at ./loading.jl:320
>> > in include at ./boot.jl:261
>> > in include_from_node1 at ./loading.jl:320
>> > in require at ./loading.jl:259
>> > while loading /home/fred/.julia/v0.4/ArrayFire/src/config.jl, in
>> expression
>> > starting on line 6
>> > while loading /home/fred/.julia/v0.4/ArrayFire/src/ArrayFire.jl, in
>> expression
>> > starting on line 5
>>


Re: [julia-users] Re: ArrayFire.jl - GPU Programming in Julia

2016-06-10 Thread Robert Feldt
Thanks, this looks really nice. 

I seem to be able to install it ok but when trying to run on a fairly 
recent MacBook Pro (Retina, 13-inch, Early 2015) I get an error when trying 
this on julia Version 0.4.5 (2016-03-18 00:58 UTC) 
x86_64-apple-darwin15.4.0:

julia> using ArrayFire

julia> #Random number generation
   a = rand(AFArray{Float64}, 100, 100)
ERROR: "ArrayFire Error (401) : Double precision not supported for this 
device"
 in af_randu at /Users/feldt/.julia/v0.4/ArrayFire/src/wrap.jl:1033
 in rand at /Users/feldt/.julia/v0.4/ArrayFire/src/create.jl:7

Maybe update the docs to make the requirements and limitations more clear? 
Or is there a way around this?

My computer/version details:

Processor: 3,1 GHz Intel Core i7

Memory: 16 GB 1867 MHz DDR3

Graphics: Intel Iris Graphics 6100 1536 MB
OS: El Capitan, 10.11.3 (15D21)

Thanks,

Robert Feldt

Den fredag 10 juni 2016 kl. 12:19:16 UTC+2 skrev Mauro:
>
> You need to install the arrayfire library by hand: 
> https://github.com/JuliaComputing/ArrayFire.jl#installation 
>
> If done already, check the trouble shooting section. 
>
> On Fri, 2016-06-10 at 12:10, Fred > 
> wrote: 
> > Hi ! 
> > 
> > Thank you for this great package ! I tried to install it on Julia 0.4.5 
> but I 
> > obtained : 
> > 
> > julia 
> >_ 
> >  __ _(_)_   | A fresh approach to technical computing 
> >  (_)   | (_) (_)  | Documentation: http://docs.julialang.org 
> >  _ _  _| |_ __ _  | Type "?help" for help. 
> >  | | | | | | |/ _` | | 
> >  | | |_| | | | (_| | | Version 0.4.5 (2016-03-18 00:58 UTC) 
> > _/ |\__'_|_|_|\__'_| | Official http://julialang.org release 
> > |__/  | x86_64-linux-gnu 
> > 
> > 
> > julia> using ArrayFire 
> > ERROR: LoadError: LoadError: could not load library "libaf" 
> > libaf: Ne peut ouvrir le fichier d'objet partagé: Aucun fichier ou 
> dossier de 
> > ce type 
> > in dlopen at ./libdl.jl:36 
> > in dlopen at libdl.jl:36 
> > in include at ./boot.jl:261 
> > in include_from_node1 at ./loading.jl:320 
> > in include at ./boot.jl:261 
> > in include_from_node1 at ./loading.jl:320 
> > in require at ./loading.jl:259 
> > while loading /home/fred/.julia/v0.4/ArrayFire/src/config.jl, in 
> expression 
> > starting on line 6 
> > while loading /home/fred/.julia/v0.4/ArrayFire/src/ArrayFire.jl, in 
> expression 
> > starting on line 5 
>


[julia-users] Re: [ANN]: ScikitLearn.jl 0.1.0 released, now with Julia ecosystem support

2016-06-10 Thread Viral Shah
I've been following the progress and this is really beginning to look 
pretty awesome!

-viral

On Friday, June 10, 2016 at 7:29:39 AM UTC-4, Cedric St-Jean wrote:
>
> I'm pleased to announce the first major release of ScikitLearn.jl! It now 
> works with the following Julia models:
>
> *DecisionTree.jl*
>  - DecisionTreeClassifier
>  - DecisionTreeRegressor
>  - RandomForestClassifier
>  - RandomForestRegressor
>  - AdaBoostStumpClassifier
>
> *LowRankModels.jl*
>  - SkGLRM: Generalized Low Rank Models
>  - PCA: Principal Component Analysis
>  - QPCA: Quadratically Regularized PCA
>  - RPCA: Robust PCA
>  - NNMF: Non-negative matrix factorization
>  - K-Means
>
> *GaussianProcesses.jl* 
>
> *GaussianMixtures.jl* 
>
> Full list here. Special thanks to *@BenSadeghi*, *@davidavdav*, 
> *@fairbrot* and *@madeleineudell* for their help and support.
>
> DataFrames are now accepted as inputs, through DataFrameMapper. And finally, 
> the Python version of scikit-learn has been made an optional dependency, 
> meaning that it's now possible to run a cross-validated grid-search of an 
> NNMF and DecisionTreeClassifier pipeline, in 100% Julia. Though, of course, 
> all of the 150 Python models remain accessible.
>
> If you're a package maintainer and would like to support the scikit-learn 
> interface, it's easy, and I'm glad to help; just get in touch if you have 
> any questions.
>
> Cédric
>


Re: [julia-users] How to convert pgm graphics file to array 0-255

2016-06-10 Thread Tim Holy
I assume you're using Images.jl? Try `raw`. But why do you want to do this? 
FixedPointNumbers fixes one of the more annoying aspects of working with 
images, having "white" mean 255 when you're working with an image represented 
as UInt8 but having "white" mean 1 when you're working with floating-point. In 
a "real" language like Julia :-), there's no need for that separation.
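
Tim's suggestion could look roughly like this. A sketch only, assuming the 0.4-era Images.jl API; the reader function name (`imread` vs. `load`) has changed across versions.

```julia
using Images

img = imread("picture.pgm")   # grayscale image, stored as fixed-point values
A = raw(img)                  # the underlying UInt8 array with values 0-255
```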

--Tim

On Thursday, June 9, 2016 8:25:31 AM CDT programista...@gmail.com wrote:
> How to convert pgm graphics file to array 0-255 ?
> Paul




Re: [julia-users] Re: testing positive definiteness via Cholesky?

2016-06-10 Thread Tim Holy
See also https://github.com/timholy/PositiveFactorizations.jl

--Tim

On Thursday, June 9, 2016 7:56:41 PM CDT Viral Shah wrote:
> isposdef() or catch the exception from cholfact.
> 
> -viral
> 
> On Thursday, June 9, 2016 at 10:16:29 PM UTC-4, vava...@uwaterloo.ca wrote:
> > In Matlab to check if a symmetric sparse matrix is positive definite, I
> > can say [R,p]=chol(A) and then if p>0 etc.  Is this functionality
> > available
> > in Julia?  The cholfact standard routine throws an exception if its
> > argument is not positive definite rather than returning any helpful
> > information.
> > 
> > I looked at the code for cholfact in cholmod.jl in Base; it appears that I
> > can write a modified version of cholfact that exposes this functionality.
> > But it would be better if the functionality were available in the library
> > so that my code is not sensitive to changes in undocumented low-level
> > routines.
> > 
> > Thanks,
> > Steve Vavasis




Re: [julia-users] Re: Using pmap in julia

2016-06-10 Thread Tim Holy
I haven't read this thread in detail, but are the answers from your 
calculation expressable as "bitstypes" (e.g., Float64, etc) and/or arrays of 
bitstypes? If so, perhaps you could make your function deposit its results in 
a SharedArray, and then return `nothing`?
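
A minimal sketch of that idea, reusing the loop structure from the earlier message. The `my_fun` body here is a placeholder, and the SharedArray constructor is 0.4-style; workers on the same machine write into shared memory, so pmap only ships back `nothing`.

```julia
addprocs(4)
@everywhere my_fun(i, index) = fill(Float64(i + index), 100)  # placeholder

results = SharedArray(Float64, 100, 4)   # allocated once, outside the loop
total = zeros(Float64, 100)
for i = 1:1000
    pmap(index -> (results[:, index] = my_fun(i, index); nothing), 1:4)
    total += vec(sum(results, 2))        # sum across the 4 parallel runs
end
```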

Best,
--Tim

On Wednesday, June 8, 2016 8:17:47 PM CDT Martha White wrote:
> I think I realize now why pmap is so slow. It is because on each call, it
> is copying over a lot of information to send to each worker. Because I have
> many nested outer loops, this ends up calling pmap thousands of times. I
> would like to create and pass variables to pmap, that are not re-copied by
> pmap. As an abstract example, imagine that I have 4 cores, but want to make
> 4000 calls to a function called my_fun.
> 
> total = zeros(Float64, 100)
> for i = 1:1000
># pmap_answers is dimension 4 x 100, where my_fun returns a vector of
> information of length 100
>pmap_answers = pmap(index -> my_fun(i, index), 1:4)
># Sum across 4 parallel runs
>total += sum(pmap_answers,2)
> end
> 
> This allocates memory for pmap_answers 1000 times. But, really, I could
> allocate this memory once outside of the loop and allow pmap to re-use that
> memory. I could pass in an array of size numCores x 100 to pmap. However, I
> know that pmap currently re-copies all variables that are passed to it. Is
> there a way to stop pmap from re-allocating memory, and instead just use
> pre-allocated memory? Or, any parallel functionality that allows this?
> 
> On Thursday, June 2, 2016 at 10:37:15 AM UTC-4, Martha White wrote:
> > I was printing information from each worker, and seeing the worker number
> > increase. But, when I actually check nworkers, the number stays at 3. So,
> > I
> > was incorrect about the number of workers increasing. Rather, because I am
> > adding and removing workers in the outer loop, the worker id is
> > increasing.
> > However, I do still have issues with speed, where it is slower to use pmap
> > and run in parallel. I am not currently seeing the open files issues, but
> > am running again to see if I can recreate that problem. In any case, for
> > speed, it might be that too much memory is being copied to pass to each
> > worker. Is there a way to restrict what is copied? For example, some
> > values
> > are const; can I somehow give this information to pmap?
> > 
> > On Wednesday, June 1, 2016 at 11:05:46 PM UTC-4, Greg Plowman wrote:
> >> You say you get a large number of workers.
> >> Without delving too deep, this seems pretty weird, regardless of other
> >> code.
> >> Have you checked the number of workers (using nworkers()) after call to
> >> addprocs()?
> >> If you are getting errors and re-run the script, is addprocs() just
> >> accumulating more workers?
> >> If so, perhaps try rmprocs(workers()) before addprocs()




[julia-users] [ANN]: ScikitLearn.jl 0.1.0 released, now with Julia ecosystem support

2016-06-10 Thread Cedric St-Jean
I'm pleased to announce the first major release of ScikitLearn.jl! It now 
works with the following Julia models:

*DecisionTree.jl*
 - DecisionTreeClassifier
 - DecisionTreeRegressor
 - RandomForestClassifier
 - RandomForestRegressor
 - AdaBoostStumpClassifier

*LowRankModels.jl*
 - SkGLRM: Generalized Low Rank Models
 - PCA: Principal Component Analysis
 - QPCA: Quadratically Regularized PCA
 - RPCA: Robust PCA
 - NNMF: Non-negative matrix factorization
 - K-Means

*GaussianProcesses.jl* 

*GaussianMixtures.jl* 

Full list here. Special thanks to *@BenSadeghi*, *@davidavdav*, *@fairbrot* 
and *@madeleineudell* for their help and support.

DataFrames are now accepted as inputs, through DataFrameMapper. And finally, 
the Python version of scikit-learn has been made an optional dependency, 
meaning that it's now possible to run a cross-validated grid-search of an 
NNMF and DecisionTreeClassifier pipeline, in 100% Julia. Though, of course, 
all of the 150 Python models remain accessible.
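
A hedged sketch of what such a pure-Julia workflow might look like, assuming the package's documented `fit!`/`predict` verbs and `cross_val_score` helper; exact constructor arguments may differ.

```julia
using ScikitLearn
using ScikitLearn.CrossValidation: cross_val_score
using DecisionTree: DecisionTreeClassifier

X = randn(100, 4)        # toy feature matrix
y = rand(0:1, 100)       # toy labels

model = DecisionTreeClassifier()
fit!(model, X, y)        # train in 100% Julia
yhat = predict(model, X)

# Cross-validation also runs entirely in Julia, no Python required.
scores = cross_val_score(DecisionTreeClassifier(), X, y; cv=5)
```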

If you're a package maintainer and would like to support the scikit-learn 
interface, it's easy, and I'm glad to help; just get in touch if you have 
any questions.

Cédric


[julia-users] inner constructor returning an object of a different type

2016-06-10 Thread Tamas Papp
Hi,

I have seen the following pattern in some library code: a type with an
inner constructor which returns a value which is not of that type. A
contrived example is

type Foo
Foo() = :something_else
end

Apparently this is OK (in 0.4.5 at least where I tried). Is this an
oversight? Or if this has an application, could someone please share an
example? (It made little sense in the context I saw it).

Best,

Tamas
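
One legitimate application of this behavior is interning/caching, where the inner constructor returns a previously created instance instead of building a new one. A hedged sketch in 0.4 syntax (not from the code referenced above):

```julia
# The inner constructor consults a cache and may return an existing
# instance; Julia does not check that the returned value came from `new`.
type Interned
    val::Int
    function Interned(v::Int)
        haskey(INTERNED_CACHE, v) && return INTERNED_CACHE[v]
        obj = new(v)
        INTERNED_CACHE[v] = obj
        return obj
    end
end
const INTERNED_CACHE = Dict{Int,Interned}()

a = Interned(1)
b = Interned(1)
a === b   # true: both names refer to the same cached object
```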


Re: [julia-users] Re: ArrayFire.jl - GPU Programming in Julia

2016-06-10 Thread Mauro
You need to install the arrayfire library by hand:
https://github.com/JuliaComputing/ArrayFire.jl#installation

If done already, check the trouble shooting section.

On Fri, 2016-06-10 at 12:10, Fred  wrote:
> Hi !
>
> Thank you for this great package ! I tried to install it on Julia 0.4.5 but I
> obtained :
>
> julia
>_
>  __ _(_)_   | A fresh approach to technical computing
>  (_)   | (_) (_)  | Documentation: http://docs.julialang.org
>  _ _  _| |_ __ _  | Type "?help" for help.
>  | | | | | | |/ _` | |
>  | | |_| | | | (_| | | Version 0.4.5 (2016-03-18 00:58 UTC)
> _/ |\__'_|_|_|\__'_| | Official http://julialang.org release
> |__/  | x86_64-linux-gnu
>
>
> julia> using ArrayFire
> ERROR: LoadError: LoadError: could not load library "libaf"
> libaf: Ne peut ouvrir le fichier d'objet partagé: Aucun fichier ou dossier de
> ce type
> in dlopen at ./libdl.jl:36
> in dlopen at libdl.jl:36
> in include at ./boot.jl:261
> in include_from_node1 at ./loading.jl:320
> in include at ./boot.jl:261
> in include_from_node1 at ./loading.jl:320
> in require at ./loading.jl:259
> while loading /home/fred/.julia/v0.4/ArrayFire/src/config.jl, in expression
> starting on line 6
> while loading /home/fred/.julia/v0.4/ArrayFire/src/ArrayFire.jl, in expression
> starting on line 5


[julia-users] Re: ArrayFire.jl - GPU Programming in Julia

2016-06-10 Thread Fred
Hi !

Thank you for this great package ! I tried to install it on Julia 0.4.5 but 
I obtained :

julia 
   _
   _   _ _(_)_ |  A fresh approach to technical computing
  (_) | (_) (_)|  Documentation: http://docs.julialang.org
   _ _   _| |_  __ _   |  Type "?help" for help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 0.4.5 (2016-03-18 00:58 UTC)
 _/ |\__'_|_|_|\__'_|  |  Official http://julialang.org release
|__/   |  x86_64-linux-gnu


julia> using ArrayFire
ERROR: LoadError: LoadError: could not load library "libaf"
libaf: Ne peut ouvrir le fichier d'objet partagé: Aucun fichier ou dossier 
de ce type
 in dlopen at ./libdl.jl:36
 in dlopen at libdl.jl:36
 in include at ./boot.jl:261
 in include_from_node1 at ./loading.jl:320
 in include at ./boot.jl:261
 in include_from_node1 at ./loading.jl:320
 in require at ./loading.jl:259
while loading /home/fred/.julia/v0.4/ArrayFire/src/config.jl, in expression 
starting on line 6
while loading /home/fred/.julia/v0.4/ArrayFire/src/ArrayFire.jl, in 
expression starting on line 5





Re: [julia-users] Differential Equations Package

2016-06-10 Thread Mauro
On Thu, 2016-06-09 at 18:30, Chris Rackauckas  wrote:
> Hey,
>
> thanks for elaborating (maybe you should copy your message into some
>> vision.md file in your repo).  Your plan sounds cool, good luck with it!
>> I would definitely use a plug and play ODE/PDE solver infrastructure.
>
>
>   I will write a cleaner version as a blog post with some examples to walk
> through the whole idea once I have some more "polish" done.

(PS: thanks for your great blog!)

>>
>>
> I agree that at the fast pace you're moving at the moment it is best to
>> just have your own package(s).  However, if you code up more ODE/DAE
>> solvers, it would be great if those are usable without all the machinery
>> of DifferentialEquations.jl and if they eventually make into ODE.jl (or
>> some other "standard" ODE package).
>>
>>
> They definitely could. In fact, you can port some of the RK methods just by
> implementing some of the tableaus from
> here: 
> https://github.com/ChrisRackauckas/DifferentialEquations.jl/blob/master/src/ode/ODECoefficientTypes.jl.
> I just checked and am surprised you guys don't have a DP78 method, but it
> shouldn't take more than minutes to plop that tableau over there.

Yes, we only had the Fehlberg 78.  Not anymore though
https://github.com/JuliaLang/ODE.jl/pull/101, thanks!

>> A few more comments in-line below.  Cheers! Mauro
>>
>> On Wed, 2016-06-08 at 20:40, Chris Rackauckas > > wrote:
>> > Thanks for chiming in! There's a lot to say about this topic, and I
>> think it's
>> > great! I felt like Julia's differential equation solvers were behind
>> where they
>> > should be, which was crazy to me since there is so many state-of-the-art
>> > mathematical packages in other areas (optimization, stats, etc.)
>>
>> I think we are missing someone who actually does research in this area.
>> As far as I can tell, all of us ODE.jl-ers are just "users", but not
>> researching new methods.
>>
>> I think one of the key features here is that whilst JuMP is all fancy,
>> its components are pretty down to earth.  If you just want to optimize a
>> function without using JuMP's DSL then just use, e.g., NLopt.jl.
>> Maybe something to keep in mind when designing DifferentialEquations.
>>
>
> I have thought about keeping the solvers as functions which could just be
> called "naturally" without making a type, but the main issue is that
> plugging into some function by passing all of the parts is nice for only
> for the simplest types of problems (ODEs, Poisson equation, etc., but even
> this falls apart if you start talking about Taylor methods etc.). I
> actually started out with the FEM solvers having a solver with a huge
> definition, and then having the type be a dispatch which then plugs a bunch
> of things in. However, it started to become hard to maintain like that. For
> one, I didn't know you could splat kwargs (ahh! That was horrifying to pass
> every kwarg!), but then I didn't have a good way to make the kwargs of the
> typed dispatch match the kwargs of the solver (since the kwargs were
> defined for two different functions), and any time there was a change I'd
> have to "propagate that change through all the solvers". I finally had
> enough of that and found your Parameters.jl package. Now you see that the
> solvers just use a type and unpack at the top. Very simple and easy to
> maintain. But if there's a better way of handling that issue with the
> dispatches, then I can put them back to how they were and pull them out to
> be separate solvers which could be called directly. Then if you want to pull
> out some of the new solving algorithms (I need better words for
> differentiating here: the solver is the method which acts on a type, while
> the solving algorithm is RK or Euler-Maruyama), that's pretty much what I'm
> doing with @def except not making them directly callable due to
> kwarg/default settings problems. It's just easier to have all of the
> pre/post processing in one way, but that could change by some smarter
> engineering.

I agree, keyword-args are a maintenance headache (and also potentially
bad for performance).  That was indeed one of the reason to make
Parameters.jl to generate the keyword constructors for the types
automatically.  Then use the types instead of keyword functions.

Concerning the implementation I was not very clear: I don't mean to
argue that you should not use any fancy type, etc., in your solvers
(maybe call them integrators?) but that you should also provide a dumb
interface for them (eventually) so they can be used outside of
DifferentialEquations.  (And yes, the @def does feel like unnecessary
metaprogramming).

>> > Once I had that structure, I wanted it all to be similar so that way
>> they could
>> > interact well. Even though they do currently use it, there will be
>> (S)PDE
>> > solvers which allow for using method of lines discretizations which then
>> just
>> > plug into (S)ODE solvers. And what about all of those C/Fortran
>> > solvers?
>>
>> (Do you know https:/

Re: [julia-users] Constructors for types with Nullable fields

2016-06-10 Thread Milan Bouchet-Valat
Le vendredi 10 juin 2016 à 00:56 -0700, Helge Eichhorn a écrit :
> Hi,
> 
> let's say I have the following type with two Nullable fields:
> 
> type WithNulls 
>     a::Nullable{Float64} 
>     b::Nullable{Float64}
> end
> 
> I now want the user to be able to create an instance of this type
> without caring about Nullables. For this I use a constructor with
> keyword arguments.
> 
> function WithNulls(;
>     a = Nullable{Float64}(),
>     b = Nullable{Float64}(),
> )
>     WithNulls(a, b)
> end
> 
> This works for Float64 but not for the other leaf types of Real.
> 
> # Works
> WithNulls(a=3.0)
> 
> # Does not work
> WithNulls(a=pi)
> 
> This can be fixed by adding the following methods to convert:
> 
> Base.convert{T<:Real}(::Type{Nullable{T}}, v::T) = Nullable{T}(v)
> Base.convert{T<:Real,S<:Real}(::Type{Nullable{T}}, v::S) =
> Nullable{T}(convert(T,v))
> 
> Finally the question:
> Should the above convert methods not be part of Base? I think
> converting between different Nullable{T<:Real} values might be a
> common use case. Is there a more elegant way to do this?
Yes. These methods have been added in the 0.5 development version, so
your example works directly there.


Regards



[julia-users] Constructors for types with Nullable fields

2016-06-10 Thread Helge Eichhorn
Hi,

let's say I have the following type with two Nullable fields:

type WithNulls 
a::Nullable{Float64} 
b::Nullable{Float64}
end


I now want the user to be able to create an instance of this type without 
caring about Nullables. For this I use a constructor with keyword arguments.


function WithNulls(;
a = Nullable{Float64}(),
b = Nullable{Float64}(),
)
WithNulls(a, b)
end


This works for Float64 but not for the other leaf types of Real.


# Works
WithNulls(a=3.0)

# Does not work
WithNulls(a=pi)


This can be fixed by adding the following methods to convert:


Base.convert{T<:Real}(::Type{Nullable{T}}, v::T) = Nullable{T}(v)
Base.convert{T<:Real,S<:Real}(::Type{Nullable{T}}, v::S) = Nullable{T}(
convert(T,v))


Finally the question:

Should the above convert methods not be part of Base? I think converting 
between different Nullable{T<:Real} values might be a common use case. Is 
there a more elegant way to do this?


[julia-users] Re: Pkg.update() fails on local package

2016-06-10 Thread Andreas Lobinger
Hello colleague,

i ran (twice) into a problem that looked similar by the symptoms. In both 
cases the git configuration of METADATA was corrupted, and in the recent 
case the remote setting was somehow rerouted to the package name. -> Look in 
.julia/v0.4/METADATA/.git/config. If you see your package name showing up, 
correct it (in my case i just copied the correct github URL).

If not, post your git config -l here.
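
The inspect-and-repair steps above can be sketched as shell commands. This assumes the v0.4 directory layout and that the correct remote is the official METADATA repository; adjust both for your setup.

```shell
# Assumed v0.4 layout; adjust the path for your Julia version.
METADATA_DIR="$HOME/.julia/v0.4/METADATA"
if [ -f "$METADATA_DIR/.git/config" ]; then
    # Inspect the config for a clobbered remote entry.
    cat "$METADATA_DIR/.git/config"
    # If the remote URL was replaced with the package name, restore it:
    git -C "$METADATA_DIR" remote set-url origin \
        https://github.com/JuliaLang/METADATA.jl
fi
```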