No.
julia> d=rand(10,3)
10x3 Array{Float64,2}:
 0.933615  0.900428  0.601756
 0.226293  0.57221   0.638165
 0.466379  0.422646  0.0512318
 0.974844  0.96944   0.576568
 0.501572  0.708026  0.750814
 0.387061  0.652366  0.533456
 0.488878  0.527256  0.389493
 0.70682   0.372663  0.291916
 0.680118  0.18169   0.168652
 0.101872  0.16155   0.77697

julia> cor(d)
3x3 Array{Float64,2}:
  1.0       0.566044  -0.237259
  0.566044  1.0        0.373702
 -0.237259  0.373702   1.0

julia> rank(cor(d))
3
I think if you take the correlation of an n x m matrix, the result is an m x m matrix with rank = min(m, n - 1).
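For example, with fewer observations than columns the correlation matrix is rank deficient (the entries depend on the random draw, but the shape and the rank do not):

julia> d2 = rand(2,5);

julia> size(cor(d2))
(5,5)

julia> rank(cor(d2))
1

Here min(m, n - 1) = min(5, 1) = 1.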
On Monday, July 28, 2014 9:36:53 AM UTC-7, Stefan Karpinski wrote:
>
> Does this computation not always return a rank-1 matrix?
>
>
> On Mon, Jul 28, 2014 at 12:33 PM, John Myles White <[email protected]> wrote:
>
>> But how would you know the rank of the correlation matrix in advance?
>>
>> -- John
>>
>> On Jul 28, 2014, at 9:25 AM, Stefan Karpinski <[email protected]> wrote:
>>
>> This is the sort of thing that just begs for a custom representation of a
>> rank-1 matrix, which, fortunately, isn't terribly hard to implement in Julia.
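(For concreteness, here is a minimal sketch of the kind of rank-1 wrapper being suggested, written in the Julia 0.3-era syntax used elsewhere in this thread; the RankOne name and its methods are purely illustrative, not anything in Base:)

import Base: size, getindex, *

# Lazy representation of the rank-1 matrix u*v': store only the two vectors.
immutable RankOne{T} <: AbstractMatrix{T}
    u::Vector{T}
    v::Vector{T}
end

size(A::RankOne) = (length(A.u), length(A.v))
getindex(A::RankOne, i::Int, j::Int) = A.u[i] * A.v[j]

# (u*v')*x == u*dot(v,x): a matrix-vector product in O(n) time and memory,
# without ever materializing the dense matrix.
*(A::RankOne, x::AbstractVector) = A.u * dot(A.v, x)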
>>
>>
>> On Mon, Jul 28, 2014 at 12:08 PM, Tim Holy <[email protected]> wrote:
>>
>>> If they're sparse along dimension 1, you can at least save time computing
>>> the dot product of the two sparse vectors. But yes, the correlation matrix
>>> itself will be dense.
>>>
>>> --Tim
>>>
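(To make Tim's point concrete, a hedged sketch: take the pairwise dot products through a sparse X'X and only densify at the end. The corsparse name is made up for illustration, and it assumes floating-point data with no zero-variance columns:)

# Illustrative only: correlation of the columns of a sparse matrix.
# Only X'*X benefits from the sparsity; the result is a dense m x m matrix either way.
function corsparse(X::SparseMatrixCSC{Float64})
    n  = size(X, 1)
    mu = X' * fill(1/n, n)                # column means (dense length-m vector)
    G  = full(X' * X)                     # Gram matrix via a sparse-sparse product
    C  = (G - n * (mu * mu')) / (n - 1)   # covariance matrix
    s  = sqrt(diag(C))                    # column standard deviations
    C ./ (s * s')                         # correlation matrix
end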
>>> On Monday, July 28, 2014 11:23:31 AM Jiahao Chen wrote:
>>> > > I don't think sparse cor() is implemented and is falling back to the
>>> > > dense implementation.
>>> > Computing the correlation matrix is much like computing the outer
>>> > product of two sparse vectors. There will be massive fill-in and I
>>> > don't see how you can preserve sparsity without special knowledge
>>> > about the sparsity pattern.
>>> > Thanks,
>>> >
>>> > Jiahao Chen
>>> > Staff Research Scientist
>>> > MIT Computer Science and Artificial Intelligence Laboratory
>>> >
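(A quick way to see the fill-in being described; sprand gives a sparse 10^4-by-1 column here, and the exact counts depend on the random draw:)

u = sprand(10^4, 1, 0.01)   # sparse column, roughly 100 stored entries
nnz(u), nnz(u * u')         # the outer product stores nnz(u)^2 entries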
>>> > On Mon, Jul 28, 2014 at 11:12 AM, Stefan Karpinski <[email protected]> wrote:
>>> > > https://github.com/JuliaLang/julia/issues/new
>>> > >
>>> > >
>>> > > On Mon, Jul 28, 2014 at 10:06 AM, paul analyst <[email protected]> wrote:
>>> > >> Issue on github or on julia-dev groups?
>>> > >> Paul
>>> > >>
>>> > >> On Monday, July 28, 2014 12:05:27 PM UTC+2, Viral Shah wrote:
>>> > >>> Please file an issue. I don't think sparse cor() is implemented and
>>> > >>> is falling back to the dense implementation.
>>> > >>>
>>> > >>> -viral
>>> > >>>
>>> > >>> On Monday, July 28, 2014 1:41:55 PM UTC+5:30, paul analyst wrote:
>>> > >>>> Computing the correlation of a sparse array is very slow, and we run
>>> > >>>> out of memory on a dense array when we have 30,000 columns. How can
>>> > >>>> it be calculated quickly?
>>> > >>>>
>>> > >>>> julia> I=int32((rand(10^7)*9999999).+1);
>>> > >>>>
>>> > >>>> julia> J=int32((rand(10^7)*29999).+1);
>>> > >>>>
>>> > >>>> julia> V=int8((rand(10^7)*9).+1);
>>> > >>>>
>>> > >>>> julia> D=sparse(I,J,V);
>>> > >>>>
>>> > >>>> julia> @time cor(D[:,1:30]);
>>> > >>>> elapsed time: 23.806328476 seconds (2458875228 bytes allocated, 0.14% gc time)
>>> > >>>>
>>> > >>>> julia> @time cor(full(D[:,1:30]));
>>> > >>>> elapsed time: 4.494099126 seconds (2732042496 bytes allocated, 5.31% gc time)
>>> > >>>>
>>> > >>>> julia>
>>> > >>>>
>>> > >>>> Paul
>>>
>>>
>>
>>
>