I seriously doubt that, actually.  10^14 is a very large number.

As far as I know, the record for computing an SVD of a large sparse matrix
involved an input with about 2-3 x 10^9 non-zero elements.  You are saying
that your problem is roughly 30,000-50,000 times larger than that.  I think
you are going to have to wait for another 15 compute-speed doubling times
(2^15 is about 33,000) before this becomes a feasible computation.
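
Back-of-the-envelope, in Python (the 2.5e9 midpoint is just my reading of
the 2-3 x 10^9 record above, not a hard reference):

    import math

    record_nnz = 2.5e9    # midpoint of the 2-3 x 10^9 record mentioned above
    proposed_nnz = 1e14   # the size you are describing

    ratio = proposed_nnz / record_nnz   # ~40,000x
    doublings = math.log2(ratio)        # ~15.3 doubling times

    print(ratio, doublings)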

On Tue, Nov 23, 2010 at 11:55 AM, PEDRO MANUEL JIMENEZ RODRIGUEZ <
[email protected]> wrote:

> Well, this is the worst case, but it could happen.
>
> I'm not going to run any tests with this amount of data because that is
> impossible for me, but this project is part of a bigger one, and they
> would have enough space to deal with this amount of data.
>
>
> ----------------------------------------
> > From: [email protected]
> > Date: Mon, 22 Nov 2010 14:46:20 -0800
> > Subject: Re: Lanczos Algorithm
> > To: [email protected]
> >
> > That seems like a lot. That would mean you have 10^14 = 100 trillion
> > nonzero elements, which would take about 12.5 TB to store even at one
> > bit per non-zero element, and on the order of a petabyte in any
> > realistic sparse format (see the sketch below).
> >
> > Are there many totally zero rows?
> >
> > Can you estimate how many non-zero elements you have in all?
> >
>
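
To sanity-check the storage arithmetic quoted above, a minimal Python
sketch (the 12 bytes per entry is an assumption, a 4-byte column index
plus an 8-byte double value, not any particular library's format):

    nnz = 1e14                   # 100 trillion non-zero elements

    one_bit = nnz / 8            # bytes at 1 bit per entry
    twelve_bytes = nnz * 12      # bytes at 12 bytes per entry (assumed)

    print(one_bit / 1e12, "TB")       # 12.5 TB
    print(twelve_bytes / 1e15, "PB")  # 1.2 PB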
