On 10/08/15 09:23, Oleg Goldshmidt wrote:
> A general comment. Asymptotic complexity has its uses but is very rarely
> relevant in practice. One would probably need a serious literature search
> just to find out on what scale asymptotic complexity becomes relevant
> for a given type of problem, and I will not be surprised if such a
> search gives no generic result, or no practically useful result. In
> practice, people keep solving problems that would require Hubble times
> asymptotically.
>
> There was no scale (or speed) requirement mentioned in the OP. Later,
> Shachar mentioned 273x273 as a target. [He also said it was likely to be
> sparse, so I wouldn't get stuck on O() estimates.] I admit I have not
> done much matrix manipulation for quite a while, but 273x273 seems quite
> large for a regular computer.
74,000 cells hardly seem beyond the realm of a recent[1] computer's power
to crunch. Even assuming a straight O(n^3) algorithm, that means handling
about 20 million cell operations. Do it in a non-compiled language and it
might still take you only a full second (a budget of roughly 100 cycles
just to perform a single XOR).
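To put rough numbers on that, here is a minimal Python sketch (a
non-compiled language, as above) that runs Gauss-Jordan elimination over
GF(2) on a random dense 273x273 matrix and counts the cell-level XORs.
The size, the 0/1-list representation, and the random input are my own
assumptions for illustration only:

import random
import time

N = 273  # the 273x273 size mentioned above

# Random dense matrix over GF(2), stored as plain lists of 0/1 ints.
a = [[random.randint(0, 1) for _ in range(N)] for _ in range(N)]

start = time.time()
xors = 0
for col in range(N):
    # Find a pivot row with a 1 in this column; skip the column if none.
    pivot = next((r for r in range(col, N) if a[r][col]), None)
    if pivot is None:
        continue
    a[col], a[pivot] = a[pivot], a[col]
    # XOR the pivot row into every other row that has a 1 in this column.
    for r in range(N):
        if r != col and a[r][col]:
            for c in range(col, N):
                a[r][c] ^= a[col][c]
                xors += 1

print("%d cell XORs in %.2f seconds" % (xors, time.time() - start))

The exact count depends on the matrix, but it stays on the order of the
20-million estimate above.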
>  It is not clear to me (I'll appreciate a
> comment just for curiosity's sake as well as for education) that a
> straightforward algorithm will be practical.
As I've said before, I've inverted, using Gaussian elimination, a 20x20
(400-cell) matrix using nothing more than vi. I'll add that the bigger
the matrix, the sparser it is (all rows except 12, regardless of matrix
size, are already diagonal). Even had that not been the case, however, I
fail to see how these matrix sizes are beyond the reach of a normal
computer's capabilities.
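
For the curious, here is a minimal sketch of that kind of inversion:
Gauss-Jordan elimination over GF(2), with each row packed into a Python
int so that a single "^" clears a whole row at once. The function name
and the bit-packed representation are mine, not anything from the
original problem:

def invert_gf2(rows, n):
    """Invert an n x n matrix over GF(2).

    rows is a list of n ints; bit j of rows[i] is the entry (i, j).
    Returns the inverse in the same form, or None if the matrix is
    singular over GF(2).
    """
    # Augment each row with the matching row of the identity in the
    # high n bits, i.e. work on [ A | I ].
    aug = [rows[i] | (1 << (n + i)) for i in range(n)]
    for col in range(n):
        # Find a pivot row with a 1 in this column.
        pivot = next((r for r in range(col, n) if (aug[r] >> col) & 1), None)
        if pivot is None:
            return None
        aug[col], aug[pivot] = aug[pivot], aug[col]
        # One integer XOR eliminates the entire row at once.
        for r in range(n):
            if r != col and (aug[r] >> col) & 1:
                aug[r] ^= aug[col]
    # The low n bits are now the identity; the high n bits hold the inverse.
    return [row >> n for row in aug]

# Tiny usage example: a 3x3 permutation matrix, rows given as bit masks.
m = [0b010, 0b001, 0b100]
print(invert_gf2(m, 3))  # prints [2, 1, 4]; this matrix is its own inverse

At 273x273 this does on the order of n^2 whole-row XORs, which a desktop
machine barely notices.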

Shachar
[1] Say, state of the art circa 2000.