Incorrect matrix rank via SVD
-----------------------------

                 Key: MATH-465
                 URL: https://issues.apache.org/jira/browse/MATH-465
             Project: Commons Math
          Issue Type: Bug
    Affects Versions: 2.1
         Environment: Windows XP Professional, Version 2002
            Reporter: Marisa Thoma


The getRank() function of SingularValueDecompositionImpl does not compute the 
rank correctly. This problem is probably related to the numerical stability 
problems mentioned in [MATH-327|https://issues.apache.org/jira/browse/MATH-327] 
and [MATH-320|https://issues.apache.org/jira/browse/MATH-320].

Example call with the standard matrix from R (rank 2):

{code:title=TestSVDRank.java}
import org.apache.commons.math.linear.Array2DRowRealMatrix;
import org.apache.commons.math.linear.RealMatrix;
import org.apache.commons.math.linear.SingularValueDecomposition;
import org.apache.commons.math.linear.SingularValueDecompositionImpl;

public class TestSVDRank {
    public static void main(String[] args) {
        double[][] d = { { 1, 1, 1 }, { 0, 0, 0 }, { 1, 2, 3 } };
        RealMatrix m = new Array2DRowRealMatrix(d);
        SingularValueDecomposition svd = new SingularValueDecompositionImpl(m);
        int r = svd.getRank();
        System.out.println("Rank: " + r);
    }
}
{code}

The rank is computed as 3. This problem also occurs for larger matrices. I 
discovered the problem when trying to replace the corresponding JAMA method.
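For comparison, JAMA (and R) treat a singular value as zero when it falls below 
a relative tolerance of max(rows, cols) * largest singular value * machine 
epsilon. Below is a minimal sketch of that convention applied to the values 
returned by getSingularValues(); the class name SVDRankWorkaround and the 
rank(...) helper are mine, not part of Commons Math, and whether this actually 
works around the bug depends on how accurately the small singular values are 
reported by the decomposition.

{code:title=SVDRankWorkaround.java}
import org.apache.commons.math.linear.Array2DRowRealMatrix;
import org.apache.commons.math.linear.RealMatrix;
import org.apache.commons.math.linear.SingularValueDecomposition;
import org.apache.commons.math.linear.SingularValueDecompositionImpl;

public class SVDRankWorkaround {

    /**
     * Rank computed from the singular values using the JAMA/R convention:
     * a singular value counts as zero if it is smaller than
     * max(rows, cols) * largest singular value * machine epsilon.
     */
    public static int rank(RealMatrix m) {
        SingularValueDecomposition svd = new SingularValueDecompositionImpl(m);
        double[] s = svd.getSingularValues(); // sorted in non-increasing order
        double eps = Math.pow(2.0, -52.0);    // machine epsilon for double
        double tol = Math.max(m.getRowDimension(), m.getColumnDimension())
                     * s[0] * eps;
        int rank = 0;
        for (double value : s) {
            if (value > tol) {
                rank++;
            }
        }
        return rank;
    }

    public static void main(String[] args) {
        double[][] d = { { 1, 1, 1 }, { 0, 0, 0 }, { 1, 2, 3 } };
        // Expected output for this matrix: Rank: 2
        System.out.println("Rank: " + rank(new Array2DRowRealMatrix(d)));
    }
}
{code}

If SingularValueDecompositionImpl returns an inflated value for the smallest 
singular value, a tolerance check like this will not help on its own and the 
decomposition itself needs fixing, as discussed in MATH-327 and MATH-320.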
