[ https://issues.apache.org/jira/browse/MAHOUT-6?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12571450#action_12571450 ]
Ted Dunning commented on MAHOUT-6:
----------------------------------
The different matrix APIs have differed on this.
The goal of easy extensibility implies that these operations either happen
in one of the abstract classes or live in a different place altogether.
Many of these algorithms take the form of a matrix implementation. For
instance, all of the factorizations (eigenvector, singular value, QR, or LU
decomposition) are nominally a matrix with internal parts that are sometimes
useful in their own right (eigenvalues) or that facilitate some common
operation (back-substitution for LU, least squares for QR). For all of them,
however, you can do multiplications and additions and such (just not the
destructive multiply and add).
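To make that concrete, here is a minimal sketch of the idea in Java; every name
in it (Matrix, LUMatrix, QRMatrix, solveLeastSquares, and so on) is just an
assumption for discussion, not a proposed final API:

// Sketch only: hypothetical names, not a committed Mahout API. The point is
// that a factorization is itself a matrix whose internal parts are exposed for
// the operations that make it worth keeping around.

interface Matrix {
  int numRows();
  int numCols();
  double get(int row, int column);

  Matrix times(Matrix other);    // non-destructive A B
  Matrix plus(Matrix other);     // non-destructive A + B
}

/**
 * An LU factorization viewed as a matrix: it multiplies and adds like any
 * other matrix, and additionally exposes its factors and the back-substitution
 * solve that is its main reason for existing.
 */
interface LUMatrix extends Matrix {
  Matrix getL();                 // lower-triangular factor
  Matrix getU();                 // upper-triangular factor
  double[] solve(double[] b);    // back-substitution for A x = b
}

/** Likewise for QR: the extra capability is a least-squares solve. */
interface QRMatrix extends Matrix {
  Matrix getQ();
  Matrix getR();
  double[] solveLeastSquares(double[] b);
}
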
As such, the approach of having special matrix types that are actually
decompositions is pretty attractive. The destructive ops can throw
unsupported-operation exceptions (which is the default at the highest abstract
matrix level anyway), and the decomposition types can add some additional API
elements for their special capabilities. Whether constructors or factory
methods are better is an open question in my mind.
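Building on the hypothetical interfaces in the sketch above, the
throw-by-default behavior and the factory-method alternative might look roughly
like this (again, every name is an assumption):

// Sketch only, reusing the hypothetical Matrix and QRMatrix interfaces above.

/**
 * Destructive operations are unsupported unless a concrete type opts in, so a
 * decomposition-backed matrix gets sensible defaults for free.
 */
abstract class AbstractMatrix implements Matrix {

  public Matrix timesEquals(Matrix other) {
    throw new UnsupportedOperationException("destructive multiply not supported");
  }

  public Matrix plusEquals(Matrix other) {
    throw new UnsupportedOperationException("destructive add not supported");
  }
}

/**
 * A QR-backed matrix exposed through a static factory rather than a public
 * constructor; either style would work, which is the open question above.
 */
abstract class QRBackedMatrix extends AbstractMatrix implements QRMatrix {

  protected final Matrix q;
  protected final Matrix r;

  protected QRBackedMatrix(Matrix q, Matrix r) {
    this.q = q;
    this.r = r;
  }

  /** Factory-method style: callers write QRBackedMatrix.decompose(a). */
  static QRBackedMatrix decompose(Matrix a) {
    // a real implementation would run Householder QR on 'a' and wrap the factors
    throw new UnsupportedOperationException("sketch only");
  }

  public Matrix getQ() { return q; }
  public Matrix getR() { return r; }
}

Either way, callers end up with an object that behaves like any other Matrix
but also answers solveLeastSquares().
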
On 2/22/08 6:38 AM, "Grant Ingersoll" <[EMAIL PROTECTED]> wrote:
> Any thoughts about where support belongs for things like calculating
> eigenvalues/vectors and/or *Rank algorithms that rely on iterative
> operations on a matrix/graph, etc.? Seems like they can be
> generalized, but not sure if they are first-class citizens on a Matrix
> implementation or not.
> Need a matrix implementation
> ----------------------------
>
> Key: MAHOUT-6
> URL: https://issues.apache.org/jira/browse/MAHOUT-6
> Project: Mahout
> Issue Type: New Feature
> Reporter: Ted Dunning
> Attachments: MAHOUT-6a.diff
>
>
> We need matrices for Mahout.
> An initial set of basic requirements includes:
> a) sparse and dense support are required
> b) row and column labels are important
> c) serialization for hadoop use is required
> d) reasonable floating point performance is required, but awesome FP is not
> e) the API should be simple enough to understand
> f) it should be easy to carve out sub-matrices for sending to different
> reducers
> g) a reasonable set of matrix operations should be supported; these should
>    eventually include (see the sketch after this list):
>      simple matrix-matrix, matrix-vector, and matrix-scalar linear algebra
>      operations: A B, A + B, A v, A + x, v + x, u + v, dot(u, v)
>      row and column sums
>      generalized level 2 and 3 BLAS primitives: alpha A B + beta C and
>      alpha A u + beta v
> h) easy and efficient iteration constructs, especially for sparse matrices
> i) easy to extend with new implementations
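
For concreteness, here is a standalone sketch (separate from the interfaces
above) of what the operations in (f), (g), and (h) might look like as Java
signatures; every name below is an assumption for discussion, not a committed
API:

// Sketch only: hypothetical signatures for requirements (f), (g), and (h).

interface Vector extends Iterable<Vector.Element> {
  int size();
  double get(int index);
  void set(int index, double value);

  double dot(Vector v);          // dot(u, v)
  Vector plus(Vector v);         // u + v
  Vector plus(double x);         // v + x

  /** Sparse-friendly iteration: implementations visit only non-zero elements. */
  interface Element {
    int index();
    double get();
  }
}

interface Matrix {
  Matrix times(Matrix b);        // A B
  Vector times(Vector v);        // A v
  Matrix plus(Matrix b);         // A + B
  Matrix plus(double x);         // A + x

  Vector rowSums();
  Vector columnSums();

  /** Generalized level 3 BLAS: alpha A B + beta C. */
  Matrix gemm(double alpha, Matrix b, double beta, Matrix c);

  /** Generalized level 2 BLAS: alpha A u + beta v. */
  Vector gemv(double alpha, Vector u, double beta, Vector v);

  /** Carve out a sub-matrix view for shipping to a different reducer (f). */
  Matrix viewPart(int rowOffset, int rowCount, int columnOffset, int columnCount);
}

As the question above suggests, iterative algorithms such as power iteration or
*Rank updates need little more than times(Vector), dot, and plus from this set,
so they can plausibly be written once against the abstract API rather than per
concrete implementation.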