I think that there are two rationales FOR the current behavior and at least
one argument AGAINST:

FOR:

It can be hard for an implementation to know the exact number of non-default
elements.

There is an analogy with the size() method.

It is the existing behavior, and users may already depend on it.

AGAINST:

The size() method was silly to begin with; rowSize() and columnSize() are
far more useful.  Returning an array implies either a copy, which means a
lot of allocation, or returning a reference to internal state, which is
worse.
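To make the allocation point concrete, here is a minimal sketch (not Mahout's actual Matrix interface; the class and field names are illustrative) contrasting an array-returning size() with the scalar rowSize()/columnSize() accessors:

```java
public class SizeDemo {
  // Hypothetical matrix holding only its dimensions, to contrast an
  // array-returning size() with scalar accessors.  The method names follow
  // the thread; the class itself is illustrative, not Mahout code.
  static class SimpleMatrix {
    private final int rows, cols;

    SimpleMatrix(int rows, int cols) {
      this.rows = rows;
      this.cols = cols;
    }

    // Array form: either allocates a fresh copy on every call (as here),
    // or hands out a reference to internal state.
    int[] size() {
      return new int[] { rows, cols };
    }

    // Scalar form: no allocation, and nothing internal escapes.
    int rowSize() {
      return rows;
    }

    int columnSize() {
      return cols;
    }
  }

  public static void main(String[] args) {
    SimpleMatrix m = new SimpleMatrix(3, 5);
    System.out.println(m.rowSize());          // 3
    System.out.println(m.columnSize());       // 5
    // Each size() call allocates a new two-element array, so two calls
    // never return the same object.
    System.out.println(m.size() != m.size()); // true
  }
}
```

Every caller that only wants one dimension still pays for the full array allocation in the size() form, which is the cost being objected to above.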

The worst cost is probably for the row-sparse representations:
getNumNondefaultElements() would have to iterate over all rows, calling
numNonDefault() on each row.  That isn't so bad, especially if somebody is
about to iterate over all those elements anyway.
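The row-by-row summation described above can be sketched as follows. This is a hypothetical row-sparse representation, not Mahout's; the method names numNonDefault() and getNumNondefaultElements() echo the thread, while the map-of-maps storage is an assumption made for the example:

```java
import java.util.HashMap;
import java.util.Map;

public class SparseCountDemo {
  // Hypothetical row-sparse matrix: each stored row maps column index to
  // value, and absent entries hold the default value.
  static class RowSparseMatrix {
    private final Map<Integer, Map<Integer, Double>> rows = new HashMap<>();

    void set(int row, int col, double value) {
      rows.computeIfAbsent(row, r -> new HashMap<>()).put(col, value);
    }

    // Per-row count is cheap: each row already knows its own size.
    int numNonDefault(int row) {
      Map<Integer, Double> r = rows.get(row);
      return r == null ? 0 : r.size();
    }

    // The matrix-wide count must visit every stored row and sum as it
    // goes -- the O(rows) pass discussed above.
    int getNumNondefaultElements() {
      int total = 0;
      for (Map<Integer, Double> r : rows.values()) {
        total += r.size();
      }
      return total;
    }
  }

  public static void main(String[] args) {
    RowSparseMatrix m = new RowSparseMatrix();
    m.set(0, 2, 1.0);
    m.set(0, 7, 3.5);
    m.set(4, 1, -2.0);
    System.out.println(m.getNumNondefaultElements()); // prints 3
  }
}
```

If the caller is about to walk all non-default elements anyway, this counting pass touches exactly the same rows that the subsequent iteration would, so its cost is amortized away.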


SUMMARY:

I wouldn't mind a change, but others should speak up if they care about the
current behavior.

On Thu, Oct 21, 2010 at 11:33 PM, Alexander Hans <[email protected]> wrote:

> Hey,
>
> Why does Matrix.getNumNondefaultElements() return an int[2] containing the
> number of rows and the number of columns that have non-default elements? I
> would expect it to return just an int containing the actual number of
> non-default elements. From the number of rows and columns, I think one can
> do nothing except derive an upper bound on the total number as
> numRows*numCols. If a user of Matrix wants to know the actual total number
> of non-default elements, he/she currently has to iterate over the whole
> matrix and count. The currently implemented version seems to be used
> nowhere; the only reference I've found is testSize(), which tests that
> method.
>
> So can I go ahead and change the interface and the implementations to
> return the actual number of non-default elements?
>
>
> Thanks,
>
> Alex
>
