[ https://issues.apache.org/jira/browse/SPARK-17001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15422663#comment-15422663 ]

Sean Owen commented on SPARK-17001:
-----------------------------------

Yes, that came up on the thread that prompted this: VectorAssembler could be 
told to emit only dense vectors. That's reasonable. The downside pointed out 
there is that the vectors would then be dense even when they could benefit 
from a sparse representation, just for compatibility with another component.

But maybe there's an argument that this is behavior that's generally useful to 
be able to turn off, because the same issue may come up elsewhere. If something 
doesn't work on sparse vectors, and the output of VectorAssembler is a mix of 
both depending on the values in the input, some pipelines could succeed or 
fail based on the input data, even though the input is conceptually entirely 
valid.
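To make the failure mode concrete, here is a minimal sketch in plain Python (not Spark code; the names `assemble` and `center_dense_only` are hypothetical stand-ins). The assembler-style function picks whichever representation is cheaper, and the downstream step rejects sparse input, so the same pipeline succeeds or fails depending only on the input values:

```python
# Sketch: an assembler-style chooser that picks the cheaper representation,
# and a downstream step that only accepts dense input (like StandardScaler
# with withMean=true).

def assemble(values):
    """Return ("dense", list) or ("sparse", (size, indices, nonzeros)),
    whichever is smaller."""
    nnz = sum(1 for v in values if v != 0.0)
    # A sparse vector stores an index plus a value per nonzero entry,
    # so it only pays off when well under half the entries are nonzero.
    if nnz * 2 < len(values):
        return ("sparse", (len(values),
                           [i for i, v in enumerate(values) if v != 0.0],
                           [v for v in values if v != 0.0]))
    return ("dense", list(values))

def center_dense_only(vec):
    """A stand-in for a component that rejects sparse vectors."""
    kind, data = vec
    if kind != "dense":
        raise ValueError("centering would densify a sparse vector")
    mean = sum(data) / len(data)
    return [v - mean for v in data]

center_dense_only(assemble([1.0, 2.0, 3.0, 4.0]))    # dense input -> works
# center_dense_only(assemble([0.0, 0.0, 0.0, 4.0])) # sparse input -> raises
```

The inputs are both conceptually valid four-element vectors; only their values differ.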

Of course, you can indeed manually make the vectors dense. That's not bad at 
all, and it's what we had done in the past, though it involves an extra copy.
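The manual workaround amounts to something like the following plain-Python sketch (not Spark code; `to_dense` is a hypothetical helper), which shows where the extra copy comes from:

```python
# Sketch of the manual densify workaround: expand each sparse vector into a
# full-length array before scaling, at the cost of one extra copy per vector.

def to_dense(size, indices, values):
    """Expand a sparse (size, indices, values) triple into a dense list."""
    dense = [0.0] * size          # the extra full-size allocation
    for i, v in zip(indices, values):
        dense[i] = v
    return dense

to_dense(5, [1, 4], [2.0, 3.0])   # -> [0.0, 2.0, 0.0, 0.0, 3.0]
```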

I had thought it simplest in this case to just let it work rather than fail, 
but I don't mind going in other directions with the solution.

> Enable standardScaler to standardize sparse vectors when withMean=True
> ----------------------------------------------------------------------
>
>                 Key: SPARK-17001
>                 URL: https://issues.apache.org/jira/browse/SPARK-17001
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 2.0.0
>            Reporter: Tobi Bosede
>            Priority: Minor
>
> When withMean = true, StandardScaler will not handle sparse vectors and 
> instead throws an exception. This is presumably because subtracting the mean 
> makes a sparse vector dense, which can be undesirable. 
> However, VectorAssembler generates vectors that may be a mix of sparse and 
> dense, even when the vectors are smallish, depending on their values. It's 
> common to feed its output into StandardScaler, which would then fail or not 
> depending on the input data if withMean = true. This is kind of surprising.
> StandardScaler should go ahead and operate on sparse vectors and subtract 
> the mean if explicitly asked to do so with withMean, on the theory that the 
> user knows what they are doing, and there is otherwise no way to make this 
> work.
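The centering behavior the issue describes can be sketched in plain Python (not Spark's implementation; `standardize` is a hypothetical illustration using the population standard deviation):

```python
# Sketch of why centering densifies: subtracting a nonzero mean turns every
# zero entry into a nonzero one, so the sparse representation stops paying off.

def standardize(values, with_mean=True):
    n = len(values)
    mean = sum(values) / n
    # population standard deviation, for illustration only
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [((v - mean) if with_mean else v) / std for v in values]

sparse_ish = [0.0, 0.0, 0.0, 4.0]   # 1 nonzero entry out of 4
centered = standardize(sparse_ish)  # every entry is now nonzero
```

With withMean = true the result has no zeros left, which is exactly the densification the current exception guards against; the request is to let the user opt into it anyway.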



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
