Github user thvasilo commented on the pull request:

    https://github.com/apache/flink/pull/1849#issuecomment-212977748
  
    I did some testing and I think the problem has to do with the types that each scaler expects.
    
    `StandardScaler` has fit and transform operations for `DataSet`s of type `Vector`, `LabeledVector`, and `(T <: Vector, Double)`, while `MinMaxScaler` does not provide them for `(T <: Vector, Double)`. If you add those operations, the code runs fine (at least regarding your first comment).
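    
    To make the gap concrete, here is a minimal, self-contained sketch of the implicit-operation pattern the FlinkML pipeline uses, not the actual library code: a scaler only supports the input types for which an implicit `FitOperation` instance is in scope, so the tuple type fails until such an instance is added. `ToyMinMaxScaler` and everything inside it are made up for illustration.
    
    ```scala
    // Simplified stand-in for FlinkML's FitOperation type class: it decides which
    // training input types a given estimator can be fitted on.
    trait FitOperation[Self, Training] {
      def fit(instance: Self, input: Seq[Training]): Unit
    }
    
    class ToyMinMaxScaler {
      var minMax: Option[(Double, Double)] = None
    
      // fit only works for types T that have an implicit FitOperation in scope
      def fit[T](input: Seq[T])(implicit fitOp: FitOperation[ToyMinMaxScaler, T]): Unit =
        fitOp.fit(this, input)
    }
    
    object ToyMinMaxScaler {
      // Analogue of the Vector / LabeledVector operations MinMaxScaler already has
      implicit val fitFeatures: FitOperation[ToyMinMaxScaler, Double] =
        new FitOperation[ToyMinMaxScaler, Double] {
          def fit(instance: ToyMinMaxScaler, input: Seq[Double]): Unit =
            instance.minMax = Some((input.min, input.max))
        }
    
      // Analogue of the missing (T <: Vector, Double) operation: without an
      // instance like this, fitting on (feature, label) tuples is not supported
      implicit val fitFeatureLabelTuples: FitOperation[ToyMinMaxScaler, (Double, Double)] =
        new FitOperation[ToyMinMaxScaler, (Double, Double)] {
          def fit(instance: ToyMinMaxScaler, input: Seq[(Double, Double)]): Unit =
            instance.minMax = Some((input.map(_._1).min, input.map(_._1).max))
        }
    }
    ```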
    
    So I think this is a bug unrelated to this PR. The question becomes whether we want to support all three of these types. My recommendation would be to support only `Vector` and `LabeledVector`, and remove all operations that work on `(Vector, Double)` tuples. I will file a JIRA for that.
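    
    For completeness, users who currently hold `(Vector, Double)` tuples could switch to `LabeledVector` with a one-line `map`, so dropping the tuple operations shouldn't lose any functionality. A small sketch (the toy data and variable names are made up):
    
    ```scala
    import org.apache.flink.api.scala._
    import org.apache.flink.ml.common.LabeledVector
    import org.apache.flink.ml.math.DenseVector
    
    val env = ExecutionEnvironment.getExecutionEnvironment
    
    // Toy (features, label) tuples standing in for an existing DataSet[(Vector, Double)]
    val tuples = env.fromElements(
      (DenseVector(1.0, 2.0), 0.0),
      (DenseVector(3.0, 4.0), 1.0))
    
    // LabeledVector takes the label first, then the feature vector
    val labeled: DataSet[LabeledVector] = tuples.map {
      case (vector, label) => LabeledVector(label, vector)
    }
    ```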
    
    There is an argument to be had about whether some pre-processing steps are supervised (e.g. [PCA vs. LDA](https://stats.stackexchange.com/questions/161362/supervised-dimensionality-reduction)), but under the strict definition of a transformer we shouldn't care about the label, only the features, so that operation can be implemented at the `Transformer` level.

