Github user sethah commented on the issue:
https://github.com/apache/spark/pull/15314
@zhengruifeng From a testing perspective, if we are going to accept
arbitrary numeric column types, then we should have unit tests which validate
that _all_ of the numeric types are handled correctly, not just `Int` and
`Long`. Right now, there is no quick way I can see to implement this. Ideally,
we'd write a helper method that we can easily apply in all the test suites,
accepting arbitrary estimators with the `HasWeightCol` trait. I'm hesitant to
suggest that here, since we really need an exhaustive helper test that we can
apply to all aspects of instance weighting in ML algos. The other option is to
test all numeric types inside every test suite that has instance weighting, but
this is going to duplicate a lot of code. [I've created a
Jira](https://issues.apache.org/jira/browse/SPARK-17772) for weighted instance
testing; we can either block this PR until it's implemented, or put in some
temporary tests for now. Thoughts?
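
For illustration only, a rough sketch of what such a shared helper might look like (the method name, signature, and the `check` comparison callback are all hypothetical, not an existing Spark test utility): cast the weight column to each numeric type, refit, and compare against a reference model.

```scala
import org.apache.spark.ml.{Estimator, Model}
import org.apache.spark.ml.param.shared.HasWeightCol
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types._

// Hypothetical helper: fit once with the original (Double) weight column as
// the reference, then refit with the weights cast to every numeric type and
// let the caller-supplied `check` assert the models agree.
def testNumericWeightTypes[M <: Model[M]](
    estimator: Estimator[M] with HasWeightCol,
    df: DataFrame,
    weightColName: String,
    check: (M, M) => Unit): Unit = {
  val numericTypes = Seq(ByteType, ShortType, IntegerType, LongType,
    FloatType, DoubleType, DecimalType(10, 0))
  estimator.set(estimator.weightCol, weightColName)
  val expected = estimator.fit(df)
  numericTypes.foreach { t =>
    val casted = df.withColumn(weightColName, col(weightColName).cast(t))
    check(expected, estimator.fit(casted))
  }
}
```

Each suite would then only need to supply its estimator and a model-equality check, rather than duplicating the per-type loop.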