Github user srowen commented on the pull request:
https://github.com/apache/spark/pull/6905#issuecomment-113542976
You would at least have to change the docs in `Evaluator` to say that
`evaluate` no longer necessarily returns larger values for better evaluations.
I don't see that you changed `RegressionEvaluator` to flip the logic, though?
There's a bigger problem, though: 3 of the 4 metrics in
`RegressionEvaluator` are worse when larger (RMSE, MSE, MAE), but R^2 is not,
so this change would still leave a similar bug.
Another possibility is to invert the result of RMSE, MSE, MAE. For eval
purposes, their relative ranking is all that matters, so returning 1/x as the
evaluation criterion is fine, for example. That would let you fully fix this
without any API change.
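To illustrate the idea (a minimal sketch in Python rather than the project's Scala, with hypothetical names — not the actual `RegressionEvaluator` code): any strictly decreasing transform of an error metric, such as 1/x or negation, reverses its ordering, so model selection by "largest evaluation wins" still picks the model with the smallest error.

```python
# Sketch of the suggested workaround: RMSE, MSE, and MAE are
# "smaller is better", while the Evaluator contract assumes
# "larger is better". Transforming the raw error with a strictly
# decreasing function preserves relative ranking, so argmax still
# selects the best model and no API change is needed.

def evaluation(metric_name, raw_value):
    # For strictly positive errors, 1/x reverses the ordering;
    # negation (-x) does the same and also handles a zero error.
    if metric_name in ("rmse", "mse", "mae"):
        return -raw_value   # larger evaluation is now better
    return raw_value        # r2 is already larger-is-better

# Relative ranking is all that matters: the model with the smallest
# RMSE gets the largest evaluation value.
errors = {"modelA": 0.9, "modelB": 0.4, "modelC": 1.3}
best = max(errors, key=lambda m: evaluation("rmse", errors[m]))
```

Here `best` is `"modelB"`, the model with the lowest RMSE, even though selection used `max`.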