Github user mgaido91 commented on a diff in the pull request:
https://github.com/apache/spark/pull/20410#discussion_r164267635
--- Diff: python/pyspark/ml/wrapper.py ---
@@ -118,10 +118,9 @@ def _transfer_params_to_java(self):
"""
Transforms the embedded params to the companion Java object.
"""
- paramMap = self.extractParamMap()
for param in self.params:
- if param in paramMap:
- pair = self._make_java_param_pair(param, paramMap[param])
+ if param in self._paramMap:
+ pair = self._make_java_param_pair(param,
self._paramMap[param])
--- End diff --
Thanks for your comment. There is just one problem: you _can't_ transfer
the default values to Scala without using `set`, because `setDefault` is
`protected` on the Scala side and therefore can't be called from Python.
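For context, `set` is the only param mutator we can reach through py4j, so
the transfer loop ends up as below. This mirrors the updated
`_transfer_params_to_java` in `wrapper.py`; the comment about Scala's
`isSet` behavior is my reading of the Scala side, not something asserted in
the diff:

```python
def _transfer_params_to_java(self):
    """
    Transforms the embedded params to the companion Java object.
    """
    for param in self.params:
        if param in self._paramMap:
            pair = self._make_java_param_pair(param, self._paramMap[param])
            # set() is the only mutator reachable from Python; anything
            # transferred this way is reported as explicitly set by Scala's
            # isSet(), which is why defaults can't be pushed through it.
            self._java_obj.set(pair)
```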
Moreover, we already have test cases which ensure that the defaults in
Python and Scala are the same. And since the user can't change a default, we
are on the safe side. As I said before, I think a good next step would be to
read the defaults from Scala into Python, which would guarantee that they are
always consistent.
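To make that next step concrete, here is a rough sketch of what reading the
defaults from Scala could look like. `hasDefault` and `getDefault` are the
public accessors on the Scala `Params` trait; the method name
`_transfer_defaults_from_java` is hypothetical, and the rest mirrors the
existing `_transfer_params_from_java` (using the `SparkContext` and
`_java2py` helpers already imported in `wrapper.py`):

```python
def _transfer_defaults_from_java(self):
    """Sketch: pull the Scala-side defaults into Python's default param map."""
    sc = SparkContext._active_spark_context
    for param in self.params:
        if self._java_obj.hasParam(param.name):
            java_param = self._java_obj.getParam(param.name)
            if self._java_obj.hasDefault(java_param):
                # getDefault returns a Scala Option, hence the .get()
                value = _java2py(sc, self._java_obj.getDefault(java_param).get())
                self._setDefault(**{param.name: value})
```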
Thanks for the suggestion, but I don't really agree that using
`getOrDefault` would make the intent clearer: on the contrary, I think it
might be confusing. I can use `isSet` instead, as you suggested, or I can add
a comment. What do you think?
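For reference, the `isSet` variant would be behaviourally identical, since
`isSet(param)` is defined in `pyspark.ml.param.Params` as
`param in self._paramMap`:

```python
for param in self.params:
    if self.isSet(param):
        pair = self._make_java_param_pair(param, self._paramMap[param])
        self._java_obj.set(pair)
```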
---