Github user jkbradley commented on a diff in the pull request:
https://github.com/apache/spark/pull/11663#discussion_r57024870
--- Diff: python/pyspark/ml/param/__init__.py ---
@@ -275,23 +425,9 @@ def _set(self, **kwargs):
"""
for param, value in kwargs.items():
p = getattr(self, param)
- if p.expectedType is None or type(value) == p.expectedType or
value is None:
- self._paramMap[getattr(self, param)] = value
- else:
- try:
- # Try and do "safe" conversions that don't lose
information
- if p.expectedType == float:
- self._paramMap[getattr(self, param)] = float(value)
- # Python 3 unified long & int
- elif p.expectedType == int and type(value).__name__ ==
'long':
- self._paramMap[getattr(self, param)] = value
- else:
- raise Exception(
- "Provided type {0} incompatible with type {1}
for param {2}"
- .format(type(value), p.expectedType, p))
- except ValueError:
- raise Exception(("Failed to convert {0} to type {1}
for param {2}"
- .format(type(value), p.expectedType,
p)))
+ if value is not None:
+ value = p.typeConverter(value)
--- End diff --
It would be nice to catch the error and augment it with a nicer message
saying which Param value was invalid.
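
A minimal sketch of what that could look like (assuming p.typeConverter raises
TypeError or ValueError on bad input; the exact exception type and message
wording below are illustrative, not part of this patch):

    def _set(self, **kwargs):
        """
        Sets user-supplied params (sketch only).
        """
        for param, value in kwargs.items():
            p = getattr(self, param)
            if value is not None:
                try:
                    value = p.typeConverter(value)
                except (TypeError, ValueError) as e:
                    # Re-raise with the offending Param named so the caller
                    # can tell which keyword argument failed conversion.
                    raise TypeError(
                        'Invalid param value given for param "%s". %s'
                        % (p.name, e))
            self._paramMap[p] = value

That way the error names the Param while preserving the converter's original
complaint about the value.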