Github user jkbradley commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11663#discussion_r56406664
  
    --- Diff: python/pyspark/ml/param/__init__.py ---
    @@ -275,23 +382,9 @@ def _set(self, **kwargs):
             """
             for param, value in kwargs.items():
                 p = getattr(self, param)
    -            if p.expectedType is None or type(value) == p.expectedType or value is None:
    -                self._paramMap[getattr(self, param)] = value
    -            else:
    -                try:
    -                    # Try and do "safe" conversions that don't lose information
    -                    if p.expectedType == float:
    -                        self._paramMap[getattr(self, param)] = float(value)
    -                    # Python 3 unified long & int
    -                    elif p.expectedType == int and type(value).__name__ == 'long':
    -                        self._paramMap[getattr(self, param)] = value
    -                    else:
    -                        raise Exception(
    -                            "Provided type {0} incompatible with type {1} for param {2}"
    -                            .format(type(value), p.expectedType, p))
    -                except ValueError:
    -                    raise Exception(("Failed to convert {0} to type {1} for param {2}"
    -                                     .format(type(value), p.expectedType, p)))
    +            if value is not None:
    +                value = p.typeConverter(value)
    +            self._paramMap[getattr(self, param)] = value
    --- End diff ---
    
    Reuse the value ```p``` here instead of calling ```getattr(self, param)``` a second time.
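
    For reference, a minimal standalone sketch of the suggested change — the lookup result ```p``` is reused as the map key rather than repeating ```getattr(self, param)```. The ```Param``` and ```HasParams``` stubs below are illustrative stand-ins for this comment only, not the real pyspark classes:

    ```python
    class Param:
        """Illustrative stub of a param descriptor with a typeConverter."""
        def __init__(self, name, typeConverter=lambda v: v):
            self.name = name
            self.typeConverter = typeConverter

    class HasParams:
        """Illustrative stub of a params holder."""
        maxIter = Param("maxIter", typeConverter=int)

        def __init__(self):
            self._paramMap = {}

        def _set(self, **kwargs):
            for param, value in kwargs.items():
                p = getattr(self, param)
                if value is not None:
                    value = p.typeConverter(value)
                # Reuse p instead of calling getattr(self, param) again
                self._paramMap[p] = value
            return self
    ```

    Besides avoiding the duplicate attribute lookup, this keeps a single source of truth for the key if the lookup logic ever changes.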


