Github user DylanGuedes commented on a diff in the pull request:
https://github.com/apache/spark/pull/20788#discussion_r175181523
--- Diff: python/pyspark/sql/dataframe.py ---
@@ -437,10 +437,12 @@ def hint(self, name, *parameters):
if not isinstance(name, str):
raise TypeError("name should be provided as str, got
{0}".format(type(name)))
+ allowed_types = (basestring, list, float, int)
--- End diff ---
Good catch - the Scala API accepts Any, but since I don't know the details of the
Python->Scala conversion, I can't say whether it is necessary to restrict to
primitive Python types. Anyway, I noticed that even dict works, so I
extended it.
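
For reference, a minimal standalone sketch of the kind of whitelist check being discussed (the helper name `_check_hint_parameters`, the Python 3 shim, and the error message are illustrative assumptions, not the exact code in the PR):

```python
import sys

if sys.version_info[0] >= 3:
    basestring = str  # shim: the diff targets Python 2, where basestring exists


def _check_hint_parameters(parameters):
    """Reject hint parameters whose types are not on the whitelist."""
    allowed_types = (basestring, list, float, int)
    for p in parameters:
        if not isinstance(p, allowed_types):
            raise TypeError(
                "all parameters should be in {0}, got {1} of type {2}".format(
                    allowed_types, p, type(p)))


# Example: a dict is rejected under this whitelist, even though the
# Scala side's Any signature might accept it after conversion.
_check_hint_parameters(["broadcast", 10])       # passes
try:
    _check_hint_parameters([{"key": "value"}])  # raises TypeError
except TypeError as e:
    print(e)
```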