[ https://issues.apache.org/jira/browse/SPARK-41872?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sandeep Singh updated SPARK-41872:
----------------------------------
    Description: 
{code:java}
row = self.spark.createDataFrame([("Alice", None, None, None)], schema).fillna(True).first()
self.assertEqual(row.age, None){code}
{code:java}
Traceback (most recent call last):
  File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/tests/test_dataframe.py", line 231, in test_fillna
    self.assertEqual(row.age, None)
AssertionError: nan != None{code}

  was:
{code:java}
df = self.spark.range(10e10).toDF("id")
such_a_nice_list = ["itworks1", "itworks2", "itworks3"]
hinted_df = df.hint("my awesome hint", 1.2345, "what", such_a_nice_list){code}
{code:java}
Traceback (most recent call last):
  File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/tests/test_dataframe.py", line 556, in test_extended_hint_types
    hinted_df = df.hint("my awesome hint", 1.2345, "what", such_a_nice_list)
  File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/connect/dataframe.py", line 482, in hint
    raise TypeError(
TypeError: param should be a int or str, but got float 1.2345{code}


> Fix DataFrame fillna with bool
> ------------------------------
>
>                 Key: SPARK-41872
>                 URL: https://issues.apache.org/jira/browse/SPARK-41872
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect
>    Affects Versions: 3.4.0
>            Reporter: Sandeep Singh
>            Priority: Major
>
> {code:java}
> row = self.spark.createDataFrame([("Alice", None, None, None)], schema).fillna(True).first()
> self.assertEqual(row.age, None){code}
> {code:java}
> Traceback (most recent call last):
>   File "/Users/s.singh/personal/spark-oss/python/pyspark/sql/tests/test_dataframe.py", line 231, in test_fillna
>     self.assertEqual(row.age, None)
> AssertionError: nan != None{code}

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
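The semantics the failing assertion expects: `fillna(True)` should fill `None` only in boolean columns and leave `None` in non-boolean columns (such as the numeric `age`) untouched, rather than coercing it to `nan`. A minimal pure-Python sketch of that type-dispatch rule, assuming a simplified row/schema model (the `fill_na` helper and the dtype names here are illustrative, not Spark's implementation):

```python
def fill_na(rows, dtypes, value):
    """Fill None cells only in columns whose dtype matches the fill
    value's Python type; all other columns keep their None as-is."""
    # Map a Python fill value's type to the one column dtype it may fill.
    # Note type(True) is exactly bool, so a bool fill never targets bigint.
    target = {bool: "boolean", int: "bigint", float: "double", str: "string"}[type(value)]
    return [
        tuple(
            value if cell is None and dtypes[i] == target else cell
            for i, cell in enumerate(row)
        )
        for row in rows
    ]

# Columns modeled after the test row: name, age, height, boolean flag.
dtypes = ["string", "bigint", "double", "boolean"]
rows = [("Alice", None, None, None)]

# Only the boolean column is filled; age stays None instead of nan.
assert fill_na(rows, dtypes, True) == [("Alice", None, None, True)]
```

The deliberate design choice is the exact `type(value)` lookup: because `bool` is a subclass of `int` in Python, an `isinstance(value, int)` check would wrongly route `True` into integer (and, after casting, floating-point) columns, which is consistent with the `nan != None` failure seen above.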