[ https://issues.apache.org/jira/browse/SPARK-37247?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-37247:
----------------------------------
        Parent: (was: SPARK-37246)
    Issue Type: Bug  (was: Sub-task)

> Failed test_create_nan_decimal_dataframe (pyspark.sql.tests.test_dataframe.DataFrameTests)
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-37247
>                 URL: https://issues.apache.org/jira/browse/SPARK-37247
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.3.0
>            Reporter: Xinrong Meng
>            Priority: Major
>
> {code:java}
>   File "/Users/xinrong.meng/spark/python/pyspark/sql/tests/test_dataframe.py", line 957, in test_create_nan_decimal_dataframe
>     self.spark.createDataFrame(data=[Decimal('NaN')], schema='decimal').collect(),
>   File "/Users/xinrong.meng/spark/python/pyspark/sql/dataframe.py", line 751, in collect
>     sock_info = self._jdf.collectToPython()
>   File "/Users/xinrong.meng/spark/python/lib/py4j-0.10.9.2-src.zip/py4j/java_gateway.py", line 1309, in __call__
>     return_value = get_return_value(
>   File "/Users/xinrong.meng/spark/python/pyspark/sql/utils.py", line 178, in deco
>     return f(*a, **kw)
>   File "/Users/xinrong.meng/spark/python/lib/py4j-0.10.9.2-src.zip/py4j/protocol.py", line 326, in get_return_value
>     raise Py4JJavaError(
> py4j.protocol.Py4JJavaError: An error occurred while calling o135.collectToPython.
> : org.apache.spark.SparkException: Job aborted due to stage failure: Task 3 in stage 2.0 failed 1 times, most recent failure: Lost task 3.0 in stage 2.0 (TID 7) (172.16.203.223 executor driver): net.razorvine.pickle.PickleException: problem construction object: java.lang.reflect.InvocationTargetException
> ...{code}

--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
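For reference, the `Decimal('NaN')` literal that the failing test passes to `createDataFrame` is a legal quiet NaN in Python's `decimal` module, so the input is well-formed on the Python side; the `PickleException` is raised when the JVM reconstructs the row. A minimal, Spark-free sketch of the value in question (no PySpark assumed):

```python
from decimal import Decimal

# The exact value used by test_create_nan_decimal_dataframe.
# Python's decimal module accepts 'NaN' and produces a quiet NaN,
# so construction succeeds locally; the reported failure happens
# later, when the pickled row is rebuilt on the JVM executor.
value = Decimal('NaN')
print(value.is_nan())     # True
print(value.is_finite())  # False
```

This narrows the bug to the Python-to-JVM deserialization path rather than to the Python `decimal` input itself.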