HyukjinKwon commented on a change in pull request #25594:
[SPARK-28881][PYTHON][TESTS] Add a test to make sure toPandas with Arrow
optimization throws an exception per maxResultSize
URL: https://github.com/apache/spark/pull/25594#discussion_r317943848
##########
File path: python/pyspark/sql/tests/test_arrow.py
##########
@@ -421,6 +421,35 @@ def run_test(num_records, num_parts, max_records, use_delay=False):
run_test(*case)
+@unittest.skipIf(
+ not have_pandas or not have_pyarrow,
+ pandas_requirement_message or pyarrow_requirement_message)
+class MaxResultArrowTests(unittest.TestCase):
+ # These tests are separate as 'spark.driver.maxResultSize' configuration
+ # is a static configuration to Spark context.
+
+ @classmethod
+ def setUpClass(cls):
+ cls.spark = SparkSession.builder \
+ .master("local[4]") \
+ .appName(cls.__name__) \
+ .config("spark.driver.maxResultSize", "10k") \
+ .getOrCreate()
+
+ # Explicitly enable Arrow and disable fallback.
+ cls.spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")
+
+ cls.spark.conf.set("spark.sql.execution.arrow.pyspark.fallback.enabled", "false")
Review comment:
`spark.sql.execution.arrow.enabled` now has the alias
`spark.sql.execution.arrow.pyspark.enabled` as of
https://github.com/apache/spark/commit/d6632d185e147fcbe6724545488ad80dce20277e.
Also, `spark.sql.execution.arrow.pyspark.fallback.enabled` is set here only to
narrow down the test scope. The correct behaviour is a failure in the Arrow
optimized code path (although in the JIRA's case, it fails in the Arrow
optimized path first, and then fails again in the non-Arrow optimized path).
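To illustrate the alias relationship mentioned above, here is a minimal, purely
illustrative sketch of deprecated-key-to-canonical-key resolution. The `ALIASES`
table and `resolve` helper are assumptions for demonstration only; Spark's actual
resolution is implemented in `SQLConf` on the JVM side, not in Python.

```python
# Hypothetical sketch of config alias resolution. Only the two key names are
# taken from the comment above; everything else is illustrative.
ALIASES = {
    # deprecated key -> canonical key
    "spark.sql.execution.arrow.enabled": "spark.sql.execution.arrow.pyspark.enabled",
}

def resolve(conf, key, default=None):
    """Look up a config key, preferring the canonical name over a deprecated alias."""
    canonical = ALIASES.get(key, key)
    if canonical in conf:
        return conf[canonical]
    if key in conf:
        return conf[key]
    return default

# Setting the canonical key makes a read through the deprecated key see the
# same value, which is the behaviour an alias should provide.
conf = {"spark.sql.execution.arrow.pyspark.enabled": "true"}
print(resolve(conf, "spark.sql.execution.arrow.enabled"))  # "true"
```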