Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/18972#discussion_r150223463
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/DataFrameSuite.scala ---
@@ -2081,10 +2081,8 @@ class DataFrameSuite extends QueryTest with SharedSQLContext {
}
withSQLConf(SQLConf.CODEGEN_FALLBACK.key -> "false") {
- val e = intercept[SparkException] {
- df.filter(filter).count()
- }.getMessage
- assert(e.contains("grows beyond 64 KB"))
+ // SPARK-21720 avoids an exception due to JVM code size limit
--- End diff ---
I think we should create a config for the threshold instead of hardcoding `1024`; then we can keep the test case here by setting the threshold to `Long.MaxValue`.
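
A rough sketch of what that could look like (purely illustrative: the config key `spark.sql.codegen.hugeMethodLimit`, its name, and its default here are assumptions, not the final API):

```scala
// Hypothetical sketch, not the final API. In SQLConf, something like:
val CODEGEN_HUGE_METHOD_LIMIT = buildConf("spark.sql.codegen.hugeMethodLimit")
  .doc("Threshold (in bytecode size) above which whole-stage codegen " +
    "gives up on a generated method.")
  .intConf
  .createWithDefault(1024)

// The test could then keep its original assertion by raising the threshold
// so the generated code is still compiled and hits the real JVM 64 KB limit:
withSQLConf(
    SQLConf.CODEGEN_FALLBACK.key -> "false",
    SQLConf.CODEGEN_HUGE_METHOD_LIMIT.key -> Int.MaxValue.toString) {
  val e = intercept[SparkException] {
    df.filter(filter).count()
  }.getMessage
  assert(e.contains("grows beyond 64 KB"))
}
```

(An `intConf` caps out at `Int.MaxValue` rather than `Long.MaxValue`; either is far above any realistic method size, so the effect is the same.)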
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]