GitHub user heary-cao opened a pull request:
https://github.com/apache/spark/pull/18961
[SQL] Handle nondeterministic expressions correctly for filter predicates
## What changes were proposed in this pull request?
Currently, `InterpretedPredicate` does not handle nondeterministic expressions
correctly: when a filter predicate contains a nondeterministic expression
(e.g. `rand()`), evaluating it throws an exception because the expression is
never initialized before `eval` is called. This PR fixes the problem by adding
an `initialize` method to `InterpretedPredicate`.

The exception thrown is:
```
java.lang.IllegalArgumentException: requirement failed: Nondeterministic expression org.apache.spark.sql.catalyst.expressions.Rand should be initialized before eval.
  at scala.Predef$.require(Predef.scala:224)
  at org.apache.spark.sql.catalyst.expressions.Nondeterministic$class.eval(Expression.scala:291)
  at org.apache.spark.sql.catalyst.expressions.RDG.eval(randomExpressions.scala:34)
  at org.apache.spark.sql.catalyst.expressions.BinaryExpression.eval(Expression.scala:415)
  at org.apache.spark.sql.catalyst.expressions.InterpretedPredicate.eval(predicates.scala:38)
  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogUtils$$anonfun$prunePartitionsByFilter$1.apply(ExternalCatalogUtils.scala:158)
  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogUtils$$anonfun$prunePartitionsByFilter$1.apply(ExternalCatalogUtils.scala:157)
  at scala.collection.immutable.Stream.filter(Stream.scala:519)
  at scala.collection.immutable.Stream.filter(Stream.scala:202)
  at org.apache.spark.sql.catalyst.catalog.ExternalCatalogUtils$.prunePartitionsByFilter(ExternalCatalogUtils.scala:157)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$listPartitionsByFilter$1.apply(HiveExternalCatalog.scala:1129)
  at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$listPartitionsByFilter$1.apply(HiveExternalCatalog.scala:1119)
  at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
  at org.apache.spark.sql.hive.HiveExternalCatalog.listPartitionsByFilter(HiveExternalCatalog.scala:1119)
  at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listPartitionsByFilter(SessionCatalog.scala:925)
  at org.apache.spark.sql.execution.datasources.CatalogFileIndex.filterPartitions(CatalogFileIndex.scala:73)
  at org.apache.spark.sql.execution.datasources.PruneFileSourcePartitions$$anonfun$apply$1.applyOrElse(PruneFileSourcePartitions.scala:60)
  at org.apache.spark.sql.execution.datasources.PruneFileSourcePartitions$$anonfun$apply$1.applyOrElse(PruneFileSourcePartitions.scala:27)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:267)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$2.apply(TreeNode.scala:267)
  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
  at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:266)
  at org.apache.spark.sql.execution.datasources.PruneFileSourcePartitions$.apply(PruneFileSourcePartitions.scala:27)
  at org.apache.spark.sql.execution.datasources.PruneFileSourcePartitions$.apply(PruneFileSourcePartitions.scala:26)
```
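To illustrate the pattern behind the fix, here is a minimal, self-contained
sketch in Scala. Note that `Expression`, `Nondeterministic`, `Rand`,
`Literal`, `GreaterThan`, and `InterpretedPredicate` below are simplified
stand-ins for illustration only, not Spark's actual Catalyst classes: a
nondeterministic expression must be initialized with a partition index before
`eval`, and the interpreted predicate gains an `initialize` method that walks
the expression tree to initialize every such node.

```scala
// Simplified model of the expression tree (hypothetical, not Spark's API).
abstract class Expression {
  def children: Seq[Expression] = Nil
  def eval(): Any
}

// Marker trait for expressions that need per-partition initialization.
trait Nondeterministic {
  def initialize(partitionIndex: Int): Unit
}

// Stand-in for Rand: throws if evaluated before initialization, mirroring
// the IllegalArgumentException in the stack trace above.
class Rand(seed: Long) extends Expression with Nondeterministic {
  private var rng: scala.util.Random = null
  override def initialize(partitionIndex: Int): Unit =
    rng = new scala.util.Random(seed + partitionIndex)
  override def eval(): Any = {
    require(rng != null,
      "Nondeterministic expression Rand should be initialized before eval.")
    rng.nextDouble()
  }
}

case class Literal(value: Double) extends Expression {
  override def eval(): Any = value
}

case class GreaterThan(left: Expression, right: Expression) extends Expression {
  override def children: Seq[Expression] = Seq(left, right)
  override def eval(): Any =
    left.eval().asInstanceOf[Double] > right.eval().asInstanceOf[Double]
}

// The essence of the fix: an initialize method on the interpreted predicate
// that recursively initializes every Nondeterministic node before evaluation.
case class InterpretedPredicate(expression: Expression) {
  def initialize(partitionIndex: Int): Unit = {
    def walk(e: Expression): Unit = {
      e match {
        case n: Nondeterministic => n.initialize(partitionIndex)
        case _                   => ()
      }
      e.children.foreach(walk)
    }
    walk(expression)
  }
  def eval(): Boolean = expression.eval().asInstanceOf[Boolean]
}

object Demo {
  def main(args: Array[String]): Unit = {
    val pred = InterpretedPredicate(GreaterThan(new Rand(42L), Literal(0.5)))
    pred.initialize(0) // without this call, pred.eval() would throw
    println(pred.eval())
  }
}
```

Callers such as partition pruning would invoke `initialize` once before
evaluating the predicate against each row, which is what the stack trace above
shows was missing.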
## How was this patch tested?
Covered by existing test cases; new test cases are also added.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/heary-cao/spark Predicate
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/18961.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #18961
----
commit f42ede0c4bb68534b9a28dca322a833d2b253f10
Author: caoxuewen <[email protected]>
Date: 2017-08-16T10:12:26Z
nondeterministic expressions correctly for filter predicates
----