aokolnychyi commented on code in PR #6760:
URL: https://github.com/apache/iceberg/pull/6760#discussion_r1136500540


##########
spark/v3.3/spark/src/main/scala/org/apache/spark/sql/execution/datasources/SparkExpressionConverter.scala:
##########
@@ -36,15 +35,15 @@ object SparkExpressionConverter {
     SparkFilters.convert(DataSourceStrategy.translateFilter(sparkExpression, supportNestedPredicatePushdown = true).get)
   }
 
-  @throws[AnalysisException]
-  def collectResolvedSparkExpression(session: SparkSession, tableName: String, where: String): Expression = {
+  def collectResolvedSparkExpressionOption(session: SparkSession,
+                                           tableName: String, where: String): Option[Expression] = {
     val tableAttrs = session.table(tableName).queryExecution.analyzed.output
     val unresolvedExpression = session.sessionState.sqlParser.parseExpression(where)
     val filter = Filter(unresolvedExpression, DummyRelation(tableAttrs))
     val optimizedLogicalPlan = session.sessionState.executePlan(filter).optimizedPlan
     optimizedLogicalPlan.collectFirst {
-      case filter: Filter => filter.condition
-    }.getOrElse(throw new AnalysisException("Failed to find filter expression"))
+      case filter: Filter => Some(filter.condition)

Review Comment:
   Would it be fair to assume we would get back an empty local table scan if the condition is evaluated to false? If so, what about modifying the logic in `collectFirst` to detect that case and return `false`?
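
A minimal sketch of that idea, assuming the optimizer rewrites a constant-false condition into an empty `LocalRelation`; the `Literal.TrueLiteral` fallback for the case where no `Filter` survives is an extra assumption added here for illustration, not something stated in this comment:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
import org.apache.spark.sql.catalyst.plans.logical.{Filter, LocalRelation}

def collectResolvedSparkExpression(session: SparkSession, tableName: String, where: String): Expression = {
  val tableAttrs = session.table(tableName).queryExecution.analyzed.output
  val unresolvedExpression = session.sessionState.sqlParser.parseExpression(where)
  // DummyRelation is the helper already defined in SparkExpressionConverter.scala
  val filter = Filter(unresolvedExpression, DummyRelation(tableAttrs))
  val optimizedLogicalPlan = session.sessionState.executePlan(filter).optimizedPlan
  optimizedLogicalPlan.collectFirst {
    // A Filter survived optimization: return its condition as before.
    case f: Filter => f.condition
    // Assumption: the optimizer folded the condition to false and collapsed the
    // plan into an empty local table scan, so report a constant false.
    case l: LocalRelation if l.data.isEmpty => Literal.FalseLiteral
  }.getOrElse(Literal.TrueLiteral) // assumption: no Filter left means the condition folded to true
}
```

That would let callers keep working with a plain `Expression` (a constant `false` for never-matching conditions) instead of unwrapping an `Option`.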



