berkaysynnada commented on code in PR #7793:
URL: https://github.com/apache/arrow-datafusion/pull/7793#discussion_r1359621688


##########
datafusion/physical-expr/src/analysis.rs:
##########
@@ -184,24 +192,17 @@ fn shrink_boundaries(
         }
     });
     let graph_nodes = graph.gather_node_indices(&[expr.clone()]);
-    let (_, root_index) = graph_nodes.first().ok_or_else(|| {
-        DataFusionError::Internal("Error in constructing predicate graph".to_string())
-    })?;
-    let final_result = graph.get_interval(*root_index);
+    // Since propagation result was successful, the graph has at least one element.
+    // An empty check is also done at the outer scope, do not repeat it here.
+    let (_, root_index) = graph_nodes[0];
+    let final_result = graph.get_interval(root_index);
 
-    // If during selectivity calculation we encounter an error, use 1.0 as cardinality estimate
-    // safest estimate(e.q largest possible value).
     let selectivity = calculate_selectivity(
         &final_result.lower.value,
         &final_result.upper.value,
         &target_boundaries,
         &initial_boundaries,
-    )
-    .unwrap_or(1.0);
-
-    if !(0.0..=1.0).contains(&selectivity) {

Review Comment:
   While calculating new intervals on the PhysicalExpr graph, we intersect the original value of every node with the calculated result, which ensures an interval cannot expand after the analysis. I believe we can safely assume `initial_boundaries` are always at least as wide as `target_boundaries`.
   
   As far as I remember, there was such a check in the initial implementation of `AnalysisContext`, but it has become unnecessary now.
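   
   As a minimal standalone sketch of that invariant (the `Interval` type and methods below are illustrative only, not the actual DataFusion API): because the target interval is always the intersection of the initial interval with the propagated result, its cardinality can never exceed that of the initial one, so the selectivity ratio lands in `[0, 1]` by construction and the removed range check is redundant.
   
   ```rust
   // Toy closed integer interval; names are hypothetical, not DataFusion's API.
   #[derive(Clone, Copy, Debug, PartialEq)]
   struct Interval {
       lower: i64,
       upper: i64,
   }
   
   impl Interval {
       /// Intersection of two closed intervals; `None` when they are disjoint.
       fn intersect(self, other: Interval) -> Option<Interval> {
           let lower = self.lower.max(other.lower);
           let upper = self.upper.min(other.upper);
           (lower <= upper).then_some(Interval { lower, upper })
       }
   
       /// Number of integer values contained in the interval.
       fn cardinality(self) -> u64 {
           (self.upper - self.lower + 1) as u64
       }
   }
   
   fn main() {
       let initial = Interval { lower: 0, upper: 100 };    // original column bounds
       let propagated = Interval { lower: 30, upper: 250 }; // result from the graph
   
       // The target boundaries are always `initial ∩ propagated`, so they can
       // never be wider than `initial`.
       let target = initial.intersect(propagated).unwrap();
       assert!(target.cardinality() <= initial.cardinality());
   
       // Hence the selectivity estimate is guaranteed to stay within [0, 1].
       let selectivity = target.cardinality() as f64 / initial.cardinality() as f64;
       assert!((0.0..=1.0).contains(&selectivity));
       println!("selectivity = {selectivity}");
   }
   ```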


