[ https://issues.apache.org/jira/browse/HIVE-25356?focusedWorklogId=627615&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-627615 ]

ASF GitHub Bot logged work on HIVE-25356:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 26/Jul/21 09:42
            Start Date: 26/Jul/21 09:42
    Worklog Time Spent: 10m 
      Work Description: zabetak commented on a change in pull request #2504:
URL: https://github.com/apache/hive/pull/2504#discussion_r676448741



##########
File path: ql/src/java/org/apache/hadoop/hive/ql/optimizer/calcite/rules/jdbc/JDBCAbstractSplitFilterRule.java
##########
@@ -127,7 +127,9 @@ public void onMatch(RelOptRuleCall call, SqlDialect dialect) {
     ArrayList<RexCall> validJdbcNode = visitor.getValidJdbcNode();
     ArrayList<RexCall> invalidJdbcNode = visitor.getInvalidJdbcNode();
 
-    assert validJdbcNode.size() != 0 && invalidJdbcNode.size() != 0;
+    if( validJdbcNode.size() == 0 || invalidJdbcNode.size() == 0) {
+      return;
+    }

Review comment:
       Did you hit this assertion error? My understanding is that the previous
code guarantees that this should never happen, so the assertion seems valid.
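
For context, a minimal standalone sketch of the two styles being discussed; the names are borrowed from the diff, but this is not the actual Hive code. The point is that `assert` only fires when the JVM runs with `-ea` (it documents an invariant), whereas the proposed guard silently skips the split whenever either list is empty.

{code:java}
// Minimal sketch with assumed signatures; it only illustrates assert-vs-guard semantics.
import java.util.List;

class SplitFilterStyles {

  // Previous style: states the invariant; throws AssertionError only when assertions are enabled (-ea).
  static void splitWithAssert(List<?> validJdbcNode, List<?> invalidJdbcNode) {
    assert !validJdbcNode.isEmpty() && !invalidJdbcNode.isEmpty();
    // ... perform the split ...
  }

  // Proposed style: defensively skips the transformation when the invariant does not hold.
  static void splitWithGuard(List<?> validJdbcNode, List<?> invalidJdbcNode) {
    if (validJdbcNode.isEmpty() || invalidJdbcNode.isEmpty()) {
      return;
    }
    // ... perform the split ...
  }
}
{code}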






Issue Time Tracking
-------------------

    Worklog Id:     (was: 627615)
    Time Spent: 20m  (was: 10m)

> JDBCSplitFilterAboveJoinRule's onMatch method throws exception 
> ---------------------------------------------------------------
>
>                 Key: HIVE-25356
>                 URL: https://issues.apache.org/jira/browse/HIVE-25356
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Soumyakanti Das
>            Assignee: Soumyakanti Das
>            Priority: Major
>              Labels: pull-request-available
>          Time Spent: 20m
>  Remaining Estimate: 0h
>
>  
>  The stack trace below is produced at
> [JDBCAbstractSplitFilterRule.java#L181|https://github.com/apache/hive/blob/master/ql/src/java/org/apache/hadoop/hive/ql/optimizer/calcite/rules/jdbc/JDBCAbstractSplitFilterRule.java#L181].
>  In the onMatch method, a HiveFilter is being cast to a HiveJdbcConverter, which fails with a ClassCastException.
> {code:java}
> java.lang.ClassCastException: org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveFilter cannot be cast to org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.jdbc.HiveJdbcConverter
> java.lang.ClassCastException: org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveFilter cannot be cast to org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.jdbc.HiveJdbcConverter
>  at org.apache.hadoop.hive.ql.optimizer.calcite.rules.jdbc.JDBCAbstractSplitFilterRule$JDBCSplitFilterAboveJoinRule.onMatch(JDBCAbstractSplitFilterRule.java:181)
>  at org.apache.calcite.plan.AbstractRelOptPlanner.fireRule(AbstractRelOptPlanner.java:333)
>  at org.apache.calcite.plan.hep.HepPlanner.applyRule(HepPlanner.java:542)
>  at org.apache.calcite.plan.hep.HepPlanner.applyRules(HepPlanner.java:407)
>  at org.apache.calcite.plan.hep.HepPlanner.executeInstruction(HepPlanner.java:271)
>  at org.apache.calcite.plan.hep.HepInstruction$RuleCollection.execute(HepInstruction.java:74)
>  at org.apache.calcite.plan.hep.HepPlanner.executeProgram(HepPlanner.java:202)
>  at org.apache.calcite.plan.hep.HepPlanner.findBestExp(HepPlanner.java:189)
>  at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.executeProgram(CalcitePlanner.java:2440)
>  at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.executeProgram(CalcitePlanner.java:2406)
>  at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.applyPostJoinOrderingTransform(CalcitePlanner.java:2326)
>  at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1735)
>  at org.apache.hadoop.hive.ql.parse.CalcitePlanner$CalcitePlannerAction.apply(CalcitePlanner.java:1588)
>  at org.apache.calcite.tools.Frameworks.lambda$withPlanner$0(Frameworks.java:131)
>  at org.apache.calcite.prepare.CalcitePrepareImpl.perform(CalcitePrepareImpl.java:914)
>  at org.apache.calcite.tools.Frameworks.withPrepare(Frameworks.java:180)
>  at org.apache.calcite.tools.Frameworks.withPlanner(Frameworks.java:126)
>  at org.apache.hadoop.hive.ql.parse.CalcitePlanner.logicalPlan(CalcitePlanner.java:1340)
>  at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:559)
>  at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:12512)
>  at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:452)
>  at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:316)
>  at org.apache.hadoop.hive.ql.parse.ExplainSemanticAnalyzer.analyzeInternal(ExplainSemanticAnalyzer.java:175)
>  at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:316)
>  at org.apache.hadoop.hive.ql.Compiler.analyze(Compiler.java:223)
>  at org.apache.hadoop.hive.ql.Compiler.compile(Compiler.java:105)
>  at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:500)
>  at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:453)
>  at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:417)
>  at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:411)
>  at org.apache.hadoop.hive.ql.reexec.ReExecDriver.compileAndRespond(ReExecDriver.java:125)
>  at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:229)
>  at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:256)
>  at org.apache.hadoop.hive.cli.CliDriver.processCmd1(CliDriver.java:201)
>  at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:127)
>  at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
>  at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:353)
>  at org.apache.hadoop.hive.ql.QTestUtil.executeClientInternal(QTestUtil.java:744)
>  at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:714)
>  at org.apache.hadoop.hive.cli.control.CoreCliDriver.runTest(CoreCliDriver.java:170)
>  at org.apache.hadoop.hive.cli.control.CliAdapter.runTest(CliAdapter.java:157)
>  at org.apache.hadoop.hive.cli.TestMiniLlapLocalCliDriver.testCliDriver(TestMiniLlapLocalCliDriver.java:62)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>  at java.lang.reflect.Method.invoke(Method.java:498)
>  at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
>  at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>  at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
>  at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>  at org.apache.hadoop.hive.cli.control.CliAdapter$2$1.evaluate(CliAdapter.java:135)
>  at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>  at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
>  at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
>  at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
>  at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
>  at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
>  at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
>  at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
>  at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
>  at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
>  at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
>  at org.junit.runners.Suite.runChild(Suite.java:128)
>  at org.junit.runners.Suite.runChild(Suite.java:27)
>  at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
>  at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
>  at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
>  at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
>  at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
>  at org.apache.hadoop.hive.cli.control.CliAdapter$1$1.evaluate(CliAdapter.java:95)
>  at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>  at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
>  at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
>  at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365)
>  at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273)
>  at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
>  at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159)
>  at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:377)
>  at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:138)
>  at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:465)
>  at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:451)
> {code}
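>  
>  For illustration, a hypothetical sketch of the failure mode and a defensive variant; the operand layout and the guard below are assumptions, not the code on master. The rule's onMatch casts the matched node to HiveJdbcConverter, but the planner can hand it a HiveFilter instead, so either an instanceof check or a tighter operand pattern in the rule definition would avoid the ClassCastException.
> {code:java}
> // Hypothetical rule, not the real JDBCSplitFilterAboveJoinRule: it matches a
> // HiveFilter over an arbitrary input and only proceeds when that input really
> // is a HiveJdbcConverter, instead of casting unconditionally.
> import org.apache.calcite.plan.RelOptRule;
> import org.apache.calcite.plan.RelOptRuleCall;
> import org.apache.calcite.rel.RelNode;
> import org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.HiveFilter;
> import org.apache.hadoop.hive.ql.optimizer.calcite.reloperators.jdbc.HiveJdbcConverter;
>
> class SplitFilterGuardSketch extends RelOptRule {
>   SplitFilterGuardSketch() {
>     super(operand(HiveFilter.class, operand(RelNode.class, any())));
>   }
>
>   @Override
>   public void onMatch(RelOptRuleCall call) {
>     RelNode input = call.rel(1);               // second operand: the filter's input
>     if (!(input instanceof HiveJdbcConverter)) {
>       return;                                  // bail out instead of casting blindly
>     }
>     HiveJdbcConverter converter = (HiveJdbcConverter) input;
>     // ... split the filter above the converter ...
>   }
> }
> {code}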
>  
>  


