[ https://issues.apache.org/jira/browse/HIVE-9306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14269409#comment-14269409 ]

Xuefu Zhang commented on HIVE-9306:
-----------------------------------

The fs_default_name2.q output needs to be updated so that it is consistent with 
trunk.

skewjoinopt5.q failed due to the error below, which shouldn't be related to the 
changes here. From hive.log:
{code}
2015-01-07 22:05:46,510 INFO  [main]: impl.RemoteSparkJobStatus (RemoteSparkJobStatus.java:getSparkJobInfo(143)) - Job hasn't been submitted after 30s. Aborting it.
2015-01-07 22:05:46,511 ERROR [main]: exec.Task (SessionState.java:printError(839)) - Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'
java.lang.IllegalStateException: RPC channel is closed.
        at com.google.common.base.Preconditions.checkState(Preconditions.java:149)
        at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:264)
        at org.apache.hive.spark.client.rpc.Rpc.call(Rpc.java:251)
        at org.apache.hive.spark.client.SparkClientImpl$ClientProtocol.cancel(SparkClientImpl.java:375)
        at org.apache.hive.spark.client.SparkClientImpl.cancel(SparkClientImpl.java:159)
        at org.apache.hive.spark.client.JobHandleImpl.cancel(JobHandleImpl.java:59)
        at org.apache.hadoop.hive.ql.exec.spark.status.impl.RemoteSparkJobStatus.getSparkJobInfo(RemoteSparkJobStatus.java:144)
        at org.apache.hadoop.hive.ql.exec.spark.status.impl.RemoteSparkJobStatus.getState(RemoteSparkJobStatus.java:75)
        at org.apache.hadoop.hive.ql.exec.spark.status.SparkJobMonitor.startMonitor(SparkJobMonitor.java:72)
        at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:108)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1634)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1393)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1179)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1045)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1035)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:206)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:158)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:369)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:304)
        at org.apache.hadoop.hive.ql.QTestUtil.executeClient(QTestUtil.java:880)
        at org.apache.hadoop.hive.cli.TestSparkCliDriver.runTest(TestSparkCliDriver.java:234)
        at org.apache.hadoop.hive.cli.TestSparkCliDriver.testCliDriver_skewjoinopt5(TestSparkCliDriver.java:206)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at junit.framework.TestCase.runTest(TestCase.java:176)
        at junit.framework.TestCase.runBare(TestCase.java:141)
        at junit.framework.TestResult$1.protect(TestResult.java:122)
        at junit.framework.TestResult.runProtected(TestResult.java:142)
        at junit.framework.TestResult.run(TestResult.java:125)
        at junit.framework.TestCase.run(TestCase.java:129)
        at junit.framework.TestSuite.runTest(TestSuite.java:255)
        at junit.framework.TestSuite.run(TestSuite.java:250)
        at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
        at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
        at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
        at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
        at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
        at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
        at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
{code}
This might be transient, but we need to address it if the problem persists.
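
For reference, the IllegalStateException is raised by a Guava Preconditions.checkState guard inside Rpc.call: after the 30s submission timeout the client tried to cancel the job over an RPC channel that had already been closed. Below is a minimal, self-contained sketch of that failure mode (not the actual Rpc implementation; the channelOpen flag is hypothetical):
{code}
import com.google.common.base.Preconditions;

// Sketch only: shows how a Preconditions.checkState guard produces the
// "RPC channel is closed." IllegalStateException seen in the stack trace above.
public class RpcGuardSketch {
    private volatile boolean channelOpen = false; // hypothetical channel state

    public void call(Object msg) {
        // Throws IllegalStateException with the given message when the state is false.
        Preconditions.checkState(channelOpen, "RPC channel is closed.");
        // ... would otherwise write msg to the channel ...
    }

    public static void main(String[] args) {
        new RpcGuardSketch().call("cancel job"); // -> IllegalStateException
    }
}
{code}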

> Let Context.isLocalOnlyExecutionMode() return false if execution engine is 
> Spark [Spark Branch]
> -----------------------------------------------------------------------------------------------
>
>                 Key: HIVE-9306
>                 URL: https://issues.apache.org/jira/browse/HIVE-9306
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Xuefu Zhang
>            Assignee: Xuefu Zhang
>         Attachments: HIVE-9306.1-spark.patch
>
>
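
For context, the change tracked by this issue (making Context.isLocalOnlyExecutionMode() return false when the execution engine is Spark) amounts to a short-circuit on the configured engine. A minimal, self-contained sketch of that behavior, not the attached HIVE-9306.1-spark.patch; the field names here are hypothetical:
{code}
// Sketch only: Spark jobs run through the remote Spark client, so local-only
// execution mode should be reported as false whenever the engine is "spark",
// regardless of what the Hadoop-level local-mode check would say.
public class LocalOnlyModeSketch {
    private final String executionEngine;   // hypothetical: "mr", "tez", or "spark"
    private final boolean hadoopLocalMode;  // hypothetical stand-in for the Hadoop local-mode check

    public LocalOnlyModeSketch(String executionEngine, boolean hadoopLocalMode) {
        this.executionEngine = executionEngine;
        this.hadoopLocalMode = hadoopLocalMode;
    }

    public boolean isLocalOnlyExecutionMode() {
        if ("spark".equalsIgnoreCase(executionEngine)) {
            return false; // never treat Spark execution as local-only
        }
        return hadoopLocalMode;
    }

    public static void main(String[] args) {
        System.out.println(new LocalOnlyModeSketch("spark", true).isLocalOnlyExecutionMode()); // false
        System.out.println(new LocalOnlyModeSketch("mr", true).isLocalOnlyExecutionMode());    // true
    }
}
{code}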



