[ https://issues.apache.org/jira/browse/HIVE-14714?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15503061#comment-15503061 ]
Rui Li commented on HIVE-14714:
-------------------------------

bq. These threads are running in HS2 therefore, they won't be terminated in case of beeline is closed.

Yeah, but if we use the CLI, these threads run in the CLI, so we may lose some output from spark-submit after the CLI exits. Thinking more about this, I think the problem is specific to yarn-cluster mode, right? In yarn-client mode, RemoteDriver runs inside spark-submit, so it should shut down properly. In yarn-cluster mode, spark-submit is just a monitor for the Spark app, so it may be acceptable to lose some of its output. On the other hand, the user can set {{spark.yarn.submit.waitAppCompletion=false}} so that spark-submit exits as soon as the app is submitted, which avoids this hanging issue. HIVE-13895 actually made this the default. I wonder whether that is enough to address the issue.

> Finishing Hive on Spark causes "java.io.IOException: Stream closed"
> -------------------------------------------------------------------
>
>                 Key: HIVE-14714
>                 URL: https://issues.apache.org/jira/browse/HIVE-14714
>             Project: Hive
>          Issue Type: Bug
>          Components: HiveServer2
>    Affects Versions: 1.1.0
>            Reporter: Gabor Szadovszky
>            Assignee: Gabor Szadovszky
>         Attachments: HIVE-14714.2.patch, HIVE-14714.patch
>
>
> After executing a Hive command on the Spark engine, finishing the beeline session, or even switching the engine, causes an IOException. The log below was produced after pressing Ctrl-D to finish the session, but "!quit" or even "set hive.execution.engine=mr;" triggers the same issue.
> From the HS2 log:
> {code}
> 2016-09-06 16:15:12,291 WARN org.apache.hive.spark.client.SparkClientImpl: [HiveServer2-Handler-Pool: Thread-106]: Timed out shutting down remote driver, interrupting...
> 2016-09-06 16:15:12,291 WARN org.apache.hive.spark.client.SparkClientImpl: [Driver]: Waiting thread interrupted, killing child process.
> 2016-09-06 16:15:12,296 WARN org.apache.hive.spark.client.SparkClientImpl: [stderr-redir-1]: Error in redirector thread.
> java.io.IOException: Stream closed
> 	at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
> 	at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
> 	at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
> 	at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:283)
> 	at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:325)
> 	at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:177)
> 	at java.io.InputStreamReader.read(InputStreamReader.java:184)
> 	at java.io.BufferedReader.fill(BufferedReader.java:154)
> 	at java.io.BufferedReader.readLine(BufferedReader.java:317)
> 	at java.io.BufferedReader.readLine(BufferedReader.java:382)
> 	at org.apache.hive.spark.client.SparkClientImpl$Redirector.run(SparkClientImpl.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
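To illustrate the failure mode in the stack trace: a redirector thread blocks in {{BufferedReader.readLine()}} on the child process's stderr, and when the child is killed during shutdown the stream is closed under it, so the pending read surfaces as {{IOException: Stream closed}}. The sketch below is a hypothetical, simplified redirector (not Hive's actual {{SparkClientImpl$Redirector}} code); the {{closed}} flag and class name are assumptions, showing one way to treat the exception as expected during a deliberate shutdown instead of logging it as an error:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// Hypothetical sketch of a child-output redirector thread.
public class Redirector implements Runnable {
    private final BufferedReader in;
    // Set before the child process is killed intentionally (assumption:
    // the shutdown path calls close() before destroying the process).
    private volatile boolean closing;

    public Redirector(InputStream childOutput) {
        this.in = new BufferedReader(new InputStreamReader(childOutput));
    }

    public void close() {
        closing = true;
    }

    @Override
    public void run() {
        try {
            String line;
            // Blocks in readLine(); this is where the real stack trace
            // originates when the underlying stream is closed.
            while ((line = in.readLine()) != null) {
                System.out.println(line); // forward child output to the log
            }
        } catch (IOException e) {
            if (!closing) {
                // Unexpected failure while the child is still supposed to run.
                System.err.println("Error in redirector thread: " + e);
            }
            // Otherwise the stream was closed deliberately during shutdown;
            // stay quiet instead of warning about "Stream closed".
        }
    }

    public static void main(String[] args) throws Exception {
        Redirector r = new Redirector(
                new ByteArrayInputStream("child output line\n".getBytes()));
        Thread t = new Thread(r, "stderr-redir-1");
        t.start();
        t.join();
    }
}
```

This only silences the symptom on the reading side; the comment above argues the cleaner fix in yarn-cluster mode is letting spark-submit exit early via {{spark.yarn.submit.waitAppCompletion=false}}, so there is no long-lived monitor process left holding the streams.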