-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/51895/#review149035
-----------------------------------------------------------



LGTM, provided you take care of my open issues/questions below.


spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java 
(line 662)
<https://reviews.apache.org/r/51895/#comment216573>

    What would happen if the child process is killed while we are inside this 
while loop (so after the BufferedReader#ready check)? Wouldn't we get a stream 
closed exception on line 674?
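
    A minimal sketch of the race I have in mind (illustrative names only, not 
the actual patch code):

        import java.io.BufferedReader;
        import java.io.IOException;

        // Illustrative sketch only: the child process may die between
        // ready() and readLine().
        class RedirectorRaceSketch {
            static void drain(BufferedReader in) {
                try {
                    while (true) {
                        if (in.ready()) {
                            // If the child is killed right here, the underlying
                            // stream is closed and the next read can throw
                            // "java.io.IOException: Stream closed".
                            String line = in.readLine();
                            if (line == null) {
                                return; // EOF: the child finished normally
                            }
                            System.out.println(line); // stand-in for logging the line
                        }
                    }
                } catch (IOException e) {
                    // This is the failure I suspect we could still hit.
                    System.err.println("Redirector failed: " + e.getMessage());
                }
            }
        }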



spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java 
(line 676)
<https://reviews.apache.org/r/51895/#comment216576>

    Since we have two redirectors, maybe also log which one we are in.
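
    Something along these lines (illustrative only, assuming the redirector is 
handed a name such as "stdout" or "stderr" when it is created):

        // Illustrative sketch, not the actual patch code.
        class NamedRedirector implements Runnable {
            private final String name; // e.g. "stdout" or "stderr"

            NamedRedirector(String name) {
                this.name = name;
            }

            @Override
            public void run() {
                try {
                    // ... redirect loop ...
                } catch (Exception e) {
                    // Including the name tells us which redirector failed.
                    System.err.println("Error in " + name + " redirector: " + e);
                }
            }
        }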



spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java 
(line 684)
<https://reviews.apache.org/r/51895/#comment216577>

    Since we have two redirectors, maybe also log which one we are in.



spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java 
(line 697)
<https://reviews.apache.org/r/51895/#comment216575>

    Wouldn't lineBuilder.indexOf(String.valueOf('\n')) work as well?
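
    For illustration, what I mean (a standalone snippet, not the patch code):

        // Illustrative sketch: splitting the buffered output at the next newline.
        class NewlineSplitSketch {
            public static void main(String[] args) {
                StringBuilder lineBuilder = new StringBuilder("first line\nrest");
                int newlineAt = lineBuilder.indexOf(String.valueOf('\n')); // same as indexOf("\n")
                if (newlineAt >= 0) {
                    String line = lineBuilder.substring(0, newlineAt);
                    lineBuilder.delete(0, newlineAt + 1); // drop the line and the '\n'
                    System.out.println(line); // prints "first line"
                }
            }
        }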


- Barna Zsombor Klara


On Sept. 14, 2016, 4:54 p.m., Gabor Szadovszky wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/51895/
> -----------------------------------------------------------
> 
> (Updated Sept. 14, 2016, 4:54 p.m.)
> 
> 
> Review request for hive, Chaoyu Tang, Naveen Gangam, and Barna Zsombor Klara.
> 
> 
> Repository: hive-git
> 
> 
> Description
> -------
> 
> HIVE-14714 - Finishing Hive on Spark causes "java.io.IOException: Stream 
> closed"
> 
> 
> Diffs
> -----
> 
>   
> spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java 
> e8ca42aa22f0b312e009bea19e39adc8bd31e2b4 
> 
> Diff: https://reviews.apache.org/r/51895/diff/
> 
> 
> Testing
> -------
> 
> As the modification is related to logging and the Spark job submission, 
> creating unit tests would require too much effort.
> 
> Tested manually by "hijacking" the $SPARK_HOME/bin/spark-submit script to 
> reproduce the following scenarios:
> - The submit process does not exit after the RemoteDriver has stopped
>   - Generating output for less time than the actual redirector timeout
>   - Generating output for more time than the actual redirector timeout
> - The submit process exits properly after the RemoteDriver has stopped
> 
> Expected behavior: after ending the current session, the client (Beeline) 
> exits immediately. All stdout/stderr of the RemoteDriver is captured properly 
> in hive.log until the redirector timeout.
> 
> 
> Thanks,
> 
> Gabor Szadovszky
> 
>
