Repository: spark
Updated Branches:
  refs/heads/master ec79183ac -> 5f342049c


[SPARK-16339][CORE] ScriptTransform does not print stderr when outstream is lost

## What changes were proposed in this pull request?

Currently, if the outstream gets destroyed or closed due to some failure, a later 
call to `outstream.close()` throws an IOException. When that happens, the 
`stderrBuffer` never gets logged, so there is no way for users to see why the 
job failed.

The change is to guard the `outstream.close()` call so that the stderr buffer is 
still displayed even if closing the outstream fails.
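For illustration, here is a minimal sketch of the guarded-cleanup pattern, assuming a 
child `proc`, an `outputStream` feeding it, and a circular `stderrBuffer` as in 
`ScriptTransformation`; the local `tryLogNonFatalError` is only a stand-in for Spark's 
`Utils.tryLogNonFatalError`, and `CleanupSketch`/`cleanup` are hypothetical names:

```scala
import scala.util.control.NonFatal

object CleanupSketch {
  // Stand-in for org.apache.spark.util.Utils.tryLogNonFatalError:
  // run the block and log (instead of rethrowing) any non-fatal error.
  def tryLogNonFatalError(block: => Unit): Unit = {
    try {
      block
    } catch {
      case NonFatal(t) => println(s"Exception during cleanup: $t")
    }
  }

  // Hypothetical cleanup step mirroring the writer thread's finally block.
  def cleanup(proc: Process, outputStream: java.io.OutputStream, stderrBuffer: StringBuilder): Unit = {
    // Closing a stream whose other end has died may throw IOException;
    // guarding the close keeps the stderr logging below reachable.
    tryLogNonFatalError(outputStream.close())
    if (proc.waitFor() != 0) {
      // The stderr circular buffer is logged even when close() failed.
      println(stderrBuffer.toString)
    }
  }
}
```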

## How was this patch tested?

The correct way to test this fix would be to grep the logs to check that the 
`stderrBuffer` gets logged, but I don't think test cases that do that are a 
good idea.

…

Author: Tejas Patil <tej...@fb.com>

Closes #13834 from tejasapatil/script_transform.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/5f342049
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/5f342049
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/5f342049

Branch: refs/heads/master
Commit: 5f342049cce9102fb62b4de2d8d8fa691c2e8ac4
Parents: ec79183
Author: Tejas Patil <tej...@fb.com>
Authored: Wed Jul 6 09:18:04 2016 +0100
Committer: Sean Owen <so...@cloudera.com>
Committed: Wed Jul 6 09:18:04 2016 +0100

----------------------------------------------------------------------
 .../spark/sql/hive/execution/ScriptTransformation.scala      | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/5f342049/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
----------------------------------------------------------------------
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
index 84990d3..d063dd6 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala
@@ -314,15 +314,15 @@ private class ScriptTransformationWriterThread(
       }
       threwException = false
     } catch {
-      case NonFatal(e) =>
+      case t: Throwable =>
        // An error occurred while writing input, so kill the child process. According to the
        // Javadoc this call will not throw an exception:
-        _exception = e
+        _exception = t
         proc.destroy()
-        throw e
+        throw t
     } finally {
       try {
-        outputStream.close()
+        Utils.tryLogNonFatalError(outputStream.close())
         if (proc.waitFor() != 0) {
           logError(stderrBuffer.toString) // log the stderr circular buffer
         }
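
Two things change in this hunk: the catch clause widens from `NonFatal(e)` to 
`Throwable`, so `_exception` is recorded and the child process is destroyed for fatal 
errors as well, and `outputStream.close()` is wrapped in `Utils.tryLogNonFatalError`, 
so a failing close no longer prevents the stderr circular buffer from being logged.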


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
