GitHub user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/6404#discussion_r31016417
--- Diff: sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/ScriptTransformation.scala ---
@@ -58,11 +58,12 @@ case class ScriptTransformation(
child.execute().mapPartitions { iter =>
val cmd = List("/bin/bash", "-c", script)
val builder = new ProcessBuilder(cmd)
+ builder.redirectErrorStream(true)
--- End diff ---
Isn't this pushing stderr into stdout? That mixes error output with the
program output.
I'm not arguing against consuming stderr, but maybe against pouring it into
stdout.
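
For illustration only (not part of the PR, and the bash command is just a placeholder): a minimal standalone sketch of the behavior being questioned. With `redirectErrorStream(true)`, the child's stderr is merged into its stdout, so a caller reading `getInputStream` sees error lines interleaved with program output.

```scala
import java.io.{BufferedReader, InputStreamReader}

// Standalone sketch: with redirectErrorStream(true) the child's stderr is
// merged into its stdout, so both lines below arrive on getInputStream.
val builder = new ProcessBuilder("/bin/bash", "-c", "echo normal-output; echo error-output 1>&2")
builder.redirectErrorStream(true)
val proc = builder.start()
val reader = new BufferedReader(new InputStreamReader(proc.getInputStream))
Iterator.continually(reader.readLine()).takeWhile(_ != null).foreach(println)
proc.waitFor()
```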
BTW, is this still an issue with `ProcessBuilder`? I know it was a problem
in Java back in the day: you had to consume the streams or face deadlock. I
imagine it still is, but I haven't written code against `ProcessBuilder` in a
long time.
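
A minimal sketch of the alternative hinted at above (again not the PR's code, and the command is a placeholder): keep stderr separate but drain it on a background thread, so the child cannot block once the OS pipe buffer for stderr fills up, while program output stays clean on stdout.

```scala
import java.io.{BufferedReader, InputStreamReader}

val proc = new ProcessBuilder("/bin/bash", "-c", "echo normal-output; echo error-output 1>&2").start()

// Drain stderr on a daemon thread so the child never blocks writing to it,
// and error lines stay out of the program's stdout.
val stderrDrainer = new Thread(new Runnable {
  override def run(): Unit = {
    val err = new BufferedReader(new InputStreamReader(proc.getErrorStream))
    Iterator.continually(err.readLine()).takeWhile(_ != null)
      .foreach(line => System.err.println(s"[child stderr] $line"))
  }
})
stderrDrainer.setDaemon(true)
stderrDrainer.start()

// Program output is read separately from the child's stdout.
val out = new BufferedReader(new InputStreamReader(proc.getInputStream))
Iterator.continually(out.readLine()).takeWhile(_ != null).foreach(println)
proc.waitFor()
```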