[
https://issues.apache.org/jira/browse/HIVE-869?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12764268#action_12764268
]
Zheng Shao commented on HIVE-869:
---------------------------------
@HIVE-869.3.patch:
+1. Code looks good. Will correct the following indentation and commit.
{code}
+      if (isBrokenPipeException(e) && allowPartialConsumption()) {
+        setDone(true);
+        LOG.warn("Got broken pipe during write: ignoring exception and "
+            + "setting operator to done");
+      } else {
+        LOG.error("Error in writing to script: " + e.getMessage());
+        if (isBrokenPipeException(e)) {
+          displayBrokenPipeInfo();
+        }
+        scriptError = e;
+        throw new HiveException(e);
+      }
{code}
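For context, here is a minimal sketch of how the two helpers referenced above might look. The method names (isBrokenPipeException, allowPartialConsumption) and the hconf field are assumptions taken from the patch excerpt and the issue description; this is an illustration, not the committed patch.
{code}
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;

// Illustrative stand-in for helpers that would live in ScriptOperator.
public class PartialConsumptionSketch {
  private final Configuration hconf;

  public PartialConsumptionSketch(Configuration hconf) {
    this.hconf = hconf;
  }

  // A script that exits before reading all of its input makes further writes
  // to its stdin fail with a "Broken pipe" IOException on most platforms.
  boolean isBrokenPipeException(IOException e) {
    String msg = e.getMessage();
    return msg != null && msg.toLowerCase().contains("broken pipe");
  }

  // Reads the option named in the issue description below; defaults to false,
  // i.e. the current behavior of reporting an error on partial consumption.
  boolean allowPartialConsumption() {
    return hconf.getBoolean("hive.exec.script.allow.partial.consumption", false);
  }
}
{code}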
> Allow ScriptOperator to consume not all input data
> --------------------------------------------------
>
> Key: HIVE-869
> URL: https://issues.apache.org/jira/browse/HIVE-869
> Project: Hadoop Hive
> Issue Type: New Feature
> Affects Versions: 0.5.0
> Reporter: Zheng Shao
> Assignee: Paul Yang
> Fix For: 0.5.0
>
> Attachments: HIVE-869.1.patch, HIVE-869.2.patch, HIVE-869.3.patch
>
>
> The ScriptOperator (SELECT TRANSFORM(a, b, c) USING 'myscript' AS (d, e, f)
> ...) has a problem:
> If the user script exits without consuming all data from standard input, then
> we will report an error even if the exit code from the user script is 0.
> We want to add an option so that, when it is enabled, ScriptOperator returns
> successfully in that case.
> If the option is not enabled, we should keep the current behavior.
> The option can be called "hive.exec.script.allow.partial.consumption".
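> For illustration, a hypothetical query that would exercise this option
> (assuming a table named src): 'head -1' exits with status 0 after reading
> only the first row, so the write side sees a broken pipe.
> {code}
> SET hive.exec.script.allow.partial.consumption=true;
> SELECT TRANSFORM(a, b, c) USING 'head -1' AS (d, e, f) FROM src;
> {code}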