[
https://issues.apache.org/jira/browse/HIVE-13657?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15271776#comment-15271776
]
Mohit Sabharwal commented on HIVE-13657:
----------------------------------------
Thanks, [~szehon]. In my tests, the stderr simply contained the thrown
exception. However, you're right that the stderr may contain irrelevant info
as well as too much of it. For the latter, I've limited the number of lines
sent back. For the former, I don't see any simple way of filtering out
irrelevant info. As a follow-up item, we can explore capturing the Spark
driver log4j logs for this query and using verbosity levels like we do for MR
in HIVE-10119.
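For the line-limiting part, here's a rough sketch of the idea (a hedged
illustration only; names like DriverStderrBuffer and MAX_STDERR_LINES are
hypothetical and not taken from the actual patch):
{code}
import java.io.BufferedReader;
import java.io.IOException;
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical helper: keeps only the last N stderr lines from the
// Spark driver child process so the error surfaced to the client
// stays bounded in size.
class DriverStderrBuffer {
  private static final int MAX_STDERR_LINES = 100; // assumed cap

  private final Deque<String> lines = new ArrayDeque<>();

  synchronized void add(String line) {
    if (lines.size() >= MAX_STDERR_LINES) {
      lines.removeFirst(); // evict the oldest line once the cap is hit
    }
    lines.addLast(line);
  }

  // Drain the child process stderr (typically on a redirector thread).
  void redirect(BufferedReader stderr) throws IOException {
    String line;
    while ((line = stderr.readLine()) != null) {
      add(line);
    }
  }

  // Bounded tail of the driver stderr, suitable for appending to the
  // error message sent back to the client.
  synchronized String asMessage() {
    return String.join("\n", lines);
  }
}
{code}
The point is simply to keep a bounded tail of the most recent driver stderr
lines so whatever is surfaced to the client stays small.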
> Spark driver stderr logs should appear in hive client logs
> ----------------------------------------------------------
>
> Key: HIVE-13657
> URL: https://issues.apache.org/jira/browse/HIVE-13657
> Project: Hive
> Issue Type: Bug
> Reporter: Mohit Sabharwal
> Assignee: Mohit Sabharwal
> Attachments: HIVE-13657.patch
>
>
> Currently, Spark driver exceptions are not getting logged in beeline.
> Instead, the user sees the not-so-useful:
> {code}
> ERROR : Failed to execute spark task, with exception
> 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark
> client.)'
> <huge stack trace omitted>
> {code}
> The user has to look at HS2 logs to discover the root cause:
> {code}
> 2015-04-01 11:33:16,048 INFO org.apache.hive.spark.client.SparkClientImpl:
> 15/04/01 11:33:16 WARN UserGroupInformation: PriviledgedActionException
> as:foo (auth:PROXY) via hive (auth:SIMPLE)
> cause:org.apache.hadoop.security.AccessControlException: Permission denied:
> user=foo, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
> ...
> {code}
> We should surface these critical errors in the Hive client.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)