[jira] [Updated] (HIVE-19053) RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors

2018-06-08 Thread Aihua Xu (JIRA)


 [ https://issues.apache.org/jira/browse/HIVE-19053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aihua Xu updated HIVE-19053:

       Resolution: Fixed
    Fix Version/s: 4.0.0
           Status: Resolved  (was: Patch Available)

Pushed to master. Thanks [~stakiar] for reviewing the code.

> RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors
> 
>
>         Key: HIVE-19053
>         URL: https://issues.apache.org/jira/browse/HIVE-19053
>     Project: Hive
>  Issue Type: Sub-task
>  Components: Spark
>    Reporter: Sahil Takiar
>    Assignee: Aihua Xu
>    Priority: Major
>     Fix For: 4.0.0
>
> Attachments: HIVE-19053.1.patch, HIVE-19053.2.patch
>
>
> {code}
> Future<SparkJobInfo> getJobInfo = sparkClient.run(
>     new GetJobInfoJob(jobHandle.getClientJobId(), sparkJobId));
> try {
>   return getJobInfo.get(sparkClientTimeoutInSeconds, TimeUnit.SECONDS);
> } catch (Exception e) {
>   LOG.warn("Failed to get job info.", e);
>   throw new HiveException(e, ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT,
>       Long.toString(sparkClientTimeoutInSeconds));
> }
> {code}
> It should only throw {{ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT}} if a 
> {{TimeoutException}} is thrown. Other exceptions should be handled 
> independently.





[jira] [Updated] (HIVE-19053) RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors

2018-06-06 Thread Aihua Xu (JIRA)


 [ https://issues.apache.org/jira/browse/HIVE-19053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aihua Xu updated HIVE-19053:

Attachment: HIVE-19053.2.patch



[jira] [Updated] (HIVE-19053) RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors

2018-06-06 Thread Aihua Xu (JIRA)


 [ https://issues.apache.org/jira/browse/HIVE-19053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aihua Xu updated HIVE-19053:

Attachment: (was: HIVE-19053.2.patch)



[jira] [Updated] (HIVE-19053) RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors

2018-06-05 Thread Aihua Xu (JIRA)


 [ https://issues.apache.org/jira/browse/HIVE-19053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aihua Xu updated HIVE-19053:

Attachment: HIVE-19053.2.patch



[jira] [Updated] (HIVE-19053) RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors

2018-06-04 Thread Aihua Xu (JIRA)


 [ https://issues.apache.org/jira/browse/HIVE-19053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aihua Xu updated HIVE-19053:

Status: Patch Available  (was: Open)

patch-1: a simple improvement to handle {{InterruptedException}} and
{{ExecutionException}} separately and throw different errors for each.

[~stakiar], could you take a look and see if it makes sense?
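
For illustration, here is a minimal, standalone sketch of the exception handling the patch is aiming for, with plain {{RuntimeException}}s standing in for Hive's {{HiveException}}/{{ErrorMsg}} wrapping (the concrete error codes introduced by the patch may be named differently): only a {{TimeoutException}} is reported as a timeout, while interruption and execution failures surface as their own errors.

{code}
import java.util.concurrent.*;

// Standalone illustration only: RuntimeException stands in for the
// HiveException/ErrorMsg wrapping used in RemoteSparkJobStatus.
public class GetJobInfoSketch {
  public static void main(String[] args) {
    ExecutorService executor = Executors.newSingleThreadExecutor();
    long timeoutInSeconds = 1;

    // Stand-in for sparkClient.run(new GetJobInfoJob(...)).
    Future<String> getJobInfo = executor.submit(() -> {
      Thread.sleep(5_000);  // simulate a slow remote call so the timeout fires
      return "job-info";
    });

    try {
      System.out.println(getJobInfo.get(timeoutInSeconds, TimeUnit.SECONDS));
    } catch (TimeoutException e) {
      // Only this case should map to SPARK_GET_JOB_INFO_TIMEOUT.
      throw new RuntimeException(
          "Timed out after " + timeoutInSeconds + "s waiting for job info", e);
    } catch (InterruptedException e) {
      // Restore the interrupt flag and report interruption as its own error.
      Thread.currentThread().interrupt();
      throw new RuntimeException("Interrupted while getting job info", e);
    } catch (ExecutionException e) {
      // The remote call itself failed; report the underlying cause, not a timeout.
      throw new RuntimeException("Failed to get job info", e.getCause());
    } finally {
      executor.shutdownNow();
    }
  }
}
{code}

In {{RemoteSparkJobStatus#getSparkJobInfo}} itself, each of the three catch blocks would presumably wrap the cause in a {{HiveException}} carrying its own {{ErrorMsg}} code.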



[jira] [Updated] (HIVE-19053) RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors

2018-06-04 Thread Aihua Xu (JIRA)


 [ https://issues.apache.org/jira/browse/HIVE-19053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aihua Xu updated HIVE-19053:

Attachment: HIVE-19053.1.patch



[jira] [Updated] (HIVE-19053) RemoteSparkJobStatus#getSparkJobInfo treats all exceptions as timeout errors

2018-03-26 Thread Sahil Takiar (JIRA)

 [ https://issues.apache.org/jira/browse/HIVE-19053?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sahil Takiar updated HIVE-19053:

Description: 
{code}
Future<SparkJobInfo> getJobInfo = sparkClient.run(
    new GetJobInfoJob(jobHandle.getClientJobId(), sparkJobId));
try {
  return getJobInfo.get(sparkClientTimeoutInSeconds, TimeUnit.SECONDS);
} catch (Exception e) {
  LOG.warn("Failed to get job info.", e);
  throw new HiveException(e, ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT,
      Long.toString(sparkClientTimeoutInSeconds));
}
{code}

It should only throw {{ErrorMsg.SPARK_GET_JOB_INFO_TIMEOUT}} if a 
{{TimeoutException}} is thrown. Other exceptions should be handled 
independently.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)