[ https://issues.apache.org/jira/browse/SQOOP-934?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13687003#comment-13687003 ]

Venkat Ranganathan commented on SQOOP-934:
------------------------------------------

Raghav, can you please create a Review Board link? Also, catching any 
exception thrown by close() is a good idea: if the connection is stale, 
guaranteeing that we null out the connection variable will force the next 
getConnection() to return a valid connection.
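
For illustration only, here is a minimal, hypothetical sketch of that idea 
(not Sqoop's actual ConnManager code; the class and field names are made up): 
discardConnection() swallows the exception a stale connection may throw from 
close() and always nulls out the cached field, so the next getConnection() 
is guaranteed to open a fresh connection.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;

    public class ReconnectingHolder {
      private final String url;       // JDBC connect string (hypothetical)
      private Connection connection;  // cached connection, may go stale

      public ReconnectingHolder(String url) {
        this.url = url;
      }

      public synchronized Connection getConnection() throws SQLException {
        if (connection == null) {
          connection = DriverManager.getConnection(url);
        }
        return connection;
      }

      public synchronized void discardConnection() {
        if (connection != null) {
          try {
            connection.close();
          } catch (SQLException e) {
            // The connection may already be stale or broken; ignore the
            // failure so we still null out the field below.
          } finally {
            connection = null;  // next getConnection() must reconnect
          }
        }
      }
    }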

                
> JDBC Connection can timeout after import but before hive import
> ---------------------------------------------------------------
>
>                 Key: SQOOP-934
>                 URL: https://issues.apache.org/jira/browse/SQOOP-934
>             Project: Sqoop
>          Issue Type: Improvement
>    Affects Versions: 1.4.2
>            Reporter: Jarek Jarcec Cecho
>            Assignee: Raghav Kumar Gautam
>         Attachments: SQOOP-934.patch
>
>
> Our current [import 
> routine|https://github.com/apache/sqoop/blob/trunk/src/java/org/apache/sqoop/tool/ImportTool.java#L385]
>  imports data into HDFS and then tries to do the Hive import. As the connection 
> to the remote server is opened only once at the beginning, it might time out 
> during a very long MapReduce job. I believe that we should ensure that the 
> connection is still valid before performing the Hive import (a sketch of such 
> a check follows below).
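
As a hedged illustration of that check (again hypothetical, not the actual 
ImportTool code), the cached connection could be re-validated with JDBC 
4.0's Connection.isValid() after the MapReduce job finishes and re-opened if 
it has gone stale. ReconnectingHolder here is the made-up helper from the 
sketch above.

    import java.sql.Connection;
    import java.sql.SQLException;

    public class HiveImportPreCheck {
      // Re-validate the connection after the long MapReduce job, just
      // before starting the Hive import.
      static Connection freshConnection(ReconnectingHolder holder)
          throws SQLException {
        Connection conn = holder.getConnection();
        // isValid() returns false if the connection is closed or the
        // validation round-trip does not finish within the timeout (seconds).
        if (!conn.isValid(10)) {
          holder.discardConnection();    // drop the stale handle
          conn = holder.getConnection(); // open a fresh connection
        }
        return conn;
      }
    }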
