[ https://issues.apache.org/jira/browse/SQOOP-443?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jarek Jarcec Cecho updated SQOOP-443:
-------------------------------------

    Status: Patch Available  (was: Open)
    
> Calling sqoop with hive import is not working multiple times due to kept 
> output directory
> ------------------------------------------------------------------------------------------
>
>                 Key: SQOOP-443
>                 URL: https://issues.apache.org/jira/browse/SQOOP-443
>             Project: Sqoop
>          Issue Type: Improvement
>    Affects Versions: 1.4.0-incubating, 1.4.1-incubating
>            Reporter: Jarek Jarcec Cecho
>            Assignee: Jarek Jarcec Cecho
>            Priority: Minor
>         Attachments: SQOOP-443.patch
>
>
> Hive does not remove the input directory in all cases when executing the 
> "LOAD DATA" command. This input directory is actually Sqoop's export 
> directory. Because the directory is kept, running the same Sqoop command a 
> second time fails with the exception 
> "org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory 
> $table already exists".
> This issue can be easily overcome by removing the directory manually, but 
> that puts an unnecessary burden on users. It also complicates executing 
> saved jobs, since an additional script execution is needed. A sketch of the 
> manual workaround is shown below.
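> For illustration only, a minimal sketch of the manual workaround, assuming a 
> hypothetical table FOO, a placeholder MySQL connection string, and the 
> default import target directory under the user's HDFS home (none of these 
> are taken from the original report):
>
>     # remove the directory left behind by the previous hive-import run
>     hadoop fs -rmr /user/$USER/FOO
>     # re-run the same import; the FileAlreadyExistsException no longer occurs
>     sqoop import --connect jdbc:mysql://localhost/db --table FOO --hive-import
>
> The actual directory to remove depends on the options of the original import 
> (for example --target-dir or --warehouse-dir).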

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: 
https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira

        
