[ https://issues.apache.org/jira/browse/SQOOP-1333?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016607#comment-14016607 ]

William Watson commented on SQOOP-1333:
---------------------------------------

At the very least, the fact that it returns such a useless message is a bug, 
IMHO.
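
For context, an "Imported Failed: null" line is what you get when a tool logs only the exception's message and the exception carries no message at all (a bare NullPointerException, for instance). A minimal sketch of that failure mode, assuming hypothetical logging code rather than Sqoop's actual ImportTool source:

{code}
// Illustrative only; not Sqoop's actual source. Shows how logging
// just e.getMessage() on a message-less exception prints "null".
public class NullMessageDemo {
    public static void main(String[] args) {
        try {
            // A NullPointerException thrown with no message attached.
            throw new NullPointerException();
        } catch (Exception e) {
            // getMessage() returns null here, and the stack trace is lost.
            System.err.println("ERROR tool.ImportTool: Imported Failed: " + e.getMessage());
            // Logging the exception object itself would keep the trace, e.g.:
            // LOG.error("Imported Failed", e);
        }
    }
}
{code}

Logging the exception object (or at least e.toString(), which includes the exception class name) would make reports like this one actionable.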

> Sqoop Fails to Import from PostgreSQL to S3 with Confusing "Imported Failed: 
> null" exception
> --------------------------------------------------------------------------------------------
>
>                 Key: SQOOP-1333
>                 URL: https://issues.apache.org/jira/browse/SQOOP-1333
>             Project: Sqoop
>          Issue Type: Bug
>    Affects Versions: 1.4.4
>         Environment: CentOS 6, Hadoop 2, Sqoop 1.4.4
>            Reporter: William Watson
>
> I see some issues resolved with importing from MySQL to S3 (SQOOP-891), but I 
> can't find any information on the following command and error:
> {code}
> sqoop import -Dfs.defaultFS=s3n:// \
>   --connect "jdbc:postgresql://ip_address/cleanroom_matching?extra_options=-Dfs.defaultFS%3Ds3n%3A%2F%2F" \
>   --fields-terminated-by \\t \
>   --username [omitted] --password [omitted] \
>   --split-by cr_user_id \
>   --query "SELECT * FROM table WHERE (\$CONDITIONS)" \
>   --direct --delete-target-dir \
>   --target-dir 's3n://[omitted]:[omitted]@[omitted]/sqoop/' \
>   --verbose
> {code}
> {code}
> 14/06/03 09:49:53 ERROR tool.ImportTool: Imported Failed: null
> {code}
> That's it; there's no stack trace. The query works on its own, and if I 
> import to disk it works just fine. It's only when I change the target to S3 
> that it fails. It was originally failing because I hadn't set the default 
> file system; after I set it, I started getting this confusing error instead.


