[
https://issues.apache.org/jira/browse/SQOOP-475?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13402166#comment-13402166
]
Georgy B Abraham commented on SQOOP-475:
----------------------------------------
@Jarek, I am a colleague of Afsal.
We were trying to import a table from a MySQL DB into Hive by writing the code
in Java. The program could fetch the data from MySQL, and the Java file
generated by Sqoop is also stored in the Eclipse project's package, but the
program could not write the table into Hive.
When using the command line everything is OK, but when we use Java it shows the
error mentioned above.
The bit of code we used:
import com.cloudera.sqoop.SqoopOptions;
import com.cloudera.sqoop.tool.ImportTool;

@SuppressWarnings("deprecation")
SqoopOptions options = new SqoopOptions();
options.setConnectString("jdbc:mysql://localhost/sebin");
options.setTableName("afsal");
options.setUsername("root");
options.setPassword("6397");
options.setHadoopHome("/home/556940/hadoop-0.20.2-cdh3u2/bin");
System.out.println(options.getHadoopHome());
options.setNumMappers(1);
options.setClearStagingTable(true);
//options.setHiveImport(true);
String targetDir = "afsal";
//options.setTargetDir("hdfs://localhost:9000/user/hive/warehouse/" + targetDir);
options.setHiveHome("/home/556940/hive-0.7.1/");
// Prints false: setHiveImport(true) is commented out above, so no Hive
// import is requested and the data stays on HDFS.
System.out.println(options.doHiveImport());
System.out.println(options.getConf());
new ImportTool().run(options);
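Since the import works from the command line, one workaround is to drive Sqoop from Java with exactly the argument vector the CLI would receive, via the static Sqoop.runTool(String[]) entry point, instead of populating SqoopOptions by hand. A minimal sketch, reusing the connect string, credentials, and table name from the snippet above; the runTool call itself is left commented because it needs the Sqoop jar and its dependencies on the classpath:

```java
public class SqoopImportDriver {

    // The same tokens the working `sqoop import` command line passes in.
    static String[] sqoopArgs() {
        return new String[] {
            "--connect", "jdbc:mysql://localhost/sebin",
            "--username", "root",
            "--password", "6397",
            "--table", "afsal",
            "--hive-import",
            "--num-mappers", "1"
        };
    }

    public static void main(String[] args) {
        // Echo the argument vector so it can be compared with the CLI invocation.
        for (String a : sqoopArgs()) {
            System.out.println(a);
        }
        // With sqoop-1.4.x and its dependencies on the classpath, this runs the
        // same code path as the `sqoop import` command:
        // int ret = com.cloudera.sqoop.Sqoop.runTool(sqoopArgs());
        // System.exit(ret);
    }
}
```

Because runTool parses the arguments with the same tool machinery as the shell command, flags such as --hive-import behave identically, which avoids the mismatch between a hand-built SqoopOptions and what the CLI sets up.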
> Unable to import into external hive table located on S3
> -------------------------------------------------------
>
> Key: SQOOP-475
> URL: https://issues.apache.org/jira/browse/SQOOP-475
> Project: Sqoop
> Issue Type: Bug
> Components: hive-integration
> Affects Versions: 1.4.1-incubating
> Environment: Amazon EMR
> Hadoop 0.20.205
> Hive 0.7.1
> Sqoop 1.4.1-incubating
> Reporter: Porati Sébastien
>
> When i try to import into an hive table located on an S3 bucket, i got the
> following error message :
> FAILED: Error in semantic analysis: Line 2:17 Path is not legal
> 'hdfs://10.48.189.XX:9000/user/hadoop/client': Move from:
> hdfs://10.48.189.XX:9000/user/hadoop/client to:
> s3://some-bucket/sqoop-test/hive/client is not valid. Please check that
> values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not
> conflict.
> Hive table creation Script :
> CREATE DATABASE IF NOT EXISTS sqoop_test;
> USE sqoop_test;
> CREATE EXTERNAL TABLE IF NOT EXISTS client (
> id INT,
> email STRING,
> cookie_uid STRING,
> is_blacklisted TINYINT
> )
> LOCATION 's3://some-bucket/sqoop-test/hive/client';
> Sqoop command :
> sqoop import --connect jdbc:mysql://my.domain.com/mydb --username myuser
> --password XXXX --table client --hive-import --hive-overwrite --hive-table
> sqoop_test.client
--
This message is automatically generated by JIRA.