[ https://issues.apache.org/jira/browse/SQOOP-475?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13416458#comment-13416458 ]

sachin pawar commented on SQOOP-475:
------------------------------------

Hi,

I am also facing the same issue. Can you please help? Let me know if you need
more info.

I am running the following code:
                // Imports used by this snippet:
                // import org.apache.hadoop.util.ToolRunner;
                // import org.apache.sqoop.Sqoop;
                // import org.apache.sqoop.tool.ImportTool;

                String[] sqoopArgs = new String[] {
                        "-D", "fs.default.name=hdfs://<hostname>:8020/",
                        "-D", "mapred.job.tracker=<hostname>:8021",
                        "-D", "hive.metastore.warehouse.dir=hdfs://<hostname>:8020/user/hive/warehouse",
                        "-D", "hadoop.security.authorization=false",
                        "-D", "dfs.replication=1",
                        "-D", "dfs.data.dir=/home/sacpawar/hadoop_storage/data",
                        "-D", "dfs.name.dir=/home/sacpawar/hadoop_storage/name",
                        "-D", "hadoop.tmp.dir=/home/haiyee/hadoop_storage/tmp",
                        "-D", "mapreduce.jobtracker.staging.root.dir=/user/sacpawar/",
                        "--connect", "jdbc:sqlserver://<dbhostname>:1433;databaseName=SqoopTestDB",
                        "--username", "sa",
                        "--password", "sa",
                        "--table", "bizunit",
                        "--hive-home", "/opt/dev/hive-0.9.0",
                        "--hive-import"
                };

                ImportTool iTool = new ImportTool();
                Sqoop sqoop = new Sqoop(iTool);

                try {
                        System.out.println("Started....");
                        // ToolRunner.run() parses the -D generic options and
                        // returns the tool's exit code (0 on success).
                        int ret = ToolRunner.run(sqoop, sqoopArgs);
                        System.out.println("=======DONE========== (exit code " + ret + ")");
                } catch (Exception e) {
                        e.printStackTrace();
                }
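
For reference, here is a minimal sketch of the same import that applies the
Hadoop settings through an explicit Configuration instead of -D generic
options. This assumes the Sqoop(SqoopTool, Configuration) constructor and the
static Sqoop.runSqoop() helper from Sqoop 1.4.x; hostnames are placeholders,
as above:

        import org.apache.hadoop.conf.Configuration;
        import org.apache.sqoop.Sqoop;
        import org.apache.sqoop.tool.ImportTool;

        public class SqoopImportWithConf {
                public static void main(String[] args) {
                        // Set the client-side Hadoop/Hive properties directly on a
                        // Configuration so they do not depend on -D parsing.
                        Configuration conf = new Configuration();
                        conf.set("fs.default.name", "hdfs://<hostname>:8020/");
                        conf.set("mapred.job.tracker", "<hostname>:8021");
                        conf.set("hive.metastore.warehouse.dir",
                                        "hdfs://<hostname>:8020/user/hive/warehouse");

                        String[] toolArgs = {
                                "--connect", "jdbc:sqlserver://<dbhostname>:1433;databaseName=SqoopTestDB",
                                "--username", "sa",
                                "--password", "sa",
                                "--table", "bizunit",
                                "--hive-home", "/opt/dev/hive-0.9.0",
                                "--hive-import"
                        };

                        Sqoop sqoop = new Sqoop(new ImportTool(), conf);
                        int ret = Sqoop.runSqoop(sqoop, toolArgs); // tool exit code, 0 on success
                        System.out.println("sqoop exit code: " + ret);
                }
        }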


        standalone Hadoop
        Hadoop, Sqoop, and Hive installed on the same box
        HADOOP_HOME, SQOOP_HOME, HIVE_HOME set to the respective install dirs

        no hive-site.xml is used; Hive uses the default Derby metastore

        I can run the following from the command line on the Hadoop/Sqoop/Hive
        machine, and it runs fine and creates the table:

        sqoop create-hive-table \
            --connect "jdbc:sqlserver://<dbhostname>:1433;databaseName=SqoopTestDB" \
            --username sa --password sa --table bizunit --hive-table bizunit_test

        
        I am trying this code from a different machine on the network, and I
        have provided all the required jars.

        The data is loaded into HDFS through the Sqoop import process fine.

        I end up getting the following error during the Hive import step:

FAILED: Error in semantic analysis: Line 2:17 Path is not legal 
''hdfs://172.25.6.96:8020/user/sacpawar/bizunit'':
 Move from: hdfs://172.25.6.96:8020/user/sacpawar/bizunit to: 
file:/user/hive/warehouse/bizunit is not valid.
 Please check that values for params "default.fs.name" and 
"hive.metastore.warehouse.dir" do not conflict.
        

Please help!

Sachin
                
> Unable to import into external hive table located on S3
> -------------------------------------------------------
>
>                 Key: SQOOP-475
>                 URL: https://issues.apache.org/jira/browse/SQOOP-475
>             Project: Sqoop
>          Issue Type: Bug
>          Components: hive-integration
>    Affects Versions: 1.4.1-incubating
>         Environment: Amazon EMR
> Hadoop 0.20.205
> Hive 0.7.1
> Sqoop 1.4.1-incubating
>            Reporter: Porati Sébastien
>
> When I try to import into a Hive table located on an S3 bucket, I get the
> following error message:
> FAILED: Error in semantic analysis: Line 2:17 Path is not legal 
> 'hdfs://10.48.189.XX:9000/user/hadoop/client': Move from: 
> hdfs://10.48.189.XX:9000/user/hadoop/client to: 
> s3://some-bucket/sqoop-test/hive/client is not valid. Please check that 
> values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not 
> conflict.
> Hive table creation script:
> CREATE DATABASE IF NOT EXISTS sqoop_test;
> USE sqoop_test;
> CREATE EXTERNAL TABLE IF NOT EXISTS client (
>     id INT,
>     email STRING,
>     cookie_uid STRING,
>     is_blacklisted TINYINT
> )
> LOCATION 's3://some-bucket/sqoop-test/hive/client';
> Sqoop command:
> sqoop import --connect jdbc:mysql://my.domain.com/mydb --username myuser 
> --password XXXX --table client --hive-import --hive-overwrite --hive-table 
> sqoop_test.client


