[ https://issues.apache.org/jira/browse/SQOOP-475?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13417169#comment-13417169 ]

sachin pawar commented on SQOOP-475:
------------------------------------


Can you please explain what you mean by "in sync"?
I tried setting both to the same value (hdfs://172.25.6.96:8020/) in code, but
ran into the same error.

How do I make Hive use the HDFS path rather than the local file system?
I thought I was doing this with my hive-site.xml; see below.

It's a very basic setup:
Hadoop and Hive were installed by the same user and run under that user.
The Sqoop job run from the remote machine uses the same user as well.
/user/hive has write permissions for everyone, just to be on the safe side.
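In case it matters, something like the following is what I mean by write
permissions for everyone (a sketch, not the exact commands I ran):

hadoop fs -chmod -R 777 /user/hive   # open up the Hive user dir recursively
hadoop fs -ls /user/hive             # verify owner/permissions afterwards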

Following is my hive-site.xml:


<property>
        <name>hive.metastore.local</name>
        <value>true</value>
</property>
<property>
        <name>hive.metastore.warehouse.dir</name>
        <value>hdfs://172.25.6.96:8020/user/hive/warehouse</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://localhost/metastore_db?createDatabaseIfNotExist=true</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>uname</value>
</property>
<property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>pwd</value>
</property>
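By "in sync" I assume the requirement is that fs.default.name in core-site.xml
uses the same scheme and authority as hive.metastore.warehouse.dir above. A
minimal sketch of what I understand the core-site.xml entry should look like
(fs.default.name is the property name in Hadoop 0.20; newer releases call it
fs.defaultFS; the host/port are copied from my warehouse dir above):

<!-- core-site.xml (sketch): should match the scheme/authority of
     hive.metastore.warehouse.dir in hive-site.xml -->
<property>
        <name>fs.default.name</name>
        <value>hdfs://172.25.6.96:8020</value>
</property>

The values Hive actually resolves can be checked from the Hive CLI with
"set fs.default.name;" and "set hive.metastore.warehouse.dir;".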

Thanks.

> Unable to import into external hive table located on S3
> -------------------------------------------------------
>
>                 Key: SQOOP-475
>                 URL: https://issues.apache.org/jira/browse/SQOOP-475
>             Project: Sqoop
>          Issue Type: Bug
>          Components: hive-integration
>    Affects Versions: 1.4.1-incubating
>         Environment: Amazon EMR
> Hadoop 0.20.205
> Hive 0.7.1
> Sqoop 1.4.1-incubating
>            Reporter: Porati Sébastien
>
> When I try to import into a Hive table located in an S3 bucket, I get the 
> following error message:
> FAILED: Error in semantic analysis: Line 2:17 Path is not legal 
> 'hdfs://10.48.189.XX:9000/user/hadoop/client': Move from: 
> hdfs://10.48.189.XX:9000/user/hadoop/client to: 
> s3://some-bucket/sqoop-test/hive/client is not valid. Please check that 
> values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not 
> conflict.
> Hive table creation script:
> CREATE DATABASE IF NOT EXISTS sqoop_test;
> USE sqoop_test;
> CREATE EXTERNAL TABLE IF NOT EXISTS client (
>     id INT,
>     email STRING,
>     cookie_uid STRING,
>     is_blacklisted TINYINT
> )
> LOCATION 's3://some-bucket/sqoop-test/hive/client';
> Sqoop command:
> sqoop import --connect jdbc:mysql://my.domain.com/mydb --username myuser 
> --password XXXX --table client --hive-import --hive-overwrite --hive-table 
> sqoop_test.client
