[
https://issues.apache.org/jira/browse/SQOOP-475?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13491473#comment-13491473
]
shirley gracelyn commented on SQOOP-475:
----------------------------------------
I am facing the same error. This is what I see:
2012-11-05 20:12:57,711 ERROR org.apache.hadoop.hive.ql.Driver: FAILED: Error
in semantic analysis: Line 2:17 Path is not legal
'hdfs://host:8020/user/root/table_name': Move from:
hdfs://host:8020/user/root/table_name to:
hdfs://host/user/hive/warehouse/some.db/othr_tbl is not valid. Please check
that values for params "default.fs.name" and "hive.metastore.warehouse.dir" do
not conflict.
I see this error when executing a Sqoop import action from Oozie:
2012-11-05 20:12:57,667 INFO hive.ql.parse.ParseDriver: Parsing command:
LOAD DATA INPATH 'hdfs://host:8020/user/root/table_name' OVERWRITE INTO TABLE
`some.other_tbl`
The LOAD DATA INPATH succeeds in Hive when the hdfs path does not contain the
port number 8020; I confirmed this by running the statement in Hive directly.
The problem is that sqoop import appends the port 8020 to the source URI in the
LOAD statement it generates, and that causes the failure. Is there a way to stop
Sqoop from appending the port number to the hdfs path?
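For reference, the variant that succeeds when I run it manually in Hive looks
roughly like this (same statement as above, only without the port in the source
URI):
LOAD DATA INPATH 'hdfs://host/user/root/table_name' OVERWRITE INTO TABLE
`some.other_tbl`;
It only fails when Sqoop generates the form with host:8020 in the path.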
> Unable to import into external hive table located on S3
> -------------------------------------------------------
>
> Key: SQOOP-475
> URL: https://issues.apache.org/jira/browse/SQOOP-475
> Project: Sqoop
> Issue Type: Bug
> Components: hive-integration
> Affects Versions: 1.4.1-incubating
> Environment: Amazon EMR
> Hadoop 0.20.205
> Hive 0.7.1
> Sqoop 1.4.1-incubating
> Reporter: Porati Sébastien
>
> When I try to import into a Hive table located on an S3 bucket, I get the
> following error message:
> FAILED: Error in semantic analysis: Line 2:17 Path is not legal
> 'hdfs://10.48.189.XX:9000/user/hadoop/client': Move from:
> hdfs://10.48.189.XX:9000/user/hadoop/client to:
> s3://some-bucket/sqoop-test/hive/client is not valid. Please check that
> values for params "default.fs.name" and "hive.metastore.warehouse.dir" do not
> conflict.
> Hive table creation script:
> CREATE DATABASE IF NOT EXISTS sqoop_test;
> USE sqoop_test;
> CREATE EXTERNAL TABLE IF NOT EXISTS client (
> id INT,
> email STRING,
> cookie_uid STRING,
> is_blacklisted TINYINT
> )
> LOCATION 's3://some-bucket/sqoop-test/hive/client';
> Sqoop command:
> sqoop import --connect jdbc:mysql://my.domain.com/mydb --username myuser \
>   --password XXXX --table client --hive-import --hive-overwrite \
>   --hive-table sqoop_test.client
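> The Hive statement that Sqoop generates at the end of the import is presumably
> of this form (reconstructed from the error message above, not copied from the
> logs):
> LOAD DATA INPATH 'hdfs://10.48.189.XX:9000/user/hadoop/client' OVERWRITE INTO
> TABLE `sqoop_test.client`;
> Hive refuses to move data from that hdfs:// path into the table's s3://
> location, which is what produces the semantic analysis error above.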
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira