Greetings!
 
Hopefully this isn't too much of a newbie question, but I can't get the 
--hive-overwrite argument to work. I'm using Sqoop 1.3.0-cdh3u2 on the Cloudera 
VMware Player VM.
 
 
The following sqoop invocation succeeds in creating the Hive table and 
populates it with data:
 
sqoop import --connect 'jdbc:mysql://localhost/MyDB?zeroDateTimeBehavior=round' 
--username cloudera --query 'SELECT *, 47 AS JobID FROM SalesPerson WHERE 
$CONDITIONS' --split-by ID  --target-dir /tmp/SalesPerson --create-hive-table 
--hive-import --hive-table MyDB_SalesPerson
 
 
However, while the following sqoop invocation does produce the desired data in 
HDFS (i.e., /tmp/SalesPerson), it does not overwrite the data in the Hive table:
 
sqoop import --connect 'jdbc:mysql://localhost/MyDB?zeroDateTimeBehavior=round' 
--username cloudera --query 'SELECT *, 87 AS JobID FROM SalesPerson WHERE 
$CONDITIONS' --split-by ID  --target-dir /tmp/SalesPerson --hive-overwrite 
--hive-table MyDB_salesperson
 
 
There is nothing in hive.log to indicate that the --hive-overwrite sqoop 
invocation interacted with Hive at all (e.g., no exceptions were logged).
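Am I perhaps just missing --hive-import in the second invocation? My guess (and it 
is only a guess, I haven't confirmed this) is that --hive-overwrite only takes 
effect when --hive-import is also given, so the second run would be writing to HDFS 
but skipping the Hive load step entirely. Something like:

```shell
# Guess: --hive-overwrite may be honored only alongside --hive-import,
# so the Hive load step would be skipped without it. Table name casing
# matched to the originally created table, though I believe Hive table
# names are case-insensitive anyway.
sqoop import \
  --connect 'jdbc:mysql://localhost/MyDB?zeroDateTimeBehavior=round' \
  --username cloudera \
  --query 'SELECT *, 87 AS JobID FROM SalesPerson WHERE $CONDITIONS' \
  --split-by ID \
  --target-dir /tmp/SalesPerson \
  --hive-import \
  --hive-overwrite \
  --hive-table MyDB_SalesPerson
```

If that's not it, is there something else that has to accompany --hive-overwrite?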
 
Any assistance would be greatly appreciated.
 
Thanx,

Dave                                      
