Not sure. Could you check your task logs for any exceptions? It's possible that your Sqoop job is failing before the --last-value is updated.
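As a quick sanity check (assuming the default hsqldb metastore and the job name from your mail), you can dump the saved job definition, including the stored incremental.last.value, with `sqoop job --show`. A minimal sketch:

```shell
# Dump the saved job; the output lists the job's properties,
# including incremental.last.value, as currently stored in the metastore.
sudo -u hdfs sqoop job --show JobName

# Re-run the saved job; on a fully successful run Sqoop writes the
# new last value back to the metastore, so --show afterwards should
# report an updated incremental.last.value.
sudo -u hdfs sqoop job --exec JobName
```

Comparing `--show` output before and after an `--exec` run would confirm whether the metastore update itself is the step that fails.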
-Abe

On Fri, Nov 21, 2014 at 6:33 PM, Wen, Dongrong <[email protected]> wrote:
> Hi,
>
> On CentOS 6.5, I am using Sqoop version 1.4.4-cdh5.1.0 and HBase version
> 0.98.1-cdh5.1.0.
>
> Here is my Sqoop job for incremental append from InterSystems Cache to
> HBase.
> -------------------------
> sudo -u hdfs sqoop job --create JobName \
> -- \
> import \
> --verbose \
> -m 1 \
> --connect jdbc:Cache://xx.xx.xx.xx:1972/USER \
> --driver com.intersys.jdbc.CacheDriver \
> --username username \
> --password password \
> --column-family CFNAME \
> --hbase-table TABLENAME \
> --hbase-row-key NUM \
> --query "select * from TABLENAME where \$CONDITIONS" \
> --incremental append \
> --check-column NUM \
> --last-value 9
> -------------------------
>
> I can see the new data rows appended successfully to the table in HBase,
> but Sqoop's "incremental.last.value" does not get updated to the new
> last value in hsqldb. I tried it several times and it stays at 9, the value
> given when the Sqoop job was created. Below are the last few lines of the
> Sqoop output:
>
> >>>>>>>>>>>>>>>>>
> 14/11/20 15:28:55 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
> 24.6684 seconds (0 bytes/sec)
> 14/11/20 15:28:55 INFO mapreduce.ImportJobBase: Retrieved 7 records.
> 14/11/20 15:28:55 DEBUG util.ClassLoaderStack: Restoring classloader:
> sun.misc.Launcher$AppClassLoader@5ac524dd
> 14/11/20 15:28:55 ERROR tool.ImportTool: Imported Failed: Can not create a
> Path from a null string
> 14/11/20 15:28:55 DEBUG hsqldb.HsqldbJobStorage: Flushing current
> transaction
> 14/11/20 15:28:55 DEBUG hsqldb.HsqldbJobStorage: Closing connection
> >>>>>>>>>>>>>>>>>>
>
> However, if I change the target to HDFS, it works perfectly: Sqoop's
> "incremental.last.value" does get updated in hsqldb without any problem.
> Here is my Sqoop incremental job to HDFS.
> -------------------------
> sudo -u hdfs sqoop job --create jobname \
> -- \
> import \
> --verbose \
> -m 1 \
> --connect jdbc:Cache://xx.xx.xx.xx:1972/USER \
> --driver com.intersys.jdbc.CacheDriver \
> --username username \
> --password password \
> --query "select * from tablename where \$CONDITIONS" \
> --target-dir /user/hdfs/tablename \
> --incremental append \
> --check-column NUM \
> --last-value 9
> -------------------------
>
> Is this a bug? Or did I miss something else? Thanks.
>
> -Dongrong
