Hi, I'm trying to import a dummy table (cities) into HDFS with the
following command:

$ sqoop import --connect jdbc:oracle:thin:@10.0.2.15:1521/XE \
    --username system --password root --table cities \
    --columns country,city --split-by id --target-dir /sqoop/output1

but no rows get imported.
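
For reference, here is a quick JDBC sanity check one could run with the same
connection settings as the command above (my own snippet, not part of Sqoop;
it needs the Oracle JDBC driver on the classpath) to confirm the table
actually has rows, and that those rows have a non-NULL id to split on:

import java.sql.*;

public class CheckCities {
    public static void main(String[] args) throws SQLException {
        // Same JDBC URL and credentials as the sqoop command above.
        String url = "jdbc:oracle:thin:@10.0.2.15:1521/XE";
        try (Connection conn = DriverManager.getConnection(url, "system", "root");
             Statement st = conn.createStatement();
             // COUNT(id) only counts rows where id is not NULL.
             ResultSet rs = st.executeQuery(
                     "SELECT COUNT(*), COUNT(id) FROM cities")) {
            rs.next();
            System.out.println("rows: " + rs.getLong(1)
                    + ", rows with non-NULL id: " + rs.getLong(2));
        }
    }
}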

The SQL statement that gets generated by Sqoop is

SELECT t.* FROM cities t WHERE 1=0

which looks odd at first, since 1 = 0 is never true, so that query can never
return any rows.
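
From what I've read, though, this may just be how Sqoop probes the table's
column metadata: a WHERE 1=0 query is expected to return zero rows, but the
result set still describes the columns. A minimal JDBC sketch of my own (not
Sqoop's actual code) that illustrates the idea:

import java.sql.*;

public class MetadataProbe {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:oracle:thin:@10.0.2.15:1521/XE";
        try (Connection conn = DriverManager.getConnection(url, "system", "root");
             Statement st = conn.createStatement();
             // Returns no rows, but the driver still reports the column layout.
             ResultSet rs = st.executeQuery("SELECT t.* FROM cities t WHERE 1=0")) {
            ResultSetMetaData md = rs.getMetaData();
            for (int i = 1; i <= md.getColumnCount(); i++) {
                System.out.println(md.getColumnName(i) + " : "
                        + md.getColumnTypeName(i));
            }
            System.out.println("any rows? " + rs.next()); // always false, by design
        }
    }
}

If that reading is right, this query is probably not the reason the import
comes back empty.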

I'm running:
Hadoop 2.4.1
Sqoop 1.4.5


I get the following output. Note the line where it creates an input split
with lower bound 'id IS NULL':

14/09/30 15:36:34 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
14/09/30 15:36:41 DEBUG db.DBConfiguration: Fetching password from job credentials store
14/09/30 15:36:50 INFO db.DBInputFormat: Using read commited transaction isolation
14/09/30 15:36:50 INFO db.DataDrivenDBInputFormat: BoundingValsQuery: SELECT MIN(id), MAX(id) FROM SYSTEM.CITIES
14/09/30 15:36:53 DEBUG db.DataDrivenDBInputFormat: Creating input split with lower bound 'id IS NULL' and upper bound 'id IS NULL'
14/09/30 15:36:57 INFO mapreduce.JobSubmitter: number of splits:1
14/09/30 15:37:12 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1412093880347_0008
14/09/30 15:37:18 INFO impl.YarnClientImpl: Submitted application application_1412093880347_0008
14/09/30 15:37:19 INFO mapreduce.Job: The url to track the job: http://debian.dev:8088/proxy/application_1412093880347_0008/
14/09/30 15:37:19 INFO mapreduce.Job: Running job: job_1412093880347_0008
14/09/30 15:38:11 INFO mapreduce.Job: Job job_1412093880347_0008 running in uber mode : false
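
Those 'id IS NULL' bounds make me think the BoundingValsQuery returned NULL
for both MIN(id) and MAX(id). One way to check (again my own snippet, same
connection settings as above) is to run Sqoop's bounding query by hand:

import java.sql.*;

public class BoundingCheck {
    public static void main(String[] args) throws SQLException {
        String url = "jdbc:oracle:thin:@10.0.2.15:1521/XE";
        try (Connection conn = DriverManager.getConnection(url, "system", "root");
             Statement st = conn.createStatement();
             // The exact query from the BoundingValsQuery line in the log.
             ResultSet rs = st.executeQuery(
                     "SELECT MIN(id), MAX(id) FROM SYSTEM.CITIES")) {
            rs.next();
            // getObject returns null when the column value is SQL NULL.
            System.out.println("min id: " + rs.getObject(1)
                    + ", max id: " + rs.getObject(2));
        }
    }
}

If both come back NULL, then SYSTEM.CITIES is either empty or id is NULL in
every row, which would fit the empty import. Is that the right way to read
this log?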
