It's quite straightforward. You can refer to
http://sqoop.apache.org/docs/1.4.3/SqoopUserGuide.html#_free_form_query_imports


An example that I had tried (and which worked):

sqoop import \
  --connect jdbc:mysql://<db_machine_ip>/<schema_name> \
  --username <user_name> \
  --password <password> \
  --query "<sql query here>" \
  -m 1 \
  --target-dir <destination_hdfs_directory>
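
Applied to a query like yours against Oracle, something along these lines should work (the host, port and service name below are placeholders, and note that, per the user guide, a free-form query must contain the literal token $CONDITIONS in its WHERE clause - using single quotes around the query keeps the shell from expanding it):

sqoop import \
  --connect jdbc:oracle:thin:@//<db_host>:<port>/<service_name> \
  --username <user_name> \
  --password <password> \
  --query 'select a.col1, a.col2, b.col3, c.col4, d.col5 from T1 a, T2 b, T3 c, T4 d where a.col1 = b.col1 and b.col2 = c.col2 and c.col3 = d.col3 and $CONDITIONS' \
  -m 1 \
  --target-dir <destination_hdfs_directory>

You will likely also need the Oracle JDBC driver jar on Sqoop's classpath (its lib directory) for the connection to work.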

The only confusion I have now is that the destination is always an HDFS path -
how do I import this data into an EXISTING Hive table?
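
From the same user guide, the --hive-import options look like the way to do it. Untested on my side, but something like this (the table and directory names are placeholders) should load the query results into a Hive table:

sqoop import \
  --connect jdbc:mysql://<db_machine_ip>/<schema_name> \
  --username <user_name> \
  --password <password> \
  --query "<sql query here>" \
  -m 1 \
  --target-dir <staging_hdfs_directory> \
  --hive-import \
  --hive-table <existing_hive_table>

As I understand it, if the Hive table already exists the imported rows are appended to it, and adding --hive-overwrite replaces the existing contents instead. Perhaps someone here can confirm.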

Regards,
Omkar Joshi


From: Raj Hadoop [mailto:[email protected]]
Sent: Thursday, July 25, 2013 6:13 AM
To: [email protected]
Subject: Complex Oracle query to HDFS or Hive through Sqoop

Hello everyone,


I want to load data from an Oracle query to HDFS or Hive through Sqoop.


My query in Oracle is based on something like this -

   select a.col1, a.col2, b.col3, c.col4, d.col5 from T1 a, T2 b, T3 c, T4 d
   where a.col1 = b.col1 and b.col2 = c.col2 and c.col3 = d.col3;

Is this possible in Sqoop? Please advise.


Thanks,
Raj

