Akash,

You can write a simple Java program that queries your Oracle DB over JDBC
and writes the data out to a file. If you want the output to land in HDFS
directly, write it through the Hadoop FileSystem API (org.apache.hadoop.fs)
rather than plain java.io, which would write to the local disk.
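
Something along these lines should do it (just a rough sketch -- the class
name, connect string, credentials, and query are placeholders to replace
with your own):

    import java.io.BufferedWriter;
    import java.io.OutputStreamWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class OracleToHdfs {
        public static void main(String[] args) throws Exception {
            // Placeholders -- substitute your own connect string,
            // credentials, and query
            String jdbcUrl = "jdbc:oracle:thin:@//dbhost:1521/myservice";
            String query = "SELECT * FROM my_table";

            Class.forName("oracle.jdbc.OracleDriver"); // harmless with JDBC 4

            // args[0] is the HDFS output path; the cluster config supplies
            // the default filesystem, so this creates a file in HDFS
            FileSystem fs = FileSystem.get(new Configuration());
            BufferedWriter out = new BufferedWriter(
                    new OutputStreamWriter(fs.create(new Path(args[0]))));

            // Stream the result set into HDFS, one CSV line per row
            Connection db = DriverManager.getConnection(jdbcUrl, "user", "password");
            Statement stmt = db.createStatement();
            ResultSet rs = stmt.executeQuery(query);
            int cols = rs.getMetaData().getColumnCount();
            while (rs.next()) {
                StringBuilder line = new StringBuilder();
                for (int i = 1; i <= cols; i++) {
                    if (i > 1) line.append(',');
                    line.append(rs.getString(i));
                }
                out.write(line.toString());
                out.newLine();
            }
            out.close();
            db.close();
        }
    }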

Compile the program and package it into a jar file. Make sure the Oracle
JDBC driver (e.g. ojdbc6.jar) is available at runtime, either bundled into
the jar or added via HADOOP_CLASSPATH.

Then run the program in your Hadoop cluster using:

    <path-to-hadoop>/bin/hadoop jar <jar-file-name> <main-class> <output-file-name>

(the main class argument can be omitted if it is set in the jar's manifest).
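
For example, assuming the jar is named oracle-to-hdfs.jar and the main
class is OracleToHdfs as in the sketch above:

    <path-to-hadoop>/bin/hadoop jar oracle-to-hdfs.jar OracleToHdfs /user/akash/oracle_dump.csv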

The result will be in HDFS.
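
You can sanity-check the output afterwards with something like:

    <path-to-hadoop>/bin/hadoop fs -cat /user/akash/oracle_dump.csv | head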

-John

On Mon, Jun 25, 2012 at 8:02 AM, Akash Sharma <sharma...@hotmail.com> wrote:

>
>
> Hi,
>
> Seeking some advice/options on sqooping data from an Oracle DB to HDFS. We
> are on Cloudera 3.
>
> Sqoop/JDBC connections to Oracle RAC fail within the integration cluster.
> In the Oracle RAC setup there is an additional layer between the edge
> node/server and the database, which routes each connection to the
> appropriate database listener. The specific error is ORA-12516,
> TNS:listener could not find available handler with matching protocol
> stack. This is primarily a network configuration issue, and it is
> preventing the team from progressing further.
>
> I would like to know what options other than Sqoop there are for bringing
> the data into HDFS. Our primary goal is to land data from the Oracle DB
> into HDFS. Since Sqoop does not work, we are thinking of using an Oracle
> unload, then FTP, and then a put command to import the data into HDFS.
> Please advise if there is any option other than this. Would it help in any
> way to code these steps in Java using the HDFS/FTP APIs rather than using
> the command-line tools?
>
> Any help is appreciated.
>
> -Akash
>
