Yes, I use this option because I've had the same issue. So I load the files
into HDFS first, and then load the data from those files into the Hive table
using a LOAD statement.
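For the second step, the statement looks roughly like this (a sketch only; the path matches the --target-dir from the workflow below, but the table name is a placeholder, not from the actual job):

```sql
-- Move the file written by the Sqoop import (--target-dir) into an
-- existing Hive table. Note that LOAD DATA INPATH *moves* the file out
-- of the source directory rather than copying it.
LOAD DATA INPATH '${hive_in}/terdata_file_for_loading_hadoop_file'
INTO TABLE my_table;
```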

--
Laurent HATIER - Consultant Big Data & Business Intelligence chez CapGemini
fr.linkedin.com/pub/laurent-hatier/25/36b/a86/
<http://fr.linkedin.com/pub/laurent-h/25/36b/a86/>

2014-10-17 17:37 GMT+02:00 prabha k <[email protected]>:

> Hi Laurent, thanks for your kind response. I am able to load the data into
> the HDFS path, but when I try to use Hive import in the Sqoop command, the
> Hive table either is not created in the specified location or is not created
> at all in the warehouse location. Looking for your kind help on this.
>
> In some of the forums I see that the Sqoop lib jar files need to be copied
> into the Oozie lib path as well. Would this be a solution for this issue?
>
> Thanks
> Prabhakaran.k
>
> On Fri, Oct 17, 2014 at 8:52 PM, Laurent H <[email protected]>
> wrote:
>
> > Here is an example (Oozie 4.0.0); note that the action runs a Sqoop import:
> >
> >   <action name="sqoop_node">
> >         <sqoop xmlns="uri:oozie:sqoop-action:0.2">
> >             <job-tracker>${jobTracker}</job-tracker>
> >             <name-node>${nameNode}</name-node>
> >             <configuration>
> >                 <property>
> >                     <name>mapreduce.job.queuename</name>
> >                     <value>${queueName}</value>
> >                 </property>
> >             </configuration>
> >             <arg>import</arg>
> >             <arg>--connect</arg>
> >             <arg>${TERADATA_JDBC_PATH}/DATABASE=${TERADATA_DATABASE}</arg>
> >             <arg>--connection-manager</arg>
> >             <arg>org.apache.sqoop.teradata.TeradataConnManager</arg>
> >             <arg>--username</arg>
> >             <arg>${TERADATA_USERNAME}</arg>
> >             <arg>--password</arg>
> >             <arg>${TERADATA_PASSWORD}</arg>
> >             <arg>--query</arg>
> >             <arg>SELECT field1, field2 FROM table_teradata</arg>
> >             <arg>--split-by</arg>
> >             <arg>ID</arg>
> >             <arg>--target-dir</arg>
> >             <arg>${hive_in}/terdata_file_for_loading_hadoop_file</arg>
> >             <arg>--as-textfile</arg>
> >             <file>/apps/sqoop/lib/hortonworks-teradata-connector-1.2.1.2.1.3.0-563.jar#hortonworks-teradata-connector-1.2.1.2.1.3.0-563.jar</file>
> >             <file>/apps/sqoop/lib/opencsv-2.3.jar#opencsv-2.3.jar</file>
> >             <file>/apps/sqoop/lib/terajdbc4.jar#terajdbc4.jar</file>
> >             <file>/apps/sqoop/lib/teradata-connector-1.2.1-hadoop200.jar#teradata-connector-1.2.1-hadoop200.jar</file>
> >             <file>/apps/sqoop/lib/tdgssconfig.jar#tdgssconfig.jar</file>
> >         </sqoop>
> >         <ok to="next_step_load_hive_table_from_hadoop_file"/>
> >         <error to="sqoop_fail"/>
> >     </action>
> >
> > --
> > Laurent HATIER - Consultant Big Data & Business Intelligence chez CapGemini
> > fr.linkedin.com/pub/laurent-hatier/25/36b/a86/
> > <http://fr.linkedin.com/pub/laurent-h/25/36b/a86/>
> >
> > 2014-10-16 18:36 GMT+02:00 prabha k <[email protected]>:
> >
> > > I am working on Sqoop to import data from Teradata and am facing the
> > > error I have copy-pasted below.
> > >
> > > $SQOOP_CONF_DIR has not been set in the environment. Cannot check for
> > > additional configuration
> > >
> > > Can you please help me to know why this error is occurring?
> > >
> > > What are the jar files that need to be placed in the Oozie and Sqoop lib
> > > folders? Will that resolve the issue?
> > >
> > > How does the Sqoop command work when executed from an Oozie workflow?
> > >
> > > Looking forward to your kind response.
> > > Thanks
> > > Pk
> > >
> >
>