Thanks Jarcec. The problem is that the job never even reaches the MR stage. It fails right after connecting to the Hive metastore, while still inside tool.ExportTool, so there are no task logs to check.
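Since the failure happens on the client before any MapReduce task is launched, one way to try to surface the full NullPointerException stack trace would be to force DEBUG logging on the Sqoop client JVM itself. The sketch below is only an illustration; the HADOOP_ROOT_LOGGER approach and the placeholder connection values are assumptions, not the exact values used here:

# Sketch only: force client-side DEBUG logging and keep the full console output.
# <host>, <instance>, <db>, <user>, <pwd> are placeholders, not real values.
export HADOOP_ROOT_LOGGER=DEBUG,console
sqoop export \
  --connect "jdbc:sqlserver://<host>\<instance>:1433;databaseName=<db>;user=<user>;password=<pwd>;" \
  --table tblOthrBus \
  --hcatalog-database brokdb \
  --hcatalog-table tblOthrBus \
  --verbose 2>&1 | tee sqoop-export-debug.log

For reference, the client-side output from the failing run (already with --verbose) is below.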
. . .
15/12/11 16:33:28 DEBUG orm.CompilationManager: Could not rename /tmp/sqoop-root/compile/bb46fc39dfc3f2d7a1fbc6df6a2b9117/tblOthrBus.java to /root/./tblOthrBus.java
15/12/11 16:33:28 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/bb46fc39dfc3f2d7a1fbc6df6a2b9117/tblOthrBus.jar
15/12/11 16:33:28 DEBUG orm.CompilationManager: Scanning for .class files in directory: /tmp/sqoop-root/compile/bb46fc39dfc3f2d7a1fbc6df6a2b9117
15/12/11 16:33:28 DEBUG orm.CompilationManager: Got classfile: /tmp/sqoop-root/compile/bb46fc39dfc3f2d7a1fbc6df6a2b9117/tblOthrBus.class -> tblOthrBus.class
15/12/11 16:33:28 DEBUG orm.CompilationManager: Finished writing jar file /tmp/sqoop-root/compile/bb46fc39dfc3f2d7a1fbc6df6a2b9117/tblOthrBus.jar
15/12/11 16:33:28 INFO mapreduce.ExportJobBase: Beginning export of tblOthrBus
15/12/11 16:33:28 DEBUG util.ClassLoaderStack: Checking for existing class: tblOthrBus
15/12/11 16:33:28 DEBUG util.ClassLoaderStack: Attempting to load jar through URL: jar:file:/tmp/sqoop-root/compile/bb46fc39dfc3f2d7a1fbc6df6a2b9117/tblOthrBus.jar!/
15/12/11 16:33:28 DEBUG util.ClassLoaderStack: Previous classloader is sun.misc.Launcher$AppClassLoader@66d2e7d9
15/12/11 16:33:28 DEBUG util.ClassLoaderStack: Testing class in jar: tblOthrBus
15/12/11 16:33:28 DEBUG util.ClassLoaderStack: Loaded jar into current JVM: jar:file:/tmp/sqoop-root/compile/bb46fc39dfc3f2d7a1fbc6df6a2b9117/tblOthrBus.jar!/
15/12/11 16:33:28 DEBUG util.ClassLoaderStack: Added classloader for jar /tmp/sqoop-root/compile/bb46fc39dfc3f2d7a1fbc6df6a2b9117/tblOthrBus.jar: java.net.FactoryURLClassLoader@e041f0c
15/12/11 16:33:28 INFO mapreduce.ExportJobBase: Configuring HCatalog for export job
15/12/11 16:33:28 INFO hcat.SqoopHCatUtilities: Configuring HCatalog specific details for job
15/12/11 16:33:28 WARN hcat.SqoopHCatUtilities: Provided HCatalog table name tblOthrBus will be mapped to tblOthrBus
15/12/11 16:33:28 DEBUG manager.SqlManager: Execute getColumnInfoRawQuery : SELECT t.* FROM [tblOthrBus] AS t WHERE 1=0
15/12/11 16:33:28 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
15/12/11 16:33:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [tblOthrBus] AS t WHERE 1=0
15/12/11 16:33:28 DEBUG manager.SqlManager: Found column RowID of type [-5, 19, 0]
15/12/11 16:33:28 DEBUG manager.SqlManager: Found column indvlPK of type [12, 100, 0]
15/12/11 16:33:28 DEBUG manager.SqlManager: Found column desc of type [12, 2147483647, 0]
15/12/11 16:33:28 INFO hcat.SqoopHCatUtilities: Database column names projected : [rowid, indvlpk, desc]
15/12/11 16:33:28 INFO hcat.SqoopHCatUtilities: Database column name - info map :
    indvlpk : [Type : 12,Precision : 100,Scale : 0]
    rowid : [Type : -5,Precision : 19,Scale : 0]
    desc : [Type : 12,Precision : 2147483647,Scale : 0]
15/12/11 16:33:29 INFO hive.metastore: Trying to connect to metastore with URI thrift://devmaster01.mydomain.com:9083
15/12/11 16:33:29 INFO hive.metastore: Connected to metastore.
15/12/11 16:33:29 DEBUG util.ClassLoaderStack: Restoring classloader: sun.misc.Launcher$AppClassLoader@66d2e7d9
15/12/11 16:33:29 ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: java.lang.NullPointerException

Regards,
Manish

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:jar...@gmail.com] On Behalf Of Jarek Jarcec Cecho
Sent: Thursday, December 10, 2015 3:06 PM
To: user@sqoop.apache.org
Subject: Re: Error while Sqoop Export Hive (AvroSerDe) to MSSQL using Hcat

I'm not an HCatalog expert, but I would take a look at the MapReduce task logs, especially the ones that are failing.

Jarcec

> On Dec 10, 2015, at 10:17 AM, Manish Gupta 8 <mgupt...@sapient.com> wrote:
>
> Can someone please provide any pointers I should look into? Basically, I want to sqoop export Avro data via Hive tables.
>
> Thanks
>
> From: Manish Gupta 8 [mailto:mgupt...@sapient.com]
> Sent: Wednesday, December 09, 2015 5:32 PM
> To: user@sqoop.apache.org
> Subject: Error while Sqoop Export Hive (AvroSerDe) to MSSQL using Hcat
>
> Hi,
>
> I have an external table in Hive created using the Avro SerDe. On top of it, I have created a couple of normalized views. When I try to export any of these normalized views/tables from Hive to SQL Server using Sqoop, I get the error:
> "ERROR tool.ExportTool: Encountered IOException running export job: java.io.IOException: java.lang.NullPointerException"
>
> My sqoop export looks like the following:
> sqoop export --connect "jdbc:sqlserver://*****\****:1433;databaseName=****;user=*****;password=*****;" --table tblOthrBus --hcatalog-database brokdb --hcatalog-table tblOthrBus --verbose
>
> Please note:
> · The table has data; I tested it with Hive as well as the Spark JDBC server.
> · The database connection works fine; it was tested with eval, and export works fine with the --export-dir option.
> · The HCat details are correct; hcat.SqoopHCatUtilities is able to read the table metadata.
> · The Hive metastore also gets connected.
>
> Can someone please guide me on what I should look into, any logs, etc.?
>
> Thanks
>
> Regards,
> Manish Gupta
> Specialist, Platform | Sapient Global Markets
> Noida, India
>
> Tel: +91 (120) 479 5000 x73766
> Mobile: +91 981 059 1361
> Email: mgupt...@sapient.com
> sapient.com
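For comparison, the --export-dir form that the quoted message reports as working would look roughly like this; the HDFS warehouse path and the placeholder connection values are illustrative assumptions, not the actual ones from this thread:

# Sketch only: exports straight from an HDFS directory, bypassing HCatalog.
# The warehouse path and <...> placeholders are illustrative, not real values.
sqoop export \
  --connect "jdbc:sqlserver://<host>\<instance>:1433;databaseName=<db>;user=<user>;password=<pwd>;" \
  --table tblOthrBus \
  --export-dir /user/hive/warehouse/brokdb.db/tblothrbus \
  --verbose

The only difference from the failing invocation is that the data location is supplied directly instead of being resolved through the HCatalog/Hive metastore.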