Was your table created exactly as you presented it, or did you remove some columns after table creation?
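(If you're not sure, a quick look at PostgreSQL's pg_attribute catalog should show it. Just a sketch, substitute your table name as needed; dropped columns stay in the catalog with attisdropped = true and a placeholder name like '........pg.dropped.16........':)

    SELECT attname, attnum, attisdropped
      FROM pg_attribute
     WHERE attrelid = 'kdi_eligibility'::regclass
       AND attnum > 0
     ORDER BY attnum;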
I'm not a PostgreSQL expert, but it seems that you might be hitting SQOOP-445:

https://issues.apache.org/jira/browse/SQOOP-445

Jarcec

On Mon, Feb 27, 2012 at 11:06:41AM -0500, Tolliver, Johnny S. wrote:
> On Feb 25, 2012, at 1:11 AM, Jarek Jarcec Cecho wrote:
>
> > would you mind running sqoop again with parameter --verbose and sending
> > entire output?
>
> Sure. With --verbose on the create-hive-table command, I get the following:
>
> [hduser@cmsvm01 ~]$ sqoop create-hive-table --connect jdbc:postgresql://cmsgp/kdi_big --table kdi_eligibility --username xxxxxx -P --verbose --fields-terminated-by ','
> 12/02/27 10:06:07 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> Enter password:
> 12/02/27 10:06:18 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/02/27 10:06:18 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/02/27 10:06:18 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:postgresql:
> 12/02/27 10:06:18 INFO manager.SqlManager: Using default fetchSize of 1000
> 12/02/27 10:06:18 DEBUG sqoop.ConnFactory: Instantiated ConnManager com.cloudera.sqoop.manager.PostgresqlManager@39e87719
> 12/02/27 10:06:19 INFO hive.HiveImport: Loading uploaded data into Hive
> 12/02/27 10:06:19 DEBUG hive.HiveImport: Hive.inputTable: kdi_eligibility
> 12/02/27 10:06:19 DEBUG hive.HiveImport: Hive.outputTable: kdi_eligibility
> 12/02/27 10:06:19 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
> 12/02/27 10:06:19 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> 12/02/27 10:06:19 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "kdi_eligibility" AS t LIMIT 1
> 12/02/27 10:06:19 WARN hive.TableDefWriter: Column link_key had to be cast to a less precise type in Hive
> 12/02/27 10:06:19 WARN hive.TableDefWriter: Column birth_dt had to be cast to a less precise type in Hive
> 12/02/27 10:06:19 WARN hive.TableDefWriter: Column death_dt had to be cast to a less precise type in Hive
> 12/02/27 10:06:19 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
> java.lang.NullPointerException
>         at com.cloudera.sqoop.hive.TableDefWriter.getCreateTableStmt(TableDefWriter.java:151)
>         at com.cloudera.sqoop.hive.HiveImport.importTable(HiveImport.java:193)
>         at com.cloudera.sqoop.tool.CreateHiveTableTool.run(CreateHiveTableTool.java:60)
>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
>
> And with --verbose on the import command, this:
>
> [hduser@cmsvm01 ~]$ sqoop import --connect jdbc:postgresql://cmsgp/kdi_big --username xxxxxx -P --table kdi_eligibility --hive-import -m 1 --direct --verbose
> 12/02/27 10:04:41 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> Enter password:
> 12/02/27 10:04:57 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
> 12/02/27 10:04:57 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
> 12/02/27 10:04:57 DEBUG sqoop.ConnFactory: Loaded manager factory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/02/27 10:04:57 DEBUG sqoop.ConnFactory: Trying ManagerFactory: com.cloudera.sqoop.manager.DefaultManagerFactory
> 12/02/27 10:04:57 DEBUG manager.DefaultManagerFactory: Trying with scheme: jdbc:postgresql:
> 12/02/27 10:04:57 INFO manager.SqlManager: Using default fetchSize of 1000
> 12/02/27 10:04:57 DEBUG sqoop.ConnFactory: Instantiated ConnManager com.cloudera.sqoop.manager.DirectPostgresqlManager@1bbb60c3
> 12/02/27 10:04:57 INFO tool.CodeGenTool: Beginning code generation
> 12/02/27 10:04:57 DEBUG manager.SqlManager: No connection paramenters specified. Using regular API for making connection.
> 12/02/27 10:04:57 DEBUG manager.SqlManager: Using fetchSize for next query: 1000
> 12/02/27 10:04:57 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM "kdi_eligibility" AS t LIMIT 1
> 12/02/27 10:04:58 ERROR tool.ImportTool: Imported Failed: Column name '........pg.dropped.16........' not in table
>
> And, for reference, the table is defined as follows:
>
> CREATE TABLE kdi_eligibility (
>     state varchar(2) NOT NULL,
>     fyq varchar(5) NULL,
>     link_key numeric(8, 0) NOT NULL,
>     rectype varchar(1) NULL,
>     ident varchar(20) NULL,
>     xxx varchar(1) NULL,
>     county varchar(3) NULL,
>     zip varchar(5) NULL,
>     hhh varchar(12) NULL,
>     casenum varchar(12) NULL,
>     birth_dt datetime NULL,
>     death_dt datetime NULL,
>     idnum varchar(9) NULL,
>     s_indicator varchar(1) NULL,
>     e_indicator varchar(1) NULL,
>     xrefnum varchar(12) NULL
> )
>
> Thanks!
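P.S. As a possible workaround until SQOOP-445 gets resolved, rewriting the table on the PostgreSQL side should get rid of the dropped-column entries in the catalog. Roughly along these lines (an untested sketch; note that CREATE TABLE AS does not carry over constraints, defaults or indexes, so those would need to be recreated):

    -- Copy the data into a fresh table; the new table has no
    -- '........pg.dropped.N........' entries in pg_attribute.
    CREATE TABLE kdi_eligibility_new AS SELECT * FROM kdi_eligibility;
    -- After verifying the copy:
    -- DROP TABLE kdi_eligibility;
    -- ALTER TABLE kdi_eligibility_new RENAME TO kdi_eligibility;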