Yulei Wang created SQOOP-3247:
---------------------------------

             Summary: Sqoop doesn't handle unsigned bigints at least with MySQL
                 Key: SQOOP-3247
                 URL: https://issues.apache.org/jira/browse/SQOOP-3247
             Project: Sqoop
          Issue Type: Bug
          Components: connectors/mysql
    Affects Versions: 1.4.6
            Reporter: Yulei Wang
             Fix For: 1.4.7
mysql> desc mysql2hdfs;
+---------+---------------------+------+-----+---------+-------+
| Field   | Type                | Null | Key | Default | Extra |
+---------+---------------------+------+-----+---------+-------+
| id      | bigint(20) unsigned | YES  |     | NULL    |       |
| address | varchar(20)         | YES  |     | NULL    |       |
+---------+---------------------+------+-----+---------+-------+
2 rows in set (0.00 sec)

mysql> select * from mysql2hdfs;
+----------------------+---------+
| id                   | address |
+----------------------+---------+
| 18446744073709551615 | suzhou  |
| 18446744073709551615 | suzhou  |
+----------------------+---------+
2 rows in set (0.00 sec)

Importing this table gets the following error:

17/10/30 15:05:01 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: com.mysql.jdbc.exceptions.jdbc4.MySQLDataException: '18446744073709551615' in column '1' is outside valid range for the datatype BIGINT.
        at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:174)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:608)
        at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:625)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:494)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1326)
        at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1323)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1858)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1323)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1344)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
        at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
        at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
        at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
        at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLDataException: '18446744073709551615' in column '1' is outside valid range for the datatype BIGINT.
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
        at com.mysql.jdbc.Util.handleNewInstance(Util.java:409)
        at com.mysql.jdbc.Util.getInstance(Util.java:384)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1025)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:987)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:973)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:918)
        at com.mysql.jdbc.ResultSetImpl.throwRangeException(ResultSetImpl.java:7875)
        at com.mysql.jdbc.ResultSetImpl.parseLongAsDouble(ResultSetImpl.java:7092)
        at com.mysql.jdbc.ResultSetImpl.getLong(ResultSetImpl.java:2977)
        at com.mysql.jdbc.ResultSetImpl.getLong(ResultSetImpl.java:2942)
        at org.apache.sqoop.mapreduce.db.IntegerSplitter.split(IntegerSplitter.java:44)
        at org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:171)
        ... 23 more

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
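The root cause is visible in the trace: IntegerSplitter.split calls ResultSet.getLong on the split-column boundary values, and MySQL's BIGINT UNSIGNED maximum (2^64 - 1 = 18446744073709551615) does not fit in a signed Java long, so the driver throws the range exception before any splits are computed. A minimal sketch of computing boundaries with BigDecimal instead, which sidesteps the overflow (the class and method below are hypothetical illustrations, not Sqoop's actual fix for this issue):

```java
import java.math.BigDecimal;
import java.math.BigInteger;
import java.math.MathContext;
import java.util.ArrayList;
import java.util.List;

public class UnsignedBigintSplit {
    // MySQL BIGINT UNSIGNED maximum: 2^64 - 1. This exceeds Long.MAX_VALUE
    // (2^63 - 1), which is exactly why ResultSet.getLong() throws
    // "outside valid range for the datatype BIGINT" on this value.
    static final BigInteger UNSIGNED_BIGINT_MAX =
            new BigInteger("18446744073709551615");

    // Hypothetical sketch: derive numSplits split boundaries between min and
    // max using BigDecimal arithmetic, so values above Long.MAX_VALUE work.
    // (In real code the boundaries would come from ResultSet.getBigDecimal
    // on the MIN/MAX boundary query rather than getLong.)
    static List<BigDecimal> split(BigDecimal min, BigDecimal max, int numSplits) {
        List<BigDecimal> boundaries = new ArrayList<>();
        BigDecimal step = max.subtract(min)
                .divide(BigDecimal.valueOf(numSplits), MathContext.DECIMAL128);
        BigDecimal cur = min;
        for (int i = 0; i < numSplits; i++) {
            boundaries.add(cur);
            cur = cur.add(step);
        }
        boundaries.add(max);  // close the last interval exactly at max
        return boundaries;
    }

    public static void main(String[] args) {
        // The unsigned maximum is strictly larger than Long.MAX_VALUE.
        System.out.println(UNSIGNED_BIGINT_MAX
                .compareTo(BigInteger.valueOf(Long.MAX_VALUE)) > 0);  // true

        // Four splits over the full unsigned range: five boundaries,
        // the last one being the unsigned maximum itself.
        List<BigDecimal> b = split(BigDecimal.ZERO,
                new BigDecimal(UNSIGNED_BIGINT_MAX), 4);
        System.out.println(b.size());                 // 5
        System.out.println(b.get(4).toBigInteger());  // 18446744073709551615
    }
}
```

Until the splitter handles this, a practical workaround is to avoid making Sqoop read the unsigned column as a long at all, e.g. by choosing a different --split-by column or using -m 1 so no boundary query is issued.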