[ https://issues.apache.org/jira/browse/HIVE-1435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Paul Yang updated HIVE-1435:
----------------------------

    Description: We recently upgraded from DataNucleus 1.0 to 2.0, which changed some of the defaults for how field names get mapped to datastore identifiers. Because of this change, connecting to an existing database would throw exceptions such as:

2010-06-24 17:59:09,854 ERROR exec.DDLTask (SessionState.java:printError(277)) - FAILED: Error in metadata: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.mstoragedescrip...@4ccd21c" using statement "INSERT INTO `SDS` (`SD_ID`,`NUM_BUCKETS`,`INPUT_FORMAT`,`OUTPUT_FORMAT`,`LOCATION`,`SERDE_ID`,`ISCOMPRESSED`) VALUES (?,?,?,?,?,?,?)" failed : Unknown column 'ISCOMPRESSED' in 'field list'
NestedThrowables:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'ISCOMPRESSED' in 'field list'
org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.mstoragedescrip...@4ccd21c" using statement "INSERT INTO `SDS` (`SD_ID`,`NUM_BUCKETS`,`INPUT_FORMAT`,`OUTPUT_FORMAT`,`LOCATION`,`SERDE_ID`,`ISCOMPRESSED`) VALUES (?,?,?,?,?,?,?)" failed : Unknown column 'ISCOMPRESSED' in 'field list'
NestedThrowables:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'ISCOMPRESSED' in 'field list'
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:325)
        at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:2012)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:144)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:107)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:633)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:506)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:384)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:302)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

    was: We recently upgraded from DataNucleus 1.0 to 2.0, which changed some of the defaults for how field names get mapped to datastore identifiers.
Because of this change, connecting to an existing database would throw exceptions such as:

{code}
2010-06-24 17:59:09,854 ERROR exec.DDLTask (SessionState.java:printError(277)) - FAILED: Error in metadata: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.mstoragedescrip...@4ccd21c" using statement "INSERT INTO `SDS` (`SD_ID`,`NUM_BUCKETS`,`INPUT_FORMAT`,`OUTPUT_FORMAT`,`LOCATION`,`SERDE_ID`,`ISCOMPRESSED`) VALUES (?,?,?,?,?,?,?)" failed : Unknown column 'ISCOMPRESSED' in 'field list'
NestedThrowables:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'ISCOMPRESSED' in 'field list'
org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.mstoragedescrip...@4ccd21c" using statement "INSERT INTO `SDS` (`SD_ID`,`NUM_BUCKETS`,`INPUT_FORMAT`,`OUTPUT_FORMAT`,`LOCATION`,`SERDE_ID`,`ISCOMPRESSED`) VALUES (?,?,?,?,?,?,?)" failed : Unknown column 'ISCOMPRESSED' in 'field list'
NestedThrowables:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'ISCOMPRESSED' in 'field list'
        at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:325)
        at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:2012)
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:144)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:107)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:633)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:506)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:384)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:302)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
{code}


> Upgraded naming scheme causes JDO exceptions
> --------------------------------------------
>
>                 Key: HIVE-1435
>                 URL: https://issues.apache.org/jira/browse/HIVE-1435
>             Project: Hadoop Hive
>          Issue Type: Bug
>          Components: Metastore
>    Affects Versions: 0.6.0, 0.7.0
>            Reporter: Paul Yang
>            Assignee: Paul Yang
>
> We recently upgraded from DataNucleus 1.0 to 2.0, which changed some of the defaults for how field names get mapped to datastore identifiers.
> Because of this change, connecting to an existing database would throw exceptions such as:
> 2010-06-24 17:59:09,854 ERROR exec.DDLTask (SessionState.java:printError(277)) - FAILED: Error in metadata: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.mstoragedescrip...@4ccd21c" using statement "INSERT INTO `SDS` (`SD_ID`,`NUM_BUCKETS`,`INPUT_FORMAT`,`OUTPUT_FORMAT`,`LOCATION`,`SERDE_ID`,`ISCOMPRESSED`) VALUES (?,?,?,?,?,?,?)" failed : Unknown column 'ISCOMPRESSED' in 'field list'
> NestedThrowables:
> com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'ISCOMPRESSED' in 'field list'
> org.apache.hadoop.hive.ql.metadata.HiveException: javax.jdo.JDODataStoreException: Insert of object "org.apache.hadoop.hive.metastore.model.mstoragedescrip...@4ccd21c" using statement "INSERT INTO `SDS` (`SD_ID`,`NUM_BUCKETS`,`INPUT_FORMAT`,`OUTPUT_FORMAT`,`LOCATION`,`SERDE_ID`,`ISCOMPRESSED`) VALUES (?,?,?,?,?,?,?)" failed : Unknown column 'ISCOMPRESSED' in 'field list'
> NestedThrowables:
> com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'ISCOMPRESSED' in 'field list'
>         at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:325)
>         at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:2012)
>         at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:144)
>         at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:107)
>         at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:55)
>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:633)
>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:506)
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:384)
>         at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
>         at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:302)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)

--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
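
The failure above comes down to which DataNucleus "identifier factory" is in effect: DataNucleus 2.0 defaults to a naming scheme that maps a field such as isCompressed to the column ISCOMPRESSED, while schemas created under the 1.x defaults typically use the underscore style (e.g. IS_COMPRESSED), so the generated INSERT no longer matches the existing tables. One way to keep an existing schema working is to pin the 1.x-compatible identifier factory through the datanucleus.identifierFactory property. The sketch below is a minimal, hypothetical stand-alone example of where that property would be set when building a JDO PersistenceManagerFactory; it is not Hive's actual metastore bootstrap code, the connection URL, credentials, and class name are placeholders, and the exact property value should be verified against the DataNucleus release in use.

{code}
import java.util.Properties;

import javax.jdo.JDOHelper;
import javax.jdo.PersistenceManagerFactory;

// Hypothetical sketch: shows where the DataNucleus identifier-factory setting
// goes when opening a PersistenceManagerFactory against a schema created with
// the 1.x naming defaults. Names and connection details are placeholders.
public class IdentifierFactoryExample {
    public static void main(String[] args) {
        Properties props = new Properties();

        // Standard JDO connection settings; substitute the real metastore values.
        props.setProperty("javax.jdo.option.ConnectionURL",
                "jdbc:mysql://localhost:3306/metastore");
        props.setProperty("javax.jdo.option.ConnectionDriverName",
                "com.mysql.jdbc.Driver");
        props.setProperty("javax.jdo.option.ConnectionUserName", "hive");
        props.setProperty("javax.jdo.option.ConnectionPassword", "password");

        // DataNucleus 2.0 defaults to a newer identifier factory that maps the
        // field isCompressed to the column ISCOMPRESSED. Pinning the 1.x-style
        // factory keeps the older underscore-separated names, so SQL generated
        // against an existing schema continues to line up with its columns.
        props.setProperty("datanucleus.identifierFactory", "datanucleus1");

        // Depending on the JDO/DataNucleus versions on the classpath, the
        // implementation may also need to be named explicitly via
        // javax.jdo.PersistenceManagerFactoryClass.
        PersistenceManagerFactory pmf = JDOHelper.getPersistenceManagerFactory(props);
        System.out.println("Connected to " + pmf.getConnectionURL());
    }
}
{code}

The same property can equally be supplied through the metastore's JDO configuration (for example in hive-site.xml) rather than in code; the essential point is only that the identifier factory DataNucleus uses matches the naming convention of the tables that already exist.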