rob created HIVE-22000:
--------------------------

             Summary: Trying to Create a Connection to an Oracle Data
                 Key: HIVE-22000
                 URL: https://issues.apache.org/jira/browse/HIVE-22000
             Project: Hive
          Issue Type: Bug
          Components: Hive
    Affects Versions: 3.1.1
         Environment: hdfs version
Hadoop 3.2.0
Source code repository https://github.com/apache/hadoop.git -r e97acb3bd8f3befd27418996fa5d4b50bf2e17bf
Compiled by sunilg on 2019-01-08T06:08Z
Compiled with protoc 2.5.0
From source with checksum d3f0795ed0d9dc378e2c785d3668f39

java -version
openjdk version "1.8.0_201"
OpenJDK Runtime Environment (build 1.8.0_201-b09)
OpenJDK 64-Bit Server VM (build 25.201-b09, mixed mode)

hive --version
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive 3.1.1
Git git://daijymacpro-2.local/Users/daijy/commit/hive -r f4e0529634b6231a0072295da48af466cf2f10b7
Compiled by daijy on Tue Oct 23 17:19:24 PDT 2018
From source with checksum 6deca5a8401bbb6c6b49898be6fcb80e

            Reporter: rob


Hi,

I am trying to connect to an Oracle database. I have put the relevant jar in the lib folder:

ls -la hive/lib/
-rw-rw-r-- 1 hadoop hadoop 4036257 Jul 12 15:37 ojdbc8.jar

Using beeline:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Beeline version 3.1.1 by Apache Hive
beeline> !scan
scan complete in 214ms
8 driver classes found
Compliant  Version  Driver Class
yes        6.2      com.microsoft.sqlserver.jdbc.SQLServerDriver
no         5.1      com.mysql.jdbc.Driver
yes        12.2     oracle.jdbc.OracleDriver
yes        1.16     org.apache.calcite.avatica.remote.Driver
yes        1.16     org.apache.calcite.jdbc.Driver
yes        10.14    org.apache.derby.jdbc.AutoloadedDriver
no         3.1      org.apache.hive.jdbc.HiveDriver
no         9.4      org.postgresql.Driver

If I try and connect to the database via the beeline command line:

beeline -u jdbc:oracle:thin:@//robtest1.ceo8wqiptv9v.eu-west-1.rds.amazonaws.com:1521/ORCL -n dbadmin -p ****
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Connecting to jdbc:oracle:thin:@//robtest1.ceo8wqiptv9v.eu-west-1.rds.amazonaws.com:1521/ORCL
Connected to: Oracle (version Oracle Database 12c Enterprise Edition Release 12.1.0.2.0 - 64bit Production
With the Partitioning, OLAP, Advanced Analytics and Real Application Testing options)
Driver: Oracle JDBC driver (version 12.2.0.1.0)
Error: READ_COMMITTED and SERIALIZABLE are the only valid transaction levels (state=99999,code=17030)
Beeline version 3.1.1 by Apache Hive

0: jdbc:oracle:thin:@//robtest1.ceo8wqiptv9v.> select count(*) from user_tables;
+-----------+
| COUNT(*)  |
+-----------+
| 1         |
+-----------+
1 row selected (0.376 seconds)

0: jdbc:oracle:thin:@//robtest1.ceo8wqiptv9v.> select count(*) from RobOracleTable;
+-----------+
| COUNT(*)  |
+-----------+
| 3         |
+-----------+
1 row selected (0.027 seconds)

When I try and create a table I get:

0: jdbc:hive2://> CREATE EXTERNAL TABLE RobOracleTable(
. . . . . . . . > id INT,
. . . . . . . . > names STRING
. . . . . . . . > )
. . . . . . . . > STORED BY 'org.apache.hive.storage.jdbc.JdbcStorageHandler'
. . . . . . . . > TBLPROPERTIES (
. . . . . . . . > "hive.sql.database.type" = "ORACLE",
. . . . . . . . > "hive.sql.jdbc.driver" = "oracle.jdbc.OracleDriver",
. . . . . . . . > "hive.sql.jdbc.url" = "jdbc:oracle:thin:@//robtest1.ceo8wqiptv9v.eu-west-1.rds.amazonaws.com:1521/ORCL",
. . . . . . . . > "hive.sql.query" = "select id,names from dbadmin.RobOracleTable",
. . . . . . . . > "hive.sql.dbcp.username" = "dbadmin",
. . . . . . . . > "hive.sql.dbcp.password" = "****"
. . . . . . . . > );
19/07/16 14:52:40 [HiveServer2-Background-Pool: Thread-51]: ERROR dao.GenericJdbcDatabaseAccessor: Error while trying to get column names.
org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory (ORA-01017: invalid username/password; logon denied)
        at org.apache.commons.dbcp.BasicDataSource.createPoolableConnectionFactory(BasicDataSource.java:1549) ~[commons-dbcp-1.4.jar:1.4]
        at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1388) ~[commons-dbcp-1.4.jar:1.4]
        at org.apache.commons.dbcp.BasicDataSource.getLogWriter(BasicDataSource.java:1098) ~[commons-dbcp-1.4.jar:1.4]
        at org.apache.commons.dbcp.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:350) ~[commons-dbcp-1.4.jar:1.4]
        at org.apache.hive.storage.jdbc.dao.GenericJdbcDatabaseAccessor.initializeDatabaseConnection(GenericJdbcDatabaseAccessor.java:223) ~[hive-jdbc-handler-3.1.1.jar:3.1.1]
        at org.apache.hive.storage.jdbc.dao.GenericJdbcDatabaseAccessor.getColumnNames(GenericJdbcDatabaseAccessor.java:67) [hive-jdbc-handler-3.1.1.jar:3.1.1]
        at org.apache.hive.storage.jdbc.JdbcSerDe.initialize(JdbcSerDe.java:73) [hive-jdbc-handler-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:540) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:90) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.metastore.HiveMetaStoreUtils.getDeserializer(HiveMetaStoreUtils.java:77) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:289) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:271) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.metadata.Table.getColsInternal(Table.java:663) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:646) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:869) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4913) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:428) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) [hive-exec-3.1.1.jar:3.1.1]
        at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:224) [hive-service-3.1.1.jar:3.1.1]
        at org.apache.hive.service.cli.operation.SQLOperation.access$700(SQLOperation.java:87) [hive-service-3.1.1.jar:3.1.1]
        at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:316)

I am not sure what the problem is; it is the same username and password.



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
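[Editor's note: one way to narrow this down is to check the credentials outside of Hive with a bare JDBC probe against the same URL and user from the report, with ojdbc8.jar on the classpath. This is only an illustrative sketch, not part of the original report; an ORA-01017 here would show the credentials are rejected by Oracle itself, independent of Hive, commons-dbcp, or the JdbcStorageHandler.]

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;

public class OracleProbe {
    public static void main(String[] args) {
        // Same URL and user as the hive.sql.* TBLPROPERTIES in the report;
        // the real password would be passed as the first argument.
        String url = "jdbc:oracle:thin:@//robtest1.ceo8wqiptv9v.eu-west-1.rds.amazonaws.com:1521/ORCL";
        Properties props = new Properties();
        props.setProperty("user", "dbadmin");
        props.setProperty("password", args.length > 0 ? args[0] : "****");

        System.out.println("Probing " + url + " as " + props.getProperty("user"));
        try (Connection conn = DriverManager.getConnection(url, props)) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductVersion());
        } catch (SQLException e) {
            // ORA-01017 here would confirm the logon itself is denied;
            // "No suitable driver" would mean ojdbc8.jar is not on the classpath.
            System.out.println("Failed: " + e.getMessage());
        }
    }
}
```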