Hi,

Sorry for the slow reply. I have debugged this and found why the error occurs. As a quick workaround, please try using Hive 2's Beeline to connect to the SQL Gateway.
'CLI_ODBC_KEYWORDS' is a new TGetInfoType in Hive 3, and during the initialization phase of a connection Beeline sends a GetInfo request with this InfoType. Because we don't handle it yet in the HiveServer2 endpoint, the returned value is null, but the response requires this field ('infoValue') to be set. So an error occurs, the I/O is closed once the null value is detected (this is why the log shows java.net.SocketException: Socket closed), and the client can no longer communicate with the gateway. Hive 2's Beeline does not send this InfoType, so using it to submit your SQL against a Hive 3 environment should be fine. We will fix this on the gateway side later; I have appended a rough sketch of what the missing case could look like below your quoted message.

Best,
yuzelin

> On 1 Nov 2022, at 14:52, QiZhu Chan <qizhu...@163.com> wrote:
> 
> Hi team,
> I started the SQL Gateway with the HiveServer2 Endpoint and then submitted SQL with Apache Hive Beeline, but I got the following exception:
> 
> java.lang.UnsupportedOperationException: Unrecognized TGetInfoType value: CLI_ODBC_KEYWORDS.
>     at org.apache.flink.table.endpoint.hive.HiveServer2Endpoint.GetInfo(HiveServer2Endpoint.java:371) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo.getResult(TCLIService.java:1537) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo.getResult(TCLIService.java:1522) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
>     at java.lang.Thread.run(Thread.java:834) [?:?]
> 2022-11-01 13:55:33,885 ERROR org.apache.thrift.server.TThreadPoolServer [] - Thrift error occurred during processing of message.
> org.apache.thrift.protocol.TProtocolException: Required field 'infoValue' is unset!
> Struct:TGetInfoResp(status:TStatus(statusCode:ERROR_STATUS, infoMessages:[*java.lang.UnsupportedOperationException:Unrecognized TGetInfoType value: CLI_ODBC_KEYWORDS.:9:8, org.apache.flink.table.endpoint.hive.HiveServer2Endpoint:GetInfo:HiveServer2Endpoint.java:371, org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo:getResult:TCLIService.java:1537, org.apache.hive.service.rpc.thrift.TCLIService$Processor$GetInfo:getResult:TCLIService.java:1522, org.apache.thrift.ProcessFunction:process:ProcessFunction.java:39, org.apache.thrift.TBaseProcessor:process:TBaseProcessor.java:39, org.apache.thrift.server.TThreadPoolServer$WorkerProcess:run:TThreadPoolServer.java:286, java.util.concurrent.ThreadPoolExecutor:runWorker:ThreadPoolExecutor.java:1128, java.util.concurrent.ThreadPoolExecutor$Worker:run:ThreadPoolExecutor.java:628, java.lang.Thread:run:Thread.java:834], errorMessage:Unrecognized TGetInfoType value: CLI_ODBC_KEYWORDS.), infoValue:null)
>     at org.apache.hive.service.rpc.thrift.TGetInfoResp.validate(TGetInfoResp.java:379) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$GetInfo_result.validate(TCLIService.java:5228) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$GetInfo_result$GetInfo_resultStandardScheme.write(TCLIService.java:5285) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$GetInfo_result$GetInfo_resultStandardScheme.write(TCLIService.java:5254) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.hive.service.rpc.thrift.TCLIService$GetInfo_result.write(TCLIService.java:5205) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:53) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) ~[flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
>     at java.lang.Thread.run(Thread.java:834) [?:?]
> 2022-11-01 13:55:33,886 WARN org.apache.thrift.transport.TIOStreamTransport [] - Error closing output stream.
> java.net.SocketException: Socket closed
>     at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:113) ~[?:?]
>     at java.net.SocketOutputStream.write(SocketOutputStream.java:150) ~[?:?]
>     at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:81) ~[?:?]
>     at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:142) ~[?:?]
>     at java.io.FilterOutputStream.close(FilterOutputStream.java:182) ~[?:?]
>     at org.apache.thrift.transport.TIOStreamTransport.close(TIOStreamTransport.java:110) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.transport.TSocket.close(TSocket.java:235) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:303) [flink-sql-connector-hive-3.1.2_2.12-1.16.0.jar:1.16.0]
>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
>     at java.lang.Thread.run(Thread.java:834) [?:?]
> -----------------------------------------------------------------------------------------------------------
> I looked up the source code and found the following code, but I do not know how to solve the above exception.
> 
> public TGetInfoResp GetInfo(TGetInfoReq tGetInfoReq) throws TException {
>     TGetInfoResp resp = new TGetInfoResp();
>     try {
>         GatewayInfo info = service.getGatewayInfo();
>         TGetInfoValue tInfoValue;
>         switch (tGetInfoReq.getInfoType()) {
>             case CLI_SERVER_NAME:
>             case CLI_DBMS_NAME:
>                 tInfoValue = TGetInfoValue.stringValue(info.getProductName());
>                 break;
>             case CLI_DBMS_VER:
>                 tInfoValue = TGetInfoValue.stringValue(info.getVersion().toString());
>                 break;
>             default:
>                 throw new UnsupportedOperationException(
>                         String.format(
>                                 "Unrecognized TGetInfoType value: %s.",
>                                 tGetInfoReq.getInfoType()));
>         }
>         resp.setStatus(OK_STATUS);
>         resp.setInfoValue(tInfoValue);
>     } catch (Throwable t) {
>         LOG.error("Failed to GetInfo.", t);
>         resp.setStatus(toTStatus(t));
>     }
>     return resp;
> }
> 
> Flink version: 1.16.0
> Hive version: 3.1.2
> 
> sql-gateway config:
> sql-gateway.endpoint.type: hiveserver2
> sql-gateway.endpoint.hiveserver2.address: localhost
> sql-gateway.endpoint.hiveserver2.port: 9002
> sql-gateway.endpoint.hiveserver2.catalog.hive-conf-dir: /usr/local/hive/conf
> sql-gateway.endpoint.hiveserver2.thrift.host: localhost
> sql-gateway.endpoint.hiveserver2.thrift.port: 10001
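For reference, here is the rough sketch I mentioned above. The idea is to add a CLI_ODBC_KEYWORDS case to the switch in HiveServer2Endpoint#GetInfo so that the required 'infoValue' field is always set instead of the request failing. This is only a sketch of the direction, not the final fix: it assumes that answering with a (here empty) keyword string is acceptable to Hive 3's Beeline, and the actual patch may return a real keyword list instead.

    public TGetInfoResp GetInfo(TGetInfoReq tGetInfoReq) throws TException {
        TGetInfoResp resp = new TGetInfoResp();
        try {
            GatewayInfo info = service.getGatewayInfo();
            TGetInfoValue tInfoValue;
            switch (tGetInfoReq.getInfoType()) {
                case CLI_SERVER_NAME:
                case CLI_DBMS_NAME:
                    tInfoValue = TGetInfoValue.stringValue(info.getProductName());
                    break;
                case CLI_DBMS_VER:
                    tInfoValue = TGetInfoValue.stringValue(info.getVersion().toString());
                    break;
                case CLI_ODBC_KEYWORDS:
                    // Sketch only: reply to Hive 3 Beeline's GetInfo(CLI_ODBC_KEYWORDS)
                    // with an empty keyword string instead of throwing, so 'infoValue'
                    // is never left null in the Thrift response.
                    tInfoValue = TGetInfoValue.stringValue("");
                    break;
                default:
                    throw new UnsupportedOperationException(
                            String.format(
                                    "Unrecognized TGetInfoType value: %s.",
                                    tGetInfoReq.getInfoType()));
            }
            resp.setStatus(OK_STATUS);
            resp.setInfoValue(tInfoValue);
        } catch (Throwable t) {
            LOG.error("Failed to GetInfo.", t);
            resp.setStatus(toTStatus(t));
        }
        return resp;
    }

Until a release contains a change along these lines, connecting with Hive 2's Beeline (which never sends CLI_ODBC_KEYWORDS) remains the simplest workaround.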