jini-lee opened a new issue, #8006:
URL: https://github.com/apache/iceberg/issues/8006
### Apache Iceberg version
1.3.0 (latest release)
### Query engine
Hive
### Please describe the bug 🐞
Version: Hive 3.1
I created an external table in Hive for an Iceberg table in a path-based (Hadoop) catalog.
The data was inserted with spark-sql (3.1), and I queried the table from Hive (beeline).
A `count(*)` query works, but simply selecting data results in an error; a minimal sketch of the setup follows.
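For context, this is a sketch of how the table was set up and queried, not the exact DDL I ran: the catalog name, warehouse path, table location, schema, and values are placeholders, and the Hive registration follows the documented `HiveIcebergStorageHandler` / `location_based_table` approach for path-based tables.

```sql
-- Spark SQL 3.1 side (assumption: a Hadoop/path-based catalog named "hadoop_cat" configured with
--   spark.sql.catalog.hadoop_cat=org.apache.iceberg.spark.SparkCatalog
--   spark.sql.catalog.hadoop_cat.type=hadoop
--   spark.sql.catalog.hadoop_cat.warehouse=hdfs://nn/warehouse   -- placeholder path
CREATE TABLE hadoop_cat.test.test (data string) USING iceberg;
INSERT INTO hadoop_cat.test.test VALUES ('a'), ('b'), ('c'), ('d'), ('e'), ('f');

-- Hive 3.1 side (beeline): register the path-based Iceberg table as an external table.
CREATE EXTERNAL TABLE test.test
STORED BY 'org.apache.iceberg.mr.hive.HiveIcebergStorageHandler'
LOCATION 'hdfs://nn/warehouse/test/test'                          -- placeholder table location
TBLPROPERTIES ('iceberg.catalog'='location_based_table');

-- Works:
SELECT count(*) FROM test.test;
-- Fails with the TTransportException shown below:
SELECT data FROM test.test LIMIT 1;
```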
The beeline output is shown below.
```
INFO : Compiling command(queryId=hive_20230707101858_9c601809-04be-421c-b4eb-fbc29121130b): select count(*) from test.test
INFO : Semantic Analysis Completed (retrial = false)
INFO : Created Hive schema: Schema(fieldSchemas:[FieldSchema(name:_c0, type:bigint, comment:null)], properties:null)
INFO : Completed compiling command(queryId=hive_20230707101858_9c601809-04be-421c-b4eb-fbc29121130b); Time taken: 0.113 seconds
INFO : Executing command(queryId=hive_20230707101858_9c601809-04be-421c-b4eb-fbc29121130b): select count(*) from test.test
INFO : Query ID = hive_20230707101858_9c601809-04be-421c-b4eb-fbc29121130b
INFO : Total jobs = 1
INFO : Launching Job 1 out of 1
INFO : Starting task [Stage-1:MAPRED] in serial mode
INFO : Subscribed to counters: [] for queryId: hive_20230707101858_9c601809-04be-421c-b4eb-fbc29121130b
INFO : Session is already open
INFO : Dag name: select count(*) from test.test (Stage-1)
INFO : Status: Running (Executing on YARN cluster with App id application_1682391016059_465643)
printed operations logs
----------------------------------------------------------------------------------------------
        VERTICES      MODE        STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
----------------------------------------------------------------------------------------------
Map 1 ..........      container     SUCCEEDED      1          1        0        0       0       0
Reducer 2 ......      container     SUCCEEDED      1          1        0        0       0       0
----------------------------------------------------------------------------------------------
VERTICES: 02/02  [==========================>>] 100%  ELAPSED TIME: 1.09 s
----------------------------------------------------------------------------------------------
Getting log thread is interrupted, since query is done!
----------------------------------------------------------------------------------------------
        VERTICES      MODE        STATUS  TOTAL  COMPLETED  RUNNING  PENDING  FAILED  KILLED
----------------------------------------------------------------------------------------------
Map 1 ..........      container     SUCCEEDED      1          1        0        0       0       0
Reducer 2 ......      container     SUCCEEDED      1          1        0        0       0       0
----------------------------------------------------------------------------------------------
VERTICES: 02/02  [==========================>>] 100%  ELAPSED TIME: 1.11 s
----------------------------------------------------------------------------------------------
+------+
| _c0 |
+------+
| 6 |
+------+
0: jdbc:hive2://beta-aphm-001-adh-jp2p-prod.l> select data from test.test limit 1;
going to print operations logs
printed operations logs
Getting log thread is interrupted, since query is done!
INFO : Compiling command(queryId=hive_20230707101934_d0bbb79c-13f6-4520-932f-b2852ab19f58): select data from test.test limit 1
INFO : No Stats for test@test, Columns: data
INFO : Semantic Analysis Completed (retrial = false)
INFO : Created Hive schema: Schema(fieldSchemas:[FieldSchema(name:data, type:string, comment:null)], properties:null)
INFO : Completed compiling command(queryId=hive_20230707101934_d0bbb79c-13f6-4520-932f-b2852ab19f58); Time taken: 0.093 seconds
INFO : Executing command(queryId=hive_20230707101934_d0bbb79c-13f6-4520-932f-b2852ab19f58): select data from test.test limit 1
INFO : Completed executing command(queryId=hive_20230707101934_d0bbb79c-13f6-4520-932f-b2852ab19f58); Time taken: 0.006 seconds
INFO : OK
org.apache.thrift.transport.TTransportException
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at org.apache.thrift.transport.TSaslTransport.readLength(TSaslTransport.java:374)
    at org.apache.thrift.transport.TSaslTransport.readFrame(TSaslTransport.java:451)
    at org.apache.thrift.transport.TSaslTransport.read(TSaslTransport.java:433)
    at org.apache.thrift.transport.TSaslClientTransport.read(TSaslClientTransport.java:37)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at org.apache.hadoop.hive.metastore.security.TFilterTransport.readAll(TFilterTransport.java:62)
    at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
    at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
    at org.apache.hive.service.rpc.thrift.TCLIService$Client.recv_FetchResults(TCLIService.java:567)
    at org.apache.hive.service.rpc.thrift.TCLIService$Client.FetchResults(TCLIService.java:554)
    at sun.reflect.GeneratedMethodAccessor3.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hive.jdbc.HiveConnection$SynchronizedHandler.invoke(HiveConnection.java:1721)
    at com.sun.proxy.$Proxy24.FetchResults(Unknown Source)
    at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:378)
    at org.apache.hive.beeline.BufferedRows.<init>(BufferedRows.java:56)
    at org.apache.hive.beeline.IncrementalRowsWithNormalization.<init>(IncrementalRowsWithNormalization.java:50)
    at org.apache.hive.beeline.BeeLine.print(BeeLine.java:2322)
    at org.apache.hive.beeline.Commands.executeInternal(Commands.java:1028)
    at org.apache.hive.beeline.Commands.execute(Commands.java:1217)
    at org.apache.hive.beeline.Commands.sql(Commands.java:1146)
    at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1497)
    at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:1355)
    at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1134)
    at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1082)
    at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:546)
    at org.apache.hive.beeline.BeeLine.main(BeeLine.java:528)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
Unexpected end of file when reading from HS2 server. The root cause might be too many concurrent connections. Please ask the administrator to check the number of active connections, and adjust hive.server2.thrift.max.worker.threads if applicable.
Error: org.apache.thrift.transport.TTransportException (state=08S01,code=0)
java.sql.SQLException: org.apache.thrift.transport.TTransportException
    at org.apache.hive.jdbc.HiveStatement.closeStatementIfNeeded(HiveStatement.java:222)
    at org.apache.hive.jdbc.HiveStatement.closeClientOperation(HiveStatement.java:227)
    at org.apache.hive.jdbc.HiveStatement.close(HiveStatement.java:243)
    at org.apache.hive.beeline.Commands.executeInternal(Commands.java:1067)
    at org.apache.hive.beeline.Commands.execute(Commands.java:1217)
    at org.apache.hive.beeline.Commands.sql(Commands.java:1146)
    at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:1497)
    at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:1355)
    at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1134)
    at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:1082)
    at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:546)
    at org.apache.hive.beeline.BeeLine.main(BeeLine.java:528)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
Caused by: org.apache.thrift.transport.TTransportException
    at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at org.apache.thrift.transport.TSaslTransport.readLength(TSaslTransport.java:374)
    at org.apache.thrift.transport.TSaslTransport.readFrame(TSaslTransport.java:451)
    at org.apache.thrift.transport.TSaslTransport.read(TSaslTransport.java:433)
    at org.apache.thrift.transport.TSaslClientTransport.read(TSaslClientTransport.java:37)
    at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at org.apache.hadoop.hive.metastore.security.TFilterTransport.readAll(TFilterTransport.java:62)
    at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
    at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
    at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
    at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
    at org.apache.hive.service.rpc.thrift.TCLIService$Client.recv_CloseOperation(TCLIService.java:521)
    at org.apache.hive.service.rpc.thrift.TCLIService$Client.CloseOperation(TCLIService.java:508)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hive.jdbc.HiveConnection$SynchronizedHandler.invoke(HiveConnection.java:1721)
    at com.sun.proxy.$Proxy24.CloseOperation(Unknown Source)
    at org.apache.hive.jdbc.HiveStatement.closeStatementIfNeeded(HiveStatement.java:215)
    ... 17 more
```