Michael Smith created IMPALA-12663:
--------------------------------------
Summary: TestIcebergV2Table.test_optimize IOException
Key: IMPALA-12663
URL: https://issues.apache.org/jira/browse/IMPALA-12663
Project: IMPALA
Issue Type: Bug
Affects Versions: Impala 4.4.0
Reporter: Michael Smith
In [https://jenkins.impala.io/job/ubuntu-20.04-dockerised-tests/1016/],
query_test.test_iceberg.TestIcebergV2Table.test_optimize failed to write an
Iceberg metadata file with {{IOException: The stream is closed}}.
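
The failing statement at query_test/test_iceberg.py:1523 is the INSERT issued after OPTIMIZE. A rough standalone reproduction of that sequence against a dev minicluster might look like the sketch below; the connection string and the exact DDL/OPTIMIZE steps are assumptions inferred from the test name and trace (the test itself drives this through the Python test framework):
{code:java}
// Sketch only: requires hive-jdbc on the classpath and a running Impala
// minicluster; the table setup here is assumed, not copied from the test.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class OptimizeInsertRepro {
  public static void main(String[] args) throws Exception {
    // Hive JDBC driver pointed at Impala's HS2 port on an unsecured dev cluster.
    try (Connection conn = DriverManager.getConnection(
            "jdbc:hive2://localhost:21050/default;auth=noSasl");
         Statement stmt = conn.createStatement()) {
      stmt.execute("create table optimize_iceberg (i int) stored as iceberg");
      stmt.execute("insert into optimize_iceberg values (7)");
      stmt.execute("optimize table optimize_iceberg");
      stmt.execute("insert into optimize_iceberg values (8)"); // the step that failed
    }
  }
}
{code}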
Stack trace:
{code:java}
query_test/test_iceberg.py:1523: in test_optimize
self.execute_query("insert into {0} values (8);".format(tbl_name))
common/impala_test_suite.py:876: in wrapper
return function(*args, **kwargs)
common/impala_test_suite.py:908: in execute_query
return self.__execute_query(self.client, query, query_options)
common/impala_test_suite.py:1001: in __execute_query
return impalad_client.execute(query, user=user)
common/impala_connection.py:214: in execute
return self.__beeswax_client.execute(sql_stmt, user=user)
beeswax/impala_beeswax.py:191: in execute
handle = self.__execute_query(query_string.strip(), user=user)
beeswax/impala_beeswax.py:369: in __execute_query
self.wait_for_finished(handle)
beeswax/impala_beeswax.py:390: in wait_for_finished
raise ImpalaBeeswaxException("Query aborted:" + error_log, None)
E ImpalaBeeswaxException: ImpalaBeeswaxException:
E Query aborted:RuntimeIOException: Failed to write json to file: hdfs://172.18.0.1:20500/test-warehouse/test_optimize_a6619d09.db/optimize_iceberg/metadata/00003-7c6b3f3a-c115-4cb5-81c3-edbf0c5eead9.metadata.json
E CAUSED BY: IOException: The stream is closed {code}
Relevant logs from catalogd, showing the HDFS DataStreamer thread hitting {{ClosedByInterruptException}}:
{code:java}
W1221 07:11:32.982712 9183 DataStreamer.java:829] DataStreamer Exception
Java exception follows:
java.nio.channels.ClosedByInterruptException
at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:202)
at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:477)
at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:63)
at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:141)
at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:159)
at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:117)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
at java.io.DataOutputStream.flush(DataOutputStream.java:123)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:778)
E1221 07:11:33.003883 1061 JniUtil.java:183] 424295ecec7b510b:f711981800000000] Error in Update catalog for test_optimize_a6619d09.optimize_iceberg. Time spent: 335ms
I1221 07:11:33.005769 1061 jni-util.cc:302] 424295ecec7b510b:f711981800000000] org.apache.iceberg.exceptions.RuntimeIOException: Failed to write json to file: hdfs://172.18.0.1:20500/test-warehouse/test_optimize_a6619d09.db/optimize_iceberg/metadata/00003-7c6b3f3a-c115-4cb5-81c3-edbf0c5eead9.metadata.json
at org.apache.iceberg.TableMetadataParser.internalWrite(TableMetadataParser.java:132)
at org.apache.iceberg.TableMetadataParser.overwrite(TableMetadataParser.java:114)
at org.apache.iceberg.BaseMetastoreTableOperations.writeNewMetadata(BaseMetastoreTableOperations.java:170)
at org.apache.iceberg.BaseMetastoreTableOperations.writeNewMetadataIfRequired(BaseMetastoreTableOperations.java:160)
at org.apache.iceberg.hive.HiveTableOperations.doCommit(HiveTableOperations.java:185)
at org.apache.iceberg.BaseMetastoreTableOperations.commit(BaseMetastoreTableOperations.java:135)
at org.apache.iceberg.BaseTransaction.lambda$commitSimpleTransaction$5(BaseTransaction.java:422)
at org.apache.iceberg.util.Tasks$Builder.runTaskWithRetry(Tasks.java:413)
at org.apache.iceberg.util.Tasks$Builder.runSingleThreaded(Tasks.java:219)
at org.apache.iceberg.util.Tasks$Builder.run(Tasks.java:203)
at org.apache.iceberg.util.Tasks$Builder.run(Tasks.java:196)
at org.apache.iceberg.BaseTransaction.commitSimpleTransaction(BaseTransaction.java:418)
at org.apache.iceberg.BaseTransaction.commitTransaction(BaseTransaction.java:302)
at org.apache.impala.service.CatalogOpExecutor.updateCatalogImpl(CatalogOpExecutor.java:7042)
at org.apache.impala.service.CatalogOpExecutor.updateCatalog(CatalogOpExecutor.java:6771)
at org.apache.impala.service.JniCatalog.lambda$updateCatalog$15(JniCatalog.java:490)
at org.apache.impala.service.JniCatalogOp.lambda$execAndSerialize$1(JniCatalogOp.java:90)
at org.apache.impala.service.JniCatalogOp.execOp(JniCatalogOp.java:58)
at org.apache.impala.service.JniCatalogOp.execAndSerialize(JniCatalogOp.java:89)
at org.apache.impala.service.JniCatalogOp.execAndSerialize(JniCatalogOp.java:100)
at org.apache.impala.service.JniCatalog.execAndSerialize(JniCatalog.java:233)
at org.apache.impala.service.JniCatalog.execAndSerialize(JniCatalog.java:247)
at org.apache.impala.service.JniCatalog.updateCatalog(JniCatalog.java:489)
Caused by: java.io.IOException: The stream is closed
at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:118)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
at java.io.DataOutputStream.flush(DataOutputStream.java:123)
at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
at org.apache.hadoop.hdfs.DataStreamer.closeStream(DataStreamer.java:1015)
at org.apache.hadoop.hdfs.DataStreamer.closeInternal(DataStreamer.java:851)
at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:846)
Suppressed: java.io.IOException: The stream is closed
at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:118)
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
at java.io.FilterOutputStream.close(FilterOutputStream.java:159)
... 3 more
I1221 07:11:33.005810 1061 status.cc:129] 424295ecec7b510b:f711981800000000] RuntimeIOException: Failed to write json to file: hdfs://172.18.0.1:20500/test-warehouse/test_optimize_a6619d09.db/optimize_iceberg/metadata/00003-7c6b3f3a-c115-4cb5-81c3-edbf0c5eead9.metadata.json
CAUSED BY: IOException: The stream is closed
@ 0x106afdd impala::Status::Status()
@ 0x1900f90 impala::JniUtil::GetJniExceptionMsg()
@ 0x10442b0 impala::JniCall::Call<>()
@ 0x101aa44 impala::Catalog::UpdateCatalog()
@ 0xffd2b7 impala::CatalogServiceThriftIf::UpdateCatalog()
@ 0xfa0f32 impala::CatalogServiceProcessorT<>::process_UpdateCatalog()
@ 0xfca785 impala::CatalogServiceProcessorT<>::dispatchCall()
@ 0xf256e5 apache::thrift::TDispatchProcessor::process()
@ 0x137beb0 apache::thrift::server::TAcceptQueueServer::Task::run()
@ 0x136894b impala::ThriftThread::RunRunnable()
@ 0x136a573 boost::detail::function::void_function_obj_invoker0<>::invoke()
@ 0x19d8a80 impala::Thread::SuperviseThread()
@ 0x19d9889 boost::detail::thread_data<>::run()
@ 0x24343e7 thread_proxy
@ 0x7fe924287609 start_thread
@ 0x7fe922399353 clone
E1221 07:11:33.086104 1061 catalog-server.cc:261] 424295ecec7b510b:f711981800000000] RuntimeIOException: Failed to write json to file: hdfs://172.18.0.1:20500/test-warehouse/test_optimize_a6619d09.db/optimize_iceberg/metadata/00003-7c6b3f3a-c115-4cb5-81c3-edbf0c5eead9.metadata.json
CAUSED BY: IOException: The stream is closed{code}
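
The DataStreamer warning shows the HDFS writer thread died with {{ClosedByInterruptException}}, which NIO raises when a thread is interrupted while blocked on an interruptible channel; the interrupt closes the channel underneath, so the subsequent flush during close fails with "The stream is closed". A minimal sketch of that mechanism, unrelated to the test setup (host/port are placeholders):
{code:java}
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SocketChannel;

public class InterruptedWriteDemo {
  public static void main(String[] args) throws Exception {
    Thread writer = new Thread(() -> {
      // Placeholder endpoint; any TCP server that accepts but doesn't read works.
      try (SocketChannel ch = SocketChannel.open(new InetSocketAddress("localhost", 8080))) {
        ByteBuffer buf = ByteBuffer.allocate(1 << 20);
        while (true) {
          buf.clear();
          ch.write(buf); // blocks once the socket send buffer fills up
        }
      } catch (IOException e) {
        // Expect java.nio.channels.ClosedByInterruptException: interrupting a
        // thread blocked in I/O on an InterruptibleChannel closes the channel.
        e.printStackTrace();
      }
    });
    writer.start();
    Thread.sleep(100);
    writer.interrupt(); // closes the channel under the writer, as in DataStreamer
    writer.join();
  }
}
{code}
This only illustrates the exception chain; what interrupted the DataStreamer thread in this run is not evident from the logs.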