[ https://issues.apache.org/jira/browse/FLINK-34036?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17805708#comment-17805708 ]
Matthias Pohl edited comment on FLINK-34036 at 1/11/24 4:55 PM:
----------------------------------------------------------------
The problem seems to be that we're not building the Flink artifacts with the
PROFILE, which causes issues when running the tests with the PROFILE. That is
also why it was possible to reproduce it locally: I didn't trigger the clean
phase before running the tests with the profiles enabled, so the test run still
picked up the artifacts from the earlier build without the profiles.
This is also a problem in the GHA workflow configuration: the run_mvn
configuration doesn't inject the PROFILE env variable properly.
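For reference, a minimal sketch of what a consistent local build/test sequence would look like. The profile flags, property name, and module path below are placeholders I'm assuming for illustration, not the actual values from the CI setup:
{code}
# Hypothetical sketch - the real flags come from the PROFILE variable defined in the CI setup.
export PROFILE="-Dhive.version=3.1.3"   # placeholder flags, not the actual profile definition

# Rebuild the artifacts with the same profile flags that the test run uses,
# so the compile-time and test-time classpaths match.
mvn clean install -DskipTests $PROFILE

# Run the Hive connector tests with the identical flags (module path is illustrative).
mvn verify -pl flink-connectors/flink-connector-hive $PROFILE
{code}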
was (Author: mapohl):
The problem seems to be that we're not building the Flink artifacts with the
PROFILE, which causes issues when running the tests with the PROFILE. That is
also why it was possible to reproduce it locally: I didn't trigger the clean
phase before running the tests with the profiles enabled.
> Various HiveDialectQueryITCase tests fail in GitHub Actions workflow with Hadoop 3.1.3 enabled
> ----------------------------------------------------------------------------------------------
>
> Key: FLINK-34036
> URL: https://issues.apache.org/jira/browse/FLINK-34036
> Project: Flink
> Issue Type: Sub-task
> Components: Connectors / Hadoop Compatibility, Connectors / Hive
> Affects Versions: 1.18.0, 1.19.0
> Reporter: Matthias Pohl
> Priority: Major
> Labels: github-actions, test-stability
>
> The following {{HiveDialectQueryITCase}} tests fail consistently in the FLINK-27075 GitHub Actions [master nightly workflow|https://github.com/XComp/flink/actions/workflows/nightly-dev.yml] of Flink (and also the [release-1.18 workflow|https://github.com/XComp/flink/actions/workflows/nightly-current.yml]):
> * {{testInsertDirectory}}
> * {{testCastTimeStampToDecimal}}
> * {{testNullLiteralAsArgument}}
> {code}
> Error: 03:38:45 03:38:45.661 [ERROR] Tests run: 22, Failures: 1, Errors: 2, Skipped: 0, Time elapsed: 379.0 s <<< FAILURE! -- in org.apache.flink.connectors.hive.HiveDialectQueryITCase
> Error: 03:38:45 03:38:45.662 [ERROR] org.apache.flink.connectors.hive.HiveDialectQueryITCase.testNullLiteralAsArgument -- Time elapsed: 0.069 s <<< ERROR!
> Jan 09 03:38:45 java.lang.NoSuchMethodError: org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils.getTimestamp(Ljava/lang/Object;Lorg/apache/hadoop/hive/serde2/objectinspector/PrimitiveObjectInspector;)Ljava/sql/Timestamp;
> Jan 09 03:38:45 at org.apache.flink.connectors.hive.HiveDialectQueryITCase.testNullLiteralAsArgument(HiveDialectQueryITCase.java:959)
> Jan 09 03:38:45 at java.lang.reflect.Method.invoke(Method.java:498)
> Jan 09 03:38:45
> Error: 03:38:45 03:38:45.662 [ERROR] org.apache.flink.connectors.hive.HiveDialectQueryITCase.testCastTimeStampToDecimal -- Time elapsed: 0.007 s <<< ERROR!
> Jan 09 03:38:45 org.apache.flink.table.api.ValidationException: Table with identifier 'test-catalog.default.t1' does not exist.
> Jan 09 03:38:45 at org.apache.flink.table.catalog.CatalogManager.dropTableInternal(CatalogManager.java:1266)
> Jan 09 03:38:45 at org.apache.flink.table.catalog.CatalogManager.dropTable(CatalogManager.java:1206)
> Jan 09 03:38:45 at org.apache.flink.table.operations.ddl.DropTableOperation.execute(DropTableOperation.java:74)
> Jan 09 03:38:45 at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1107)
> Jan 09 03:38:45 at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:735)
> Jan 09 03:38:45 at org.apache.flink.connectors.hive.HiveDialectQueryITCase.testCastTimeStampToDecimal(HiveDialectQueryITCase.java:835)
> Jan 09 03:38:45 at java.lang.reflect.Method.invoke(Method.java:498)
> Jan 09 03:38:45
> Error: 03:38:45 03:38:45.663 [ERROR] org.apache.flink.connectors.hive.HiveDialectQueryITCase.testInsertDirectory -- Time elapsed: 7.326 s <<< FAILURE!
> Jan 09 03:38:45 org.opentest4j.AssertionFailedError:
> Jan 09 03:38:45
> Jan 09 03:38:45 expected: "A:english=90#math=100#history=85"
> Jan 09 03:38:45 but was: "A:english=90math=100history=85"
> Jan 09 03:38:45 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> Jan 09 03:38:45 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> Jan 09 03:38:45 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> Jan 09 03:38:45 at org.apache.flink.connectors.hive.HiveDialectQueryITCase.testInsertDirectory(HiveDialectQueryITCase.java:498)
> Jan 09 03:38:45 at java.lang.reflect.Method.invoke(Method.java:498)
> {code}
> Additionally, the {{HiveITCase}} in the e2e test suite is affected:
> {code}
> Error: 05:20:20 05:20:20.949 [ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.106 s <<< FAILURE! -- in org.apache.flink.tests.hive.HiveITCase
> Error: 05:20:20 05:20:20.949 [ERROR] org.apache.flink.tests.hive.HiveITCase -- Time elapsed: 0.106 s <<< ERROR!
> Jan 07 05:20:20 java.lang.ExceptionInInitializerError
> Jan 07 05:20:20 at sun.misc.Unsafe.ensureClassInitialized(Native Method)
> Jan 07 05:20:20 at java.lang.reflect.Field.acquireFieldAccessor(Field.java:1088)
> Jan 07 05:20:20 at java.lang.reflect.Field.getFieldAccessor(Field.java:1069)
> Jan 07 05:20:20 at java.lang.reflect.Field.get(Field.java:393)
> Jan 07 05:20:20 Caused by: java.lang.RuntimeException: java.io.IOException: Multiple resource files were found matching the pattern .*sql-hive-.*.jar. Matches=[/home/runner/work/flink/flink/flink-end-to-end-tests/flink-end-to-end-tests-hive/target/dependencies/sql-hive-2.3.9_2.12.jar, /home/runner/work/flink/flink/flink-end-to-end-tests/flink-end-to-end-tests-hive/target/dependencies/sql-hive-3.1.3_2.12.jar]
> Jan 07 05:20:20 at org.apache.flink.test.resources.ResourceTestUtils.getResource(ResourceTestUtils.java:76)
> Jan 07 05:20:20 at org.apache.flink.tests.hive.HiveITCase.<clinit>(HiveITCase.java:88)
> Jan 07 05:20:20 ... 4 more
> Jan 07 05:20:20 Caused by: java.io.IOException: Multiple resource files were found matching the pattern .*sql-hive-.*.jar. Matches=[/home/runner/work/flink/flink/flink-end-to-end-tests/flink-end-to-end-tests-hive/target/dependencies/sql-hive-2.3.9_2.12.jar, /home/runner/work/flink/flink/flink-end-to-end-tests/flink-end-to-end-tests-hive/target/dependencies/sql-hive-3.1.3_2.12.jar]
> Jan 07 05:20:20 ... 6 more
> {code}
> The most recent build failures in the GHA workflows are:
> * https://github.com/XComp/flink/actions/runs/7455836411/job/20285758541#step:12:23332
> * https://github.com/XComp/flink/actions/runs/7447254277/job/20259593089#step:12:23378
> * https://github.com/XComp/flink/actions/runs/7442459819/job/20246101021#step:12:23332
> * https://github.com/XComp/flink/actions/runs/7438111934/job/20236674470#step:12:23375
> * https://github.com/XComp/flink/actions/runs/7435499743/job/20231030744#step:12:23367
> Interestingly, the failure doesn't appear in the Azure Pipelines nightlies.