pan3793 commented on pull request #1918:
URL: https://github.com/apache/incubator-kyuubi/pull/1918#issuecomment-1046239940
I gave it a try this weekend on an Android ARM64 machine.
Brief environment information:
```
╲ ▁▂▂▂▁ ╱
▄███████▄
▄██ ███ ██▄
▄███████████▄ OS: Android 11
▄█ ▄▄▄▄▄▄▄▄▄▄▄▄▄ █▄ Device: Lenovo TB-J716F (J716F)
██ █████████████ ██ ROM: TB-J716F_CN_OPEN_USER_Q00209.3_R_ZUI_13.0.430_ST_220113
██ █████████████ ██ Baseband: apq
██ █████████████ ██ Kernel: aarch64 Linux 4.19.152-perf+
██ █████████████ ██ Uptime:
█████████████ CPU: Qualcomm Technologies, Inc KONA
███████████ GPU: Qualcomm Technologies, Inc KONA
██ ██ RAM: 3027MiB / 5667MiB
██ ██
```
The only JDK I could get is the JDK 17 provided by the OS package manager `apt`:
```
openjdk version "17-internal" 2021-09-14
OpenJDK Runtime Environment (build 17-internal+0-adhoc..src)
OpenJDK 64-Bit Server VM (build 17-internal+0-adhoc..src, mixed mode)
```
I encountered some issues related to JDK 17; they are partially fixed, but not all UTs pass yet.
Test command:
```
build/mvn install -Dzookeeper.version=3.5.9 \
  -Dmaven.plugin.scalatest.exclude.tags=org.apache.kyuubi.tags.ExtendedSQLTest,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.HudiTest
```
Related PRs and issues:
- https://github.com/apache/incubator-kyuubi/pull/1939
- https://github.com/apache/incubator-kyuubi/pull/1940
- https://github.com/apache/incubator-kyuubi/issues/1941
- https://github.com/apache/incubator-kyuubi/pull/1943
Current status:

`kyuubi-flink-sql-engine` fails because Flink 1.14 depends on `asm-7`, which does not support JDK 17 class files:
```
Caused by: org.apache.flink.table.api.ValidationException: Unable to extract a type inference from method:
public java.lang.String LowerUDF.eval(java.lang.String)
	at org.apache.flink.table.types.extraction.ExtractionUtils.extractionError(ExtractionUtils.java:362)
	at org.apache.flink.table.types.extraction.FunctionMappingExtractor.extractResultMappings(FunctionMappingExtractor.java:183)
	at org.apache.flink.table.types.extraction.FunctionMappingExtractor.extractOutputMapping(FunctionMappingExtractor.java:114)
	... 29 more
Caused by: java.lang.IllegalArgumentException: Unsupported class file major version 61
	at org.apache.flink.shaded.asm7.org.objectweb.asm.ClassReader.<init>(ClassReader.java:195)
	at org.apache.flink.shaded.asm7.org.objectweb.asm.ClassReader.<init>(ClassReader.java:176)
	at org.apache.flink.shaded.asm7.org.objectweb.asm.ClassReader.<init>(ClassReader.java:162)
	at org.apache.flink.shaded.asm7.org.objectweb.asm.ClassReader.<init>(ClassReader.java:283)
	at org.apache.flink.table.types.extraction.ExtractionUtils.getClassReader(ExtractionUtils.java:747)
	at org.apache.flink.table.types.extraction.ExtractionUtils.extractExecutableNames(ExtractionUtils.java:721)
	at org.apache.flink.table.types.extraction.ExtractionUtils.extractMethodParameterNames(ExtractionUtils.java:656)
	at org.apache.flink.table.types.extraction.FunctionMappingExtractor.extractArgumentNames(FunctionMappingExtractor.java:429)
	at org.apache.flink.table.types.extraction.FunctionMappingExtractor.lambda$createParameterSignatureExtraction$9(FunctionMappingExtractor.java:367)
	at org.apache.flink.table.types.extraction.FunctionMappingExtractor.putExtractedResultMappings(FunctionMappingExtractor.java:324)
	at org.apache.flink.table.types.extraction.FunctionMappingExtractor.collectMethodMappings(FunctionMappingExtractor.java:269)
	at org.apache.flink.table.types.extraction.FunctionMappingExtractor.extractResultMappings(FunctionMappingExtractor.java:169)
```
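For context, "class file major version 61" means a class compiled for Java 17 (the major version is the Java feature release plus 44), which ASM 7 predates and rejects. A minimal sketch of how that version can be read straight from class bytes, independent of ASM (the class name `ClassFileVersion` is mine, not from Kyuubi or Flink):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;

public class ClassFileVersion {

    // Reads the class-file major version from raw class bytes.
    // Layout: u4 magic (0xCAFEBABE), u2 minor_version, u2 major_version.
    static int majorVersion(byte[] classBytes) throws IOException {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(classBytes));
        if (in.readInt() != 0xCAFEBABE) {
            throw new IOException("not a class file");
        }
        in.readUnsignedShort();          // minor version, unused here
        return in.readUnsignedShort();   // major version; 61 == Java 17
    }

    public static void main(String[] args) throws IOException {
        // Inspect this very class as loaded from the classpath.
        byte[] bytes = ClassFileVersion.class
            .getResourceAsStream("ClassFileVersion.class")
            .readAllBytes();
        int major = majorVersion(bytes);
        System.out.println("major=" + major + " -> Java " + (major - 44));
    }
}
```

Flink 1.15 upgraded its shaded ASM, so this particular failure is bound to the Flink version, not to Kyuubi itself.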
`kyuubi-spark-sql-engine` fails in the Delta tests with the stack trace below; it passes with `-Dmaven.plugin.scalatest.exclude.tags=org.apache.kyuubi.tags.ExtendedSQLTest,org.apache.kyuubi.tags.DeltaTest,org.apache.kyuubi.tags.IcebergTest,org.apache.kyuubi.tags.HudiTest`.
```
Caused by: java.lang.IllegalArgumentException: newLimit > capacity: (84 > 78)
	at java.base/java.nio.Buffer.createLimitException(Buffer.java:395)
	at java.base/java.nio.Buffer.limit(Buffer.java:369)
	at java.base/java.nio.ByteBuffer.limit(ByteBuffer.java:1529)
	at java.base/java.nio.MappedByteBuffer.limit(MappedByteBuffer.java:330)
	at java.base/java.nio.MappedByteBuffer.limit(MappedByteBuffer.java:73)
	at org.xerial.snappy.Snappy.compress(Snappy.java:156)
	at org.apache.parquet.hadoop.codec.SnappyCompressor.compress(SnappyCompressor.java:78)
	at org.apache.hadoop.io.compress.CompressorStream.compress(CompressorStream.java:81)
	at org.apache.hadoop.io.compress.CompressorStream.finish(CompressorStream.java:92)
	at org.apache.parquet.hadoop.CodecFactory$HeapBytesCompressor.compress(CodecFactory.java:167)
	at org.apache.parquet.hadoop.ColumnChunkPageWriteStore$ColumnChunkPageWriter.writePage(ColumnChunkPageWriteStore.java:168)
	at org.apache.parquet.column.impl.ColumnWriterV1.writePage(ColumnWriterV1.java:59)
	at org.apache.parquet.column.impl.ColumnWriterBase.writePage(ColumnWriterBase.java:387)
	at org.apache.parquet.column.impl.ColumnWriteStoreBase.flush(ColumnWriteStoreBase.java:186)
	at org.apache.parquet.column.impl.ColumnWriteStoreV1.flush(ColumnWriteStoreV1.java:29)
	at org.apache.parquet.hadoop.InternalParquetRecordWriter.flushRowGroupToStore(InternalParquetRecordWriter.java:185)
	at org.apache.parquet.hadoop.InternalParquetRecordWriter.close(InternalParquetRecordWriter.java:124)
	at org.apache.parquet.hadoop.ParquetRecordWriter.close(ParquetRecordWriter.java:164)
	at org.apache.spark.sql.execution.datasources.parquet.ParquetOutputWriter.close(ParquetOutputWriter.scala:41)
	at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseCurrentWriter(FileFormatDataWriter.scala:64)
	at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.releaseResources(FileFormatDataWriter.scala:75)
	at org.apache.spark.sql.execution.datasources.FileFormatDataWriter.commit(FileFormatDataWriter.scala:105)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$executeTask$1(FileFormatWriter.scala:305)
	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1496)
	at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:311)
```
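The exception itself is easy to reproduce in isolation: `Buffer.limit` throws `IllegalArgumentException` whenever the requested limit exceeds the buffer's capacity, which is what snappy-java runs into inside `Snappy.compress`. A tiny sketch using the same numbers as the stack trace (the class name `LimitDemo` is mine; this only demonstrates the NIO invariant, not the snappy-java root cause):

```java
import java.nio.ByteBuffer;

public class LimitDemo {
    public static void main(String[] args) {
        // Same numbers as in the stack trace: capacity 78, requested limit 84.
        ByteBuffer buf = ByteBuffer.allocate(78);
        try {
            buf.limit(84); // invalid: newLimit (84) > capacity (78)
        } catch (IllegalArgumentException e) {
            // On JDK 9+ the message reads "newLimit > capacity: (84 > 78)"
            System.out.println(e.getMessage());
        }
    }
}
```

So the compressor hands snappy a destination buffer that is smaller than the compressed output it tries to write; whether that is a snappy-java bug surfaced by JDK 17 or an environment quirk on this device still needs digging.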
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]