[
https://issues.apache.org/jira/browse/HIVE-21987?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16923179#comment-16923179
]
Hive QA commented on HIVE-21987:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12979387/HIVE-21987.3.patch
{color:red}ERROR:{color} -1 due to build exiting with an error
Test results:
https://builds.apache.org/job/PreCommit-HIVE-Build/18449/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/18449/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-18449/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit
status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2019-09-05 08:22:37.248
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+
PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-18449/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2019-09-05 08:22:37.251
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at ebcc9bc HIVE-22161: UDF: FunctionRegistry synchronizes on
org.apache.hadoop.hive.ql.udf.UDFType class (Gopal V, reviewed by Ashutosh
Chauhan)
+ git clean -f -d
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at ebcc9bc HIVE-22161: UDF: FunctionRegistry synchronizes on
org.apache.hadoop.hive.ql.udf.UDFType class (Gopal V, reviewed by Ashutosh
Chauhan)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2019-09-05 08:22:38.293
+ rm -rf ../yetus_PreCommit-HIVE-Build-18449
+ mkdir ../yetus_PreCommit-HIVE-Build-18449
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-18449
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-18449/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh
/data/hiveptest/working/scratch/build.patch
error: cannot apply binary patch to 'data/files/parquet_int_decimal_1.parquet'
without full index line
Falling back to three-way merge...
error: cannot apply binary patch to 'data/files/parquet_int_decimal_1.parquet'
without full index line
error: data/files/parquet_int_decimal_1.parquet: patch does not apply
error: cannot apply binary patch to 'data/files/parquet_int_decimal_2.parquet'
without full index line
Falling back to three-way merge...
error: cannot apply binary patch to 'data/files/parquet_int_decimal_2.parquet'
without full index line
error: data/files/parquet_int_decimal_2.parquet: patch does not apply
error: cannot apply binary patch to 'files/parquet_int_decimal_1.parquet'
without full index line
Falling back to three-way merge...
error: cannot apply binary patch to 'files/parquet_int_decimal_1.parquet'
without full index line
error: files/parquet_int_decimal_1.parquet: patch does not apply
error: cannot apply binary patch to 'files/parquet_int_decimal_2.parquet'
without full index line
Falling back to three-way merge...
error: cannot apply binary patch to 'files/parquet_int_decimal_2.parquet'
without full index line
error: files/parquet_int_decimal_2.parquet: patch does not apply
error:
src/java/org/apache/hadoop/hive/ql/io/parquet/convert/ETypeConverter.java: does
not exist in index
error:
src/java/org/apache/hadoop/hive/ql/io/parquet/vector/ParquetDataColumnReaderFactory.java:
does not exist in index
error: src/test/results/clientpositive/type_change_test_fraction.q.out: does
not exist in index
error: cannot apply binary patch to 'parquet_int_decimal_1.parquet' without
full index line
Falling back to three-way merge...
error: cannot apply binary patch to 'parquet_int_decimal_1.parquet' without
full index line
error: parquet_int_decimal_1.parquet: patch does not apply
error: cannot apply binary patch to 'parquet_int_decimal_2.parquet' without
full index line
Falling back to three-way merge...
error: cannot apply binary patch to 'parquet_int_decimal_2.parquet' without
full index line
error: parquet_int_decimal_2.parquet: patch does not apply
error: java/org/apache/hadoop/hive/ql/io/parquet/convert/ETypeConverter.java:
does not exist in index
error:
java/org/apache/hadoop/hive/ql/io/parquet/vector/ParquetDataColumnReaderFactory.java:
does not exist in index
error: test/results/clientpositive/type_change_test_fraction.q.out: does not
exist in index
The patch does not appear to apply with p0, p1, or p2
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-18449
+ exit 1
'
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12979387 - PreCommit-HIVE-Build
> Hive is unable to read Parquet int32 annotated with decimal
> -----------------------------------------------------------
>
> Key: HIVE-21987
> URL: https://issues.apache.org/jira/browse/HIVE-21987
> Project: Hive
> Issue Type: Improvement
> Reporter: Nandor Kollar
> Assignee: Marta Kuczora
> Priority: Major
> Attachments: HIVE-21987.1.patch, HIVE-21987.2.patch,
> HIVE-21987.3.patch,
> part-00000-e5287735-8dcf-4dda-9c6e-4d5c98dc15f2-c000.snappy.parquet
>
>
> When I tried to read a Parquet file from a Hive table (using the Tez
> execution engine) with a small decimal column, I got the following exception:
> {code}
> Caused by: java.lang.UnsupportedOperationException:
> org.apache.hadoop.hive.ql.io.parquet.convert.ETypeConverter$8$1
> at
> org.apache.parquet.io.api.PrimitiveConverter.addInt(PrimitiveConverter.java:98)
> at
> org.apache.parquet.column.impl.ColumnReaderImpl$2$3.writeValue(ColumnReaderImpl.java:248)
> at
> org.apache.parquet.column.impl.ColumnReaderImpl.writeCurrentValueToConverter(ColumnReaderImpl.java:367)
> at
> org.apache.parquet.io.RecordReaderImplementation.read(RecordReaderImplementation.java:406)
> at
> org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue(InternalParquetRecordReader.java:226)
> ... 28 more
> {code}
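> The trace suggests the root cause: the decimal converter (the anonymous
> {{ETypeConverter$8$1}}) appears to implement only the binary value path, so
> when the column's physical type is int32 the reader falls through to the
> inherited {{PrimitiveConverter.addInt}}, whose default throws. A minimal
> sketch of this pattern, using simplified stand-in classes rather than the
> real parquet-mr API:

```java
import java.math.BigDecimal;
import java.math.BigInteger;

// Simplified stand-in for org.apache.parquet.io.api.PrimitiveConverter:
// each add* method throws unless a subclass overrides it.
abstract class PrimitiveConverterSketch {
    public void addInt(int value) {
        throw new UnsupportedOperationException(getClass().getName());
    }
    public void addBinary(byte[] unscaledBytes) {
        throw new UnsupportedOperationException(getClass().getName());
    }
}

// A decimal converter that handles only the binary representation,
// analogous to the pre-patch ETypeConverter$8$1 (names are illustrative).
class BinaryOnlyDecimalConverter extends PrimitiveConverterSketch {
    BigDecimal last;

    @Override
    public void addBinary(byte[] unscaledBytes) {
        // Interpret the bytes as an unscaled integer with scale 2,
        // matching decimal(4, 2).
        last = new BigDecimal(new BigInteger(unscaledBytes), 2);
    }
    // Missing: an addInt override, so int32-annotated decimals fall
    // through to the base class and throw UnsupportedOperationException.
}

class ConverterDemo {
    public static void main(String[] args) {
        BinaryOnlyDecimalConverter c = new BinaryOnlyDecimalConverter();
        c.addBinary(BigInteger.valueOf(1234).toByteArray());
        System.out.println(c.last); // prints 12.34
        try {
            // What the reader does for an int32-backed decimal(4, 2) column:
            c.addInt(1234);
        } catch (UnsupportedOperationException e) {
            System.out.println("UnsupportedOperationException: " + e.getMessage());
        }
    }
}
```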
> Steps to reproduce:
> - Create a Hive table with a single decimal(4, 2) column
> - Create a Parquet file with an int32 column annotated with the decimal(4, 2)
> logical type and put it into the previously created table's location (or use
> the attached Parquet file; in this case the column should be named 'd' so
> that the Hive schema matches the Parquet schema in the file)
> - Execute a {{select *}} on this table
> Also, I'm afraid that similar problems can happen with int64 decimals too.
> The [Parquet specification|https://github.com/apache/parquet-format/blob/master/LogicalTypes.md]
> allows both of these representations.
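> The steps above can be sketched in HiveQL (the table name and warehouse path
> are illustrative, not taken from the report):

```sql
-- Step 1: a Hive table with a single decimal(4, 2) column, stored as Parquet
CREATE TABLE decimal_repro (d decimal(4, 2)) STORED AS PARQUET;

-- Step 2: place a Parquet file whose column is int32 annotated with the
-- decimal(4, 2) logical type (e.g. the attached file, with its column named
-- 'd') into the table's warehouse directory, for example:
--   hdfs dfs -put <file>.parquet /path/to/warehouse/decimal_repro/

-- Step 3: reading the table fails with the UnsupportedOperationException above
SELECT * FROM decimal_repro;
```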
--
This message was sent by Atlassian Jira
(v8.3.2#803003)