[
https://issues.apache.org/jira/browse/HIVE-17794?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16841252#comment-16841252
]
Hive QA commented on HIVE-17794:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12901796/HIVE-17794.03.patch
{color:red}ERROR:{color} -1 due to build exiting with an error
Test results:
https://builds.apache.org/job/PreCommit-HIVE-Build/17234/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/17234/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-17234/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2019-05-16 12:04:51.210
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-17234/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2019-05-16 12:04:51.216
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at ac477b6 HIVE-21730: HiveStatement.getQueryId throws TProtocolException when response is null (Sankar Hariappan, reviewed by Mahesh Kumar Behera)
+ git clean -f -d
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at ac477b6 HIVE-21730: HiveStatement.getQueryId throws TProtocolException when response is null (Sankar Hariappan, reviewed by Mahesh Kumar Behera)
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2019-05-16 12:04:52.449
+ rm -rf ../yetus_PreCommit-HIVE-Build-17234
+ mkdir ../yetus_PreCommit-HIVE-Build-17234
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-17234
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-17234/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/hcatalog/core/src/test/java/org/apache/hive/hcatalog/MiniCluster.java: does not exist in index
error: a/hcatalog/core/src/test/java/org/apache/hive/hcatalog/data/HCatDataCheckUtil.java: does not exist in index
error: a/hcatalog/hcatalog-pig-adapter/pom.xml: does not exist in index
error: a/hcatalog/hcatalog-pig-adapter/src/main/java/org/apache/hive/hcatalog/pig/PigHCatUtil.java: does not exist in index
error: a/hcatalog/webhcat/java-client/src/test/java/org/apache/hive/hcatalog/api/TestHCatClient.java: does not exist in index
error: patch failed: hcatalog/core/src/test/java/org/apache/hive/hcatalog/data/HCatDataCheckUtil.java:55
Falling back to three-way merge...
Applied patch to 'hcatalog/core/src/test/java/org/apache/hive/hcatalog/data/HCatDataCheckUtil.java' with conflicts.
Going to apply patch with: git apply -p1
/data/hiveptest/working/scratch/build.patch:228: new blank line at EOF.
+
error: patch failed: hcatalog/core/src/test/java/org/apache/hive/hcatalog/data/HCatDataCheckUtil.java:55
Falling back to three-way merge...
Applied patch to 'hcatalog/core/src/test/java/org/apache/hive/hcatalog/data/HCatDataCheckUtil.java' with conflicts.
U hcatalog/core/src/test/java/org/apache/hive/hcatalog/data/HCatDataCheckUtil.java
warning: 1 line adds whitespace errors.
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-17234
+ exit 1
'
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12901796 - PreCommit-HIVE-Build
> HCatLoader breaks when a member is added to a struct-column of a table
> ----------------------------------------------------------------------
>
> Key: HIVE-17794
> URL: https://issues.apache.org/jira/browse/HIVE-17794
> Project: Hive
> Issue Type: Bug
> Components: HCatalog
> Affects Versions: 2.2.0, 3.0.0, 2.4.0
> Reporter: Mithun Radhakrishnan
> Assignee: Mithun Radhakrishnan
> Priority: Major
> Attachments: HIVE-17794.02.patch, HIVE-17794.03.patch, HIVE-17794.1.patch
>
>
> When a table's schema evolves to add a new member to a struct column, Hive
> queries work fine, but {{HCatLoader}} breaks with the following trace:
> {noformat}
> TaskAttempt 1 failed, info=
> Error: Failure while running task:org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: kite_composites_with_segments: Local Rearrange tuple {chararray}(false) - scope-555-> scope-974 Operator Key: scope-555): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: gup: New For Each(false,false) bag - scope-548 Operator Key: scope-548): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: gup_filtered: Filter bag - scope-522 Operator Key: scope-522): org.apache.pig.backend.executionengine.ExecException: ERROR 0: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
> at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:314)
> at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POLocalRearrange.getNextTuple(POLocalRearrange.java:287)
> at org.apache.pig.backend.hadoop.executionengine.tez.plan.operator.POLocalRearrangeTez.getNextTuple(POLocalRearrangeTez.java:127)
> at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.runPipeline(PigProcessor.java:376)
> at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.run(PigProcessor.java:241)
> at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:362)
> at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:179)
> at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:171)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1679)
> at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:171)
> at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:167)
> at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: gup: New For Each(false,false) bag - scope-548 Operator Key: scope-548): org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: gup_filtered: Filter bag - scope-522 Operator Key: scope-522): org.apache.pig.backend.executionengine.ExecException: ERROR 0: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
> at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:314)
> at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POForEach.getNextTuple(POForEach.java:252)
> at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:305)
> ... 17 more
> Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: gup_filtered: Filter bag - scope-522 Operator Key: scope-522): org.apache.pig.backend.executionengine.ExecException: ERROR 0: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
> at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:314)
> at org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POFilter.getNextTuple(POFilter.java:90)
> at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:305)
> ... 19 more
> Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 0: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
> at org.apache.pig.backend.hadoop.executionengine.tez.plan.operator.POSimpleTezLoad.getNextTuple(POSimpleTezLoad.java:160)
> at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:305)
> ... 21 more
> Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
> at org.apache.hive.hcatalog.pig.HCatBaseLoader.getNext(HCatBaseLoader.java:76)
> at org.apache.hive.hcatalog.pig.HCatLoader.getNext(HCatLoader.java:63)
> at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.nextKeyValue(PigRecordReader.java:204)
> at org.apache.tez.mapreduce.lib.MRReaderMapReduce.next(MRReaderMapReduce.java:118)
> at org.apache.pig.backend.hadoop.executionengine.tez.plan.operator.POSimpleTezLoad.getNextTuple(POSimpleTezLoad.java:140)
> ... 22 more
> Caused by: java.lang.IndexOutOfBoundsException: Index: 31, Size: 31
> at java.util.ArrayList.rangeCheck(ArrayList.java:653)
> at java.util.ArrayList.get(ArrayList.java:429)
> at org.apache.hive.hcatalog.pig.PigHCatUtil.transformToTuple(PigHCatUtil.java:468)
> at org.apache.hive.hcatalog.pig.PigHCatUtil.transformToTuple(PigHCatUtil.java:451)
> at org.apache.hive.hcatalog.pig.PigHCatUtil.extractPigObject(PigHCatUtil.java:410)
> at org.apache.hive.hcatalog.pig.PigHCatUtil.transformToTuple(PigHCatUtil.java:468)
> at org.apache.hive.hcatalog.pig.PigHCatUtil.transformToTuple(PigHCatUtil.java:386)
> at org.apache.hive.hcatalog.pig.HCatBaseLoader.getNext(HCatBaseLoader.java:64)
> ... 26 more
> {noformat}
> When populating column values, {{HCatLoader}} should have filled in nulls for the columns that do not exist in the data being read. A patch will be made available shortly.
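The failure mode in the trace above (an {{IndexOutOfBoundsException}} with Index == Size inside {{PigHCatUtil.transformToTuple}}) is what happens when the table schema declares more struct members than the stored record contains. A minimal sketch of the null-filling behavior the description asks for; {{padToSchemaSize}} is a hypothetical illustration, not the actual Hive patch or API:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class StructNullPadding {
    // Hypothetical sketch: when the table schema has more struct members
    // than the record read from an older file, pad the value list with
    // nulls instead of indexing past its end (which throws
    // IndexOutOfBoundsException, as in the trace above).
    public static List<Object> padToSchemaSize(List<Object> structValues, int schemaSize) {
        List<Object> padded = new ArrayList<>(structValues);
        while (padded.size() < schemaSize) {
            padded.add(null); // members added after the data was written read as null
        }
        return padded;
    }

    public static void main(String[] args) {
        // A struct written with 2 members, read against a schema that now has 3:
        List<Object> oldRecord = Arrays.asList("memberA", 42);
        System.out.println(padToSchemaSize(oldRecord, 3)); // [memberA, 42, null]
    }
}
```

This mirrors how Hive itself treats schema evolution: a {{SELECT}} on the same table returns NULL for the new member on old partitions, so the loader should do the same rather than fail.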
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)