[
https://issues.apache.org/jira/browse/HIVE-19750?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16506338#comment-16506338
]
Hive QA commented on HIVE-19750:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12926783/HIVE-19750.01-branch-3.patch
{color:red}ERROR:{color} -1 due to build exiting with an error
Test results:
https://builds.apache.org/job/PreCommit-HIVE-Build/11623/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/11623/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-11623/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2018-06-08 18:00:20.895
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-11623/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z branch-3 ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2018-06-08 18:00:20.897
+ cd apache-github-source-source
+ git fetch origin
From https://github.com/apache/hive
13960aa..a0c465d master -> origin/master
cf492a8..f6c8c12 branch-3 -> origin/branch-3
acfd209..1c6d946 branch-3.0 -> origin/branch-3.0
+ git reset --hard HEAD
HEAD is now at 13960aa HIVE-18079 : Statistics: Allow HyperLogLog to be merged to the lowest-common-denominator bit-size (Gopal V via Prasanth J)
+ git clean -f -d
+ git checkout branch-3
Switched to branch 'branch-3'
Your branch is behind 'origin/branch-3' by 8 commits, and can be fast-forwarded.
(use "git pull" to update your local branch)
+ git reset --hard origin/branch-3
HEAD is now at f6c8c12 HIVE-19817: Hive streaming API + dynamic partitioning + json/regex writer does not work (Prasanth Jayachandran reviewed by Ashutosh Chauhan)
+ git merge --ff-only origin/branch-3
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2018-06-08 18:00:23.190
+ rm -rf ../yetus_PreCommit-HIVE-Build-11623
+ mkdir ../yetus_PreCommit-HIVE-Build-11623
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-11623
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-11623/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
Going to apply patch with: git apply -p0
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc7367033663309093566.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc7367033663309093566.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process (process-resource-bundles) on project hive-shims: Execution process-resource-bundles of goal org.apache.maven.plugins:maven-remote-resources-plugin:1.5:process failed. ConcurrentModificationException -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hive-shims
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-11623
+ exit 1
'
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12926783 - PreCommit-HIVE-Build
> Initialize NEXT_WRITE_ID.NWI_NEXT on converting an existing table to full
> acid
> -------------------------------------------------------------------------------
>
> Key: HIVE-19750
> URL: https://issues.apache.org/jira/browse/HIVE-19750
> Project: Hive
> Issue Type: Bug
> Components: Transactions
> Affects Versions: 3.0.0
> Reporter: Eugene Koifman
> Assignee: Eugene Koifman
> Priority: Critical
> Fix For: 3.1.0
>
> Attachments: HIVE-19750.01-branch-3.patch, HIVE-19750.01.patch,
> HIVE-19750.02.patch, HIVE-19750.03.patch
>
>
> Need to set this to a reasonably high value for the table.
> This will reserve a range of write IDs that will be treated by the system as
> committed.
> This is needed so that we can assign unique ROW__IDs to each row in files
> that already exist in the table. For example, if the value is initialized to
> the number of files currently in the table, we can think of each file as
> written by a separate transaction, and are thus free to assign the
> bucketProperty (BucketCodec) of ROW__ID in whichever way is convenient.
> It's guaranteed that all rows get unique ROW__IDs this way.
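To make the uniqueness argument in the description concrete, here is a minimal, self-contained Java sketch. It is purely illustrative and not code from the patch: the file counts, the one-synthetic-write-ID-per-file assignment, and the RowId record are all assumptions standing in for Hive's actual ACID machinery (ROW__ID is modeled here simply as a struct of writeId, bucketProperty, and rowId).
{code:java}
// Hypothetical sketch (not Hive's actual code): shows why seeding
// NEXT_WRITE_ID.NWI_NEXT past the number of pre-existing files yields
// unique ROW__IDs for rows that predate the conversion to full ACID.
import java.util.ArrayList;
import java.util.List;

public class SyntheticRowIdDemo {
    // Simplified stand-in for ROW__ID: (writeId, bucketProperty, rowId).
    record RowId(long writeId, int bucketProperty, long rowId) {}

    public static void main(String[] args) {
        // Pretend the converted table already holds 3 files with these row counts.
        int[] rowsPerFile = {4, 2, 3};

        // Reserving write IDs 1..N (N = file count) lets us treat file i as if
        // it were written by its own, already-committed transaction.
        List<RowId> assigned = new ArrayList<>();
        for (int file = 0; file < rowsPerFile.length; file++) {
            long writeId = file + 1;            // one synthetic write ID per file
            for (long row = 0; row < rowsPerFile[file]; row++) {
                // bucketProperty can be chosen freely (0 here); uniqueness comes
                // from no two files sharing a writeId and rowIds being
                // sequential within each file.
                assigned.add(new RowId(writeId, 0, row));
            }
        }
        System.out.println("distinct=" + assigned.stream().distinct().count()
            + " total=" + assigned.size());
    }
}
{code}
Running the sketch prints distinct=9 total=9: every (writeId, rowId) pair is distinct, matching the claim that rows in pre-existing files all get unique ROW__IDs once a write-ID range is reserved for them.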
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)