[ https://issues.apache.org/jira/browse/HIVE-21292?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16779381#comment-16779381 ]
Hive QA commented on HIVE-21292:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12960353/HIVE-21292.15.patch
{color:red}ERROR:{color} -1 due to build exiting with an error
Test results:
https://builds.apache.org/job/PreCommit-HIVE-Build/16269/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/16269/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-16269/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Tests exited with: NonZeroExitCodeException
Command 'bash /data/hiveptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ date '+%Y-%m-%d %T.%3N'
2019-02-27 14:37:53.853
+ [[ -n /usr/lib/jvm/java-8-openjdk-amd64 ]]
+ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
+ export PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ PATH=/usr/lib/jvm/java-8-openjdk-amd64/bin/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m '
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m '
+ export 'MAVEN_OPTS=-Xmx1g '
+ MAVEN_OPTS='-Xmx1g '
+ cd /data/hiveptest/working/
+ tee /data/hiveptest/logs/PreCommit-HIVE-Build-16269/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ git = \s\v\n ]]
+ [[ git = \g\i\t ]]
+ [[ -z master ]]
+ [[ -d apache-github-source-source ]]
+ [[ ! -d apache-github-source-source/.git ]]
+ [[ ! -d apache-github-source-source ]]
+ date '+%Y-%m-%d %T.%3N'
2019-02-27 14:37:53.858
+ cd apache-github-source-source
+ git fetch origin
+ git reset --hard HEAD
HEAD is now at 66533e2 HIVE-21297: Replace all occurences of new Long, Boolean, Double etc with the corresponding .valueOf (Ivan Suller via )
+ git clean -f -d
Removing standalone-metastore/metastore-server/src/gen/
+ git checkout master
Already on 'master'
Your branch is up-to-date with 'origin/master'.
+ git reset --hard origin/master
HEAD is now at 66533e2 HIVE-21297: Replace all occurences of new Long, Boolean, Double etc with the corresponding .valueOf (Ivan Suller via )
+ git merge --ff-only origin/master
Already up-to-date.
+ date '+%Y-%m-%d %T.%3N'
2019-02-27 14:37:54.558
+ rm -rf ../yetus_PreCommit-HIVE-Build-16269
+ mkdir ../yetus_PreCommit-HIVE-Build-16269
+ git gc
+ cp -R . ../yetus_PreCommit-HIVE-Build-16269
+ mkdir /data/hiveptest/logs/PreCommit-HIVE-Build-16269/yetus
+ patchCommandPath=/data/hiveptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hiveptest/working/scratch/build.patch
+ [[ -f /data/hiveptest/working/scratch/build.patch ]]
+ chmod +x /data/hiveptest/working/scratch/smart-apply-patch.sh
+ /data/hiveptest/working/scratch/smart-apply-patch.sh /data/hiveptest/working/scratch/build.patch
error: a/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateDatabaseHook.java: does not exist in index
error: a/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java: does not exist in index
error: a/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzerBase.java: does not exist in index
error: a/itests/hive-unit/src/test/java/org/apache/hadoop/hive/ql/parse/TestReplicationScenariosAcrossInstances.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/TaskFactory.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/LoadDatabase.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/incremental/IncrementalLoadTasksBuilder.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/DbTxnManager.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/HiveTxnManager.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/lockmgr/HiveTxnManagerImpl.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/AlterDatabaseHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/CreateDatabaseHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/parse/repl/load/message/DropDatabaseHandler.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/AlterDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/CreateDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/DDLWork.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/DescDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/DropDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/LockDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/ShowDatabasesDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/SwitchDatabaseDesc.java: does not exist in index
error: a/ql/src/java/org/apache/hadoop/hive/ql/plan/UnlockDatabaseDesc.java: does not exist in index
error: a/ql/src/test/results/clientnegative/database_create_already_exists.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/database_create_invalid_name.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/database_drop_not_empty.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/database_drop_not_empty_restrict.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/dbtxnmgr_nodblock.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/dbtxnmgr_nodbunlock.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/lockneg_query_tbl_in_locked_db.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/lockneg_try_db_lock_conflict.q.out: does not exist in index
error: a/ql/src/test/results/clientnegative/lockneg_try_lock_db_in_use.q.out: does not exist in index
error: a/ql/src/test/results/clientpositive/encrypted/encryption_move_tbl.q.out: does not exist in index
Going to apply patch with: git apply -p1
/data/hiveptest/working/scratch/build.patch:4345: trailing whitespace.
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.ddl.DDLTask2. Database lockneg1 is not locked
warning: 1 line adds whitespace errors.
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hiveptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -T 4 -q -Dmaven.repo.local=/data/hiveptest/working/maven
protoc-jar: executing: [/tmp/protoc303940622697441816.exe, --version]
libprotoc 2.5.0
protoc-jar: executing: [/tmp/protoc303940622697441816.exe, -I/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore, --java_out=/data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/target/generated-sources, /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-common/src/main/protobuf/org/apache/hadoop/hive/metastore/metastore.proto]
ANTLR Parser Generator Version 3.5.2
protoc-jar: executing: [/tmp/protoc6475338510799965499.exe, --version]
libprotoc 2.5.0
ANTLR Parser Generator Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-server/target/generated-sources/org/apache/hadoop/hive/metastore/parser/FilterParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/standalone-metastore/metastore-server/src/main/java/org/apache/hadoop/hive/metastore/parser/Filter.g
org/apache/hadoop/hive/metastore/parser/Filter.g
log4j:WARN No appenders could be found for logger (DataNucleus.Persistence).
log4j:WARN Please initialize the log4j system properly.
DataNucleus Enhancer (version 4.1.17) for API "JDO"
DataNucleus Enhancer completed with success for 41 classes.
ANTLR Parser Generator Version 3.5.2
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveLexer.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveLexer.g
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HiveParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
warning(200): IdentifiersParser.g:424:5: Decision can match input such as "KW_UNKNOWN" using multiple alternatives: 1, 10
As a result, alternative(s) 10 were disabled for that input
Output file /data/hiveptest/working/apache-github-source-source/ql/target/generated-sources/antlr3/org/apache/hadoop/hive/ql/parse/HintParser.java does not exist: must build /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HintParser.g
org/apache/hadoop/hive/ql/parse/HintParser.g
Generating vector expression code
Generating vector expression test code
Processing annotations
Annotations processed
Processing annotations
No elements to process
[ERROR] COMPILATION ERROR :
[ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/LoadDatabase.java:[128,9] method updateDbProps in class org.apache.hadoop.hive.ql.exec.repl.bootstrap.load.LoadDatabase cannot be applied to given types;
  required: org.apache.hadoop.hive.metastore.api.Database,java.lang.String,boolean
  found: org.apache.hadoop.hive.metastore.api.Database,java.lang.String
  reason: actual and formal argument lists differ in length
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project hive-exec: Compilation failure
[ERROR] /data/hiveptest/working/apache-github-source-source/ql/src/java/org/apache/hadoop/hive/ql/exec/repl/bootstrap/load/LoadDatabase.java:[128,9] method updateDbProps in class org.apache.hadoop.hive.ql.exec.repl.bootstrap.load.LoadDatabase cannot be applied to given types;
[ERROR] required: org.apache.hadoop.hive.metastore.api.Database,java.lang.String,boolean
[ERROR] found: org.apache.hadoop.hive.metastore.api.Database,java.lang.String
[ERROR] reason: actual and formal argument lists differ in length
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hive-exec
+ result=1
+ '[' 1 -ne 0 ']'
+ rm -rf yetus_PreCommit-HIVE-Build-16269
+ exit 1
'
{noformat}
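The compile failure above is a plain argument-count mismatch: the call site at LoadDatabase.java line 128 passes (Database, String) while the only visible updateDbProps overload requires (Database, String, boolean), most likely because the patch and current master disagree on that signature. A minimal, self-contained sketch of this class of error (hypothetical code, not Hive's actual LoadDatabase):
{code:java}
// Hypothetical illustration of "actual and formal argument lists differ in length";
// class and parameter names are made up and only mirror the error message above.
public class ArgumentMismatchSketch {

    // Stand-in for org.apache.hadoop.hive.metastore.api.Database.
    static class Database { }

    // Signature matching the "required" line of the compiler output.
    private static Database updateDbProps(Database db, String dumpDirectory, boolean shouldUpdate) {
        return db;
    }

    public static void main(String[] args) {
        Database db = new Database();

        // Compiles: all three formal parameters are supplied.
        Database updated = updateDbProps(db, "/tmp/dump", true);

        // Does not compile: two actual arguments against three formal parameters,
        // which is exactly the javac error reported for LoadDatabase.java:[128,9].
        // Database broken = updateDbProps(db, "/tmp/dump");

        System.out.println(updated == db);
    }
}
{code}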
This message is automatically generated.
ATTACHMENT ID: 12960353 - PreCommit-HIVE-Build
> Break up DDLTask 1 - extract Database related operations
> --------------------------------------------------------
>
> Key: HIVE-21292
> URL: https://issues.apache.org/jira/browse/HIVE-21292
> Project: Hive
> Issue Type: Improvement
> Components: Hive
> Affects Versions: 3.1.1
> Reporter: Miklos Gergely
> Assignee: Miklos Gergely
> Priority: Major
> Labels: pull-request-available
> Fix For: 4.0.0
>
> Attachments: HIVE-21292.01.patch, HIVE-21292.02.patch,
> HIVE-21292.03.patch, HIVE-21292.04.patch, HIVE-21292.05.patch,
> HIVE-21292.06.patch, HIVE-21292.07.patch, HIVE-21292.08.patch,
> HIVE-21292.09.patch, HIVE-21292.10.patch, HIVE-21292.11.patch,
> HIVE-21292.12.patch, HIVE-21292.13.patch, HIVE-21292.14.patch,
> HIVE-21292.15.patch, HIVE-21292.15.patch
>
> Time Spent: 7h
> Remaining Estimate: 0h
>
> DDLTask is a huge class, more than 5000 lines long. The related DDLWork is
> also a huge class, which has a field for each DDL operation it supports. The
> goal is to refactor these so that everything is cut into smaller, more
> manageable classes under the package org.apache.hadoop.hive.ql.exec.ddl:
> * have a separate class for each operation
> * have a package for each operation group (database ddl, table ddl, etc), so
> that the number of classes per package stays manageable
> * make all the requests (DDLDesc subclasses) immutable
> * DDLTask should be agnostic to the actual operations
> * for now, ignore the fact that some of the operations handled by DDLTask are
> not actual DDL operations (lock, unlock, desc...)
> In the interim, while two DDLTask and DDLWork classes coexist in the code
> base, the new ones in the new package are called DDLTask2 and DDLWork2, so
> that fully qualified class names are not needed where both the old and the
> new classes are in use.
> Step #1: extract all the database-related operations from the old DDLTask and
> move them under the new package. Also create the new internal framework; a
> rough sketch of the intended layout follows below.
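> A rough, hypothetical sketch of what the per-operation layout could look like
> (only DDLTask2/DDLWork2 and the package name come from this description; the
> class names and method bodies below are illustrative, not the actual patch):
> {code:java}
> // One sub-package per operation group, e.g. the database DDL group.
> package org.apache.hadoop.hive.ql.exec.ddl.database;
>
> /** Immutable request object, playing the role of a DDLDesc subclass. */
> final class CreateDatabaseDesc {
>   private final String databaseName;
>   private final boolean ifNotExists;
>
>   CreateDatabaseDesc(String databaseName, boolean ifNotExists) {
>     this.databaseName = databaseName;
>     this.ifNotExists = ifNotExists;
>   }
>
>   String getDatabaseName() { return databaseName; }
>   boolean isIfNotExists() { return ifNotExists; }
> }
>
> /** One class per operation; DDLTask2 would only dispatch to it and stay agnostic to the details. */
> final class CreateDatabaseOperation {
>   private final CreateDatabaseDesc desc;
>
>   CreateDatabaseOperation(CreateDatabaseDesc desc) { this.desc = desc; }
>
>   /** Returns 0 on success, mirroring the task return-code convention. */
>   int execute() {
>     // The real code would call the metastore to create desc.getDatabaseName(),
>     // honoring desc.isIfNotExists().
>     return 0;
>   }
> }
> {code}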
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)