[
https://issues.apache.org/jira/browse/HIVE-21891?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16867184#comment-16867184
]
Hive QA commented on HIVE-21891:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12972122/HIVE-21891.01.patch
{color:green}SUCCESS:{color} +1 due to 9 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 51 failed/errored test(s), 16164 tests executed
*Failed tests:*
{noformat}
org.apache.hadoop.hive.cli.TestMiniSparkOnYarnCliDriver.testCliDriver[spark_explainuser_1] (batchId=193)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[ambiguous_join_col] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[duplicate_alias] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[garbage] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[insert_wrong_number_columns] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[invalid_create_table] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[invalid_dot] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[invalid_function_param2] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[invalid_index] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[invalid_select] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[macro_reserved_word] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[missing_overwrite] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[nonkey_groupby] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[quoted_string] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_column1] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_column2] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_column3] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_column4] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_column5] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_column6] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_function1] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_function2] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_function3] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_function4] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_table1] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[unknown_table2] (batchId=286)
org.apache.hadoop.hive.ql.parse.TestParseNegativeDriver.testCliDriver[wrong_distinct2] (batchId=286)
org.apache.hive.spark.client.TestSparkClient.testAddJarsAndFiles (batchId=344)
org.apache.hive.spark.client.TestSparkClient.testCounters (batchId=344)
org.apache.hive.spark.client.TestSparkClient.testErrorJob (batchId=344)
org.apache.hive.spark.client.TestSparkClient.testErrorJobNotSerializable (batchId=344)
org.apache.hive.spark.client.TestSparkClient.testJobSubmission (batchId=344)
org.apache.hive.spark.client.TestSparkClient.testMetricsCollection (batchId=344)
org.apache.hive.spark.client.TestSparkClient.testSimpleSparkJob (batchId=344)
org.apache.hive.spark.client.TestSparkClient.testSyncRpc (batchId=344)
org.apache.hive.spark.client.rpc.TestKryoMessageCodec.testAutoRegistration (batchId=344)
org.apache.hive.spark.client.rpc.TestKryoMessageCodec.testDecryptionOnly (batchId=344)
org.apache.hive.spark.client.rpc.TestKryoMessageCodec.testEmbeddedChannel (batchId=344)
org.apache.hive.spark.client.rpc.TestKryoMessageCodec.testEncryptDecrypt (batchId=344)
org.apache.hive.spark.client.rpc.TestKryoMessageCodec.testEncryptionOnly (batchId=344)
org.apache.hive.spark.client.rpc.TestKryoMessageCodec.testFragmentation (batchId=344)
org.apache.hive.spark.client.rpc.TestKryoMessageCodec.testKryoCodec (batchId=344)
org.apache.hive.spark.client.rpc.TestKryoMessageCodec.testMaxMessageSize (batchId=344)
org.apache.hive.spark.client.rpc.TestKryoMessageCodec.testNegativeMessageSize (batchId=344)
org.apache.hive.spark.client.rpc.TestRpc.testBadHello (batchId=344)
org.apache.hive.spark.client.rpc.TestRpc.testClientServer (batchId=344)
org.apache.hive.spark.client.rpc.TestRpc.testCloseListener (batchId=344)
org.apache.hive.spark.client.rpc.TestRpc.testEncryption (batchId=344)
org.apache.hive.spark.client.rpc.TestRpc.testNotDeserializableRpc (batchId=344)
org.apache.hive.spark.client.rpc.TestRpc.testRpcDispatcher (batchId=344)
org.apache.hive.spark.client.rpc.TestRpc.testRpcServerMultiThread (batchId=344)
{noformat}
Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/17639/testReport
Console output: https://builds.apache.org/job/PreCommit-HIVE-Build/17639/console
Test logs: http://104.198.109.242/logs/PreCommit-HIVE-Build-17639/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 51 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12972122 - PreCommit-HIVE-Build
> Break up DDLTask - cleanup
> --------------------------
>
> Key: HIVE-21891
> URL: https://issues.apache.org/jira/browse/HIVE-21891
> Project: Hive
> Issue Type: Sub-task
> Components: Hive
> Affects Versions: 3.1.1
> Reporter: Miklos Gergely
> Assignee: Miklos Gergely
> Priority: Major
> Labels: refactor-ddl
> Fix For: 4.0.0
>
> Attachments: HIVE-21891.01.patch
>
>
> DDLTask was a huge class, more than 5000 lines long. The related DDLWork was
> also huge, with a field for each DDL operation it supported. The goal was to
> refactor these so that everything is cut into smaller, more manageable classes
> under the package org.apache.hadoop.hive.ql.exec.ddl:
> * have a separate class for each operation
> * have a package for each operation group (database DDL, table DDL, etc.), so
> that the number of classes under each package stays manageable
> * make all the requests (DDLDesc subclasses) immutable - most of them already are
> * DDLTask should be agnostic to the actual operations
> * for now, ignore the fact that some of the operations handled by DDLTask are
> not actual DDL operations (lock, unlock, desc...)
> In the interim, while two DDLTask and DDLWork classes coexisted in the code
> base, the new ones in the new package were named DDLTask2 and DDLWork2, which
> avoided fully qualified class names wherever both the old and the new classes
> were in use.
> Step #12: rename DDLTask2 and DDLWork2, now that they stand alone. Remove the
> old DDLDesc. Instead of each operation registering itself, DDLTask now finds
> the DDLOperations and registers them itself.
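The description above outlines a dispatch pattern: one immutable DDLDesc "request" class per operation, one DDLOperation class that executes it, and a DDLTask that owns the desc-to-operation mapping so it stays agnostic to the individual operations. A minimal, simplified sketch of that pattern is below; the class and method names are illustrative stand-ins, not Hive's actual API.

```java
import java.util.HashMap;
import java.util.Map;

public class DdlRegistrySketch {
    /** Immutable request object, analogous to a DDLDesc subclass. */
    interface DdlDesc { }

    /** One class per DDL operation, analogous to a DDLOperation subclass. */
    interface DdlOperation<T extends DdlDesc> {
        String execute(T desc);
    }

    /** Example request: all state is set in the constructor, no setters. */
    static final class CreateTableDesc implements DdlDesc {
        private final String tableName;
        CreateTableDesc(String tableName) { this.tableName = tableName; }
        String getTableName() { return tableName; }
    }

    /** Example operation paired with the request class above. */
    static final class CreateTableOperation implements DdlOperation<CreateTableDesc> {
        @Override public String execute(CreateTableDesc desc) {
            return "CREATE TABLE " + desc.getTableName();
        }
    }

    /**
     * Task stand-in: it owns the desc-class -> operation mapping itself,
     * so it needs no knowledge of any concrete operation.
     */
    static final class DdlTask {
        private final Map<Class<? extends DdlDesc>, DdlOperation<? extends DdlDesc>> ops =
                new HashMap<>();

        <T extends DdlDesc> void register(Class<T> descClass, DdlOperation<T> op) {
            ops.put(descClass, op);
        }

        @SuppressWarnings("unchecked")
        <T extends DdlDesc> String run(T desc) {
            DdlOperation<T> op = (DdlOperation<T>) ops.get(desc.getClass());
            if (op == null) {
                throw new IllegalStateException("No operation registered for " + desc.getClass());
            }
            return op.execute(desc);
        }
    }

    public static void main(String[] args) {
        DdlTask task = new DdlTask();
        // In the real refactor the task discovers the operations itself;
        // here the registration is done by hand to keep the sketch short.
        task.register(CreateTableDesc.class, new CreateTableOperation());
        System.out.println(task.run(new CreateTableDesc("t1")));
    }
}
```

Keeping the mapping inside the task (rather than having each operation register itself) matches the stated goal of an operation-agnostic DDLTask: adding an operation means adding a desc/operation pair, without touching the dispatch code.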
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)