[
https://issues.apache.org/jira/browse/SPARK-32305?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17157629#comment-17157629
]
Dongjoon Hyun commented on SPARK-32305:
---------------------------------------
For the record, the `hive-1.2` profile has been officially and strongly
discouraged by the Apache Spark community since Apache Spark 3.0.0, because it
violates global Apache project policy.
> Make `mvn clean` remove `metastore_db` and `spark-warehouse`
> ------------------------------------------------------------
>
> Key: SPARK-32305
> URL: https://issues.apache.org/jira/browse/SPARK-32305
> Project: Spark
> Issue Type: Improvement
> Components: Build, Tests
> Affects Versions: 3.1.0
> Reporter: Yang Jie
> Assignee: Yang Jie
> Priority: Minor
> Fix For: 3.1.0
>
>
> Now we support two versions of built-in Hive, and some test-generated
> metadata, such as `spark-warehouse` and `metastore_db`, is written outside
> the target dir and is not cleaned up automatically when we run the `mvn
> clean` command.
> So if we run `mvn clean test -pl sql/hive -am -Phadoop-2.7 -Phive -Phive-1.2
> `, the `metastore_db` dir is created and the metadata remains after the tests
> complete.
> We then need to clean up the `metastore_db` directory manually so that the
> `mvn clean test -pl sql/hive -am -Phadoop-2.7 -Phive` command can succeed,
> because the residual metastore data is not compatible.
> Residual `spark-warehouse` data can also cause test failures in some
> scenarios.
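
One way to make `mvn clean` remove these directories is to declare them as
extra filesets for `maven-clean-plugin` in the module's pom — a minimal
sketch under that assumption, not necessarily the patch actually applied for
this issue:

```xml
<!-- Hypothetical sketch: extra filesets so `mvn clean` also deletes the
     test-generated Hive metastore and warehouse directories. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-clean-plugin</artifactId>
  <configuration>
    <filesets>
      <fileset>
        <directory>${basedir}/metastore_db</directory>
      </fileset>
      <fileset>
        <directory>${basedir}/spark-warehouse</directory>
      </fileset>
    </filesets>
  </configuration>
</plugin>
```

With such a configuration, rerunning the build against a different Hive
profile would no longer require deleting the stale `metastore_db` by hand.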
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]