Github user shivaram commented on the issue:
https://github.com/apache/spark/pull/16290
@gatorsmile I think I figured out the problem with `HiveSparkSubmitSuite`,
but I'm not sure how to solve it. The problem is that one of the test cases
checks that the DB location is the same as the warehouse path [1].
The warehouse path correctly points to `spark-warehouse`, but the catalog
seems to be configured against an existing `metastore_db`. On my machine
(and I'd guess on Jenkins), if a `metastore_db` is left over from running the
SparkR tests, then the default location points to the R `tempdir`, which is no
longer valid.
It seems to me that the right solution here is to also clear the
`metastore_db` once a set of tests finishes. Is there a way to do this from
within the tests, or should we be running `rm` from one of the test-runner
scripts?
[1]
https://github.com/apache/spark/blob/1e5c51f336b90cd1eed43e9c6cf00faee696174c/sql/hive/src/test/scala/org/apache/spark/sql/hive/HiveSparkSubmitSuite.scala#L824
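For the test-runner-script option, a minimal sketch of the cleanup step might look like this (the file names are assumptions based on Derby's defaults, not an existing Spark script):

```shell
#!/bin/sh
# Hypothetical cleanup for a test-runner script: remove the embedded-Derby
# metastore directory and related state that a previous suite may have left
# behind, so the next suite starts from a fresh default DB location.
# Run from the directory the tests were launched in.
rm -rf metastore_db spark-warehouse derby.log
```

Running this between suites would prevent a stale `metastore_db` from one test run (e.g. the SparkR tests) from leaking its DB location into the next.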