Please take a look at how TestHive sets up a throwaway metastore and
warehouse in
sql/hive/src/main/scala/org/apache/spark/sql/hive/test/TestHive.scala:

  protected def configure(): Unit = {
    // Remove any leftovers so Derby can create a fresh metastore
    // database at this path and Hive a fresh warehouse.
    warehousePath.delete()
    metastorePath.delete()
    // Point the embedded Derby metastore and the warehouse at
    // temporary locations instead of the defaults.
    setConf("javax.jdo.option.ConnectionURL",
      s"jdbc:derby:;databaseName=$metastorePath;create=true")
    setConf("hive.metastore.warehouse.dir", warehousePath.toString)
  }

Cheers

On Wed, Apr 8, 2015 at 1:07 PM, Daniel Siegmann <daniel.siegm...@teamaol.com> wrote:

> I am trying to unit test some code which takes an existing HiveContext and
> uses it to execute a CREATE TABLE query (among other things). Unfortunately
> I've run into some hurdles trying to unit test this, and I'm wondering if
> anyone has a good approach.
>
> The metastore DB is automatically created in the local directory, but it
> doesn't seem to be cleaned up afterward. Is there any way to get Spark to
> clean this up when the context is stopped? Or can I point this to some
> other location, such as a temp directory?
>
> Trying to create a table fails because it is using the default warehouse
> directory (/user/hive/warehouse). Is there some way to change this without
> hard-coding a directory in a hive-site.xml? Again, I'd prefer to point it
> to a temp directory so it will be automatically removed. I tried a couple
> of things that didn't work:
>
>    - hiveContext.sql("SET hive.metastore.warehouse.dir=/tmp/dir/xyz")
>    - hiveContext.setConf("hive.metastore.warehouse.dir", "/tmp/dir/xyz")
>
> Any advice from those who have been here before would be appreciated.
>
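
As for cleanup: as far as I know nothing removes the metastore
directory when the context is stopped, so delete the temp paths
yourself in your test teardown (e.g. ScalaTest's afterAll). A sketch,
reusing the paths from above (deleteRecursively is just a hypothetical
helper here, not a Spark API):

  import java.io.File

  // Recursively delete a directory tree.
  def deleteRecursively(f: File): Unit = {
    if (f.isDirectory) {
      Option(f.listFiles()).toSeq.flatten.foreach(deleteRecursively)
    }
    f.delete()
  }

  deleteRecursively(metastorePath)
  deleteRecursively(warehousePath)

Embedded Derby also writes a derby.log into the current working
directory; the derby.stream.error.file system property lets you point
that somewhere disposable as well.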
