Re: Spark 2.0.0 and Hive metastore

2017-09-05 Thread Dylan Wan
You can put a hive-site.xml in the $SPARK_HOME/conf directory. This property controls where the data are located:

  <property>
    <name>spark.sql.warehouse.dir</name>
    <value>/home/myuser/spark-2.2.0/spark-warehouse</value>
    <description>location of the warehouse directory</description>
  </property>

~Dylan

On Tue, Aug 29, 2017 at 1:53 PM, Andrés Ivaldi
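If you prefer not to edit hive-site.xml, the same property can also be set on the SparkSession builder before enabling Hive support. A minimal Scala sketch; the app name and master are placeholders, and the path is just the example value from above:

  import org.apache.spark.sql.SparkSession

  // Sketch: set the warehouse directory programmatically instead of via hive-site.xml.
  val spark = SparkSession.builder()
    .appName("percentile-aggregations")   // hypothetical app name
    .master("local[*]")                   // assumption: local mode
    .config("spark.sql.warehouse.dir", "/home/myuser/spark-2.2.0/spark-warehouse")
    .enableHiveSupport()
    .getOrCreate()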

Re: Spark 2.0.0 and Hive metastore

2017-08-29 Thread Andrés Ivaldi
Every comment is welcome. If I'm not wrong, it's because we are using the percentile aggregation, which comes with Hive support; apart from that, nothing else.

On Tue, Aug 29, 2017 at 11:23 AM, Jean Georges Perrin wrote:
> Sorry if my comment is not helping, but... why do you need
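As the thread notes, the percentile aggregate comes through Hive support in Spark 2.0.x, so a Hive-enabled session is needed to call it. A rough sketch, assuming a DataFrame df with a column named value; both names are made up for illustration:

  // Sketch only: df and its "value" column are hypothetical.
  df.createOrReplaceTempView("measurements")
  val medians = spark.sql(
    "SELECT percentile(value, 0.5) AS median FROM measurements")
  // Note: Hive's percentile expects an integral column; percentile_approx handles doubles.
  medians.show()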

Re: Spark 2.0.0 and Hive metastore

2017-08-29 Thread Jean Georges Perrin
Sorry if my comment is not helping, but... why do you need Hive? Can't you save your aggregation using Parquet, for example?

jg

> On Aug 29, 2017, at 08:34, Andrés Ivaldi wrote:
>
> Hello, I'm using the Spark API with Hive support. I don't have a Hive
> instance, just
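For reference, a bare sketch of the Parquet route suggested here, with made-up column names and output path; it needs neither Hive support nor a metastore:

  import org.apache.spark.sql.functions.avg

  // Hypothetical aggregation written straight to Parquet, no Hive involved.
  df.groupBy("category")
    .agg(avg("value").alias("avg_value"))
    .write
    .mode("overwrite")
    .parquet("/tmp/aggregations.parquet")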

Spark 2.0.0 and Hive metastore

2017-08-29 Thread Andrés Ivaldi
Hello, I'm using the Spark API with Hive support. I don't have a Hive instance; I'm just using Hive for some aggregation functions. The problem is that Hive creates the hive and metastore_db folders in the temp folder, and I want to change that location.

Regards.

--
Ing. Ivaldi Andres
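A follow-up sketch, not from the thread: the metastore_db folder is created by the embedded Derby metastore in the current working directory, and one way to move it is to point Derby's connection URL elsewhere. Whether this setting propagates exactly like this on 2.0.0 is an assumption on my part; the documented route is a hive-site.xml under $SPARK_HOME/conf, as Dylan describes above. All paths below are placeholders:

  import org.apache.spark.sql.SparkSession

  // Assumption: javax.jdo.option.ConnectionURL passed through the Hadoop config
  // reaches the embedded Hive metastore; the paths are placeholders.
  val spark = SparkSession.builder()
    .config("spark.hadoop.javax.jdo.option.ConnectionURL",
      "jdbc:derby:;databaseName=/home/myuser/metastore/metastore_db;create=true")
    .config("spark.sql.warehouse.dir", "/home/myuser/spark-warehouse")
    .enableHiveSupport()
    .getOrCreate()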