Sharing spark context across multiple spark sql cli initializations

2014-10-22 Thread Sadhan Sood
We want to run multiple instances of the Spark SQL CLI on our YARN cluster, with each instance used by a different user. This looks non-optimal if each user brings up a separate CLI, given how Spark works on YARN by running executor processes (and hence consuming resources) on worker nodes.

Re: Sharing spark context across multiple spark sql cli initializations

2014-10-22 Thread Michael Armbrust
The JDBC server is what you are looking for: http://spark.apache.org/docs/latest/sql-programming-guide.html#running-the-thrift-jdbc-server

On Wed, Oct 22, 2014 at 11:10 AM, Sadhan Sood sadhan.s...@gmail.com wrote:
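As a rough sketch of that setup (the host, port, and master flag below are illustrative defaults, not anything stated in this thread), one long-running Thrift JDBC/ODBC server can be started against the YARN cluster, and each user then connects to it with beeline, so all queries share a single SparkContext instead of each CLI launching its own executors:

    # Start one shared Thrift JDBC/ODBC server; it holds the single SparkContext
    ./sbin/start-thriftserver.sh --master yarn

    # Each user connects to that shared server with the beeline client
    # (localhost:10000 is the default host/port; adjust for your deployment)
    ./bin/beeline -u jdbc:hive2://localhost:10000

With this arrangement the per-user CLIs are replaced by lightweight JDBC clients, and resource consumption on the worker nodes is governed by the single server's executors rather than one set of executors per user.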