Is the CLI in fact a full replacement for Shark?

The description says "The Spark SQL CLI is a convenient tool to run the Hive metastore service in local mode ...". The way I've used Shark in the past, however, is to run the Shark shell on a client machine and connect it to a Hive metastore on a different machine (the Spark master node). So in this use case, running the Hive metastore in local mode on the client machine would not meet this goal.

Is it possible for the Spark SQL CLI to connect to an external Hive metastore rather than running one internally?
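(For what it's worth, a sketch of what I'd expect to work, assuming Spark picks up a hive-site.xml placed in its conf/ directory the way Hive itself does; the host name and port below are placeholders for a real remote metastore:)

```xml
<!-- $SPARK_HOME/conf/hive-site.xml -->
<configuration>
  <!-- Point the CLI at an external metastore instead of a local one.
       "metastore-host:9083" is a placeholder for the actual Thrift
       endpoint of the remote Hive metastore service. -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host:9083</value>
  </property>
</configuration>
```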

Thanks,

DR

On 09/18/2014 02:18 AM, Michael Armbrust wrote:
Check out the Spark SQL CLI
<https://spark.apache.org/docs/latest/sql-programming-guide.html#running-the-spark-sql-cli>.

On Wed, Sep 17, 2014 at 10:50 PM, David Rosenstrauch <dar...@darose.net>
wrote:

Is there a shell available for Spark SQL, similar to the way the Shark or
Hive shells work?

From my reading up on Spark SQL, it seems like one can execute SQL queries
in the Spark shell, but only from within code in a programming language
such as Scala.  There does not seem to be any way to directly issue SQL (or
HQL) queries from the shell, send files of queries to the shell (i.e.,
using the "-f" flag), etc.

Does this functionality exist somewhere and I'm just missing it?
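(A sketch of the sort of usage being asked about, assuming the Spark SQL CLI accepts the same query flags as the Hive CLI; the table name and file name below are hypothetical:)

```shell
# Run a single query from the command line (Hive-style -e flag).
./bin/spark-sql -e "SELECT count(*) FROM my_table"

# Run a file of queries (Hive-style -f flag); queries.sql is a placeholder.
./bin/spark-sql -f queries.sql
```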

Thanks,

DR

