Is there a shell available for Spark SQL, similar to the way the Shark
or Hive shells work?
From my reading on Spark SQL, it seems that one can execute SQL
queries in the Spark shell, but only from within code written in a
programming language such as Scala. There does not seem to be any way
to issue SQL (or HQL) queries directly at a shell prompt, send files of
queries to the shell (e.g., with a "-f" flag), etc.
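For reference, the workflow I mean looks roughly like the following, run
inside spark-shell. This is a minimal sketch following the Spark 1.0 SQL
programming guide; the Person case class and the sample rows are invented
for illustration:

```scala
// Inside spark-shell, where the SparkContext `sc` is provided.
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.createSchemaRDD  // implicit RDD -> SchemaRDD conversion

// Hypothetical example data, just to have a table to query.
case class Person(name: String, age: Int)
val people = sc.parallelize(Seq(Person("Alice", 30), Person("Bob", 17)))
people.registerAsTable("people")

// The SQL has to be embedded as a string in Scala code -- there is no
// standalone SQL prompt or "-f" style batch mode in this workflow.
val adults = sqlContext.sql("SELECT name FROM people WHERE age >= 21")
adults.collect().foreach(println)
```
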
Does this functionality exist somewhere and I'm just missing it?
Thanks,
DR