Sometimes the underlying Hive code will print exceptions even during
successful execution (for example, CREATE TABLE IF NOT EXISTS).  If there is
actually a problem, Spark SQL should throw an exception.

What is the command you are running and what is the error you are seeing?


On Sat, Sep 6, 2014 at 2:11 PM, Davies Liu <dav...@databricks.com> wrote:

> SQLContext.sql() will return a SchemaRDD; you need to call collect()
> to pull the data in.
>
> On Sat, Sep 6, 2014 at 6:02 AM, jamborta <jambo...@gmail.com> wrote:
> > Hi,
> >
> > I am using Spark SQL to run some administrative queries and joins (e.g.
> > create table, insert overwrite, etc.), where the query does not return any
> > data. I noticed that if the query fails it prints an error message on the
> > console, but does not actually throw an exception (this is Spark 1.0.2).
> >
> > Is there any way to get these errors from the returned object?
> >
> > thanks,
> >
> >
> >
> > --
> > View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-check-if-query-is-completed-pyspark-tp13630.html
> > Sent from the Apache Spark User List mailing list archive at Nabble.com.
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> > For additional commands, e-mail: user-h...@spark.apache.org
> >
>