Hi,

I am trying to use Spark Jobserver
(https://github.com/spark-jobserver/spark-jobserver) for running Spark
SQL jobs.

I was able to start the server, but when I run my application (a Scala class
that extends SparkSqlJob), I get the following response:

{
  "status": "ERROR",
  "result": "Invalid job type for this context"
}
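
For reference, the job I am running is essentially a minimal SparkSqlJob along
the lines of the sketch below (the object name and the query are just
placeholders, not my actual code):

import com.typesafe.config.Config
import org.apache.spark.sql.SQLContext
import spark.jobserver.{SparkJobValid, SparkJobValidation, SparkSqlJob}

// Placeholder object name; my real job runs a different query.
object MySqlJob extends SparkSqlJob {
  // Accept every request; a real job would inspect the config here.
  def validate(sql: SQLContext, config: Config): SparkJobValidation = SparkJobValid

  // Run a trivial SQL query and return the collected rows.
  def runJob(sql: SQLContext, config: Config): Any =
    sql.sql("SELECT 1").collect()
}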

Can anyone suggest what is going wrong, or provide a detailed procedure
for setting up the jobserver for Spark SQL?




