Hi,
I think I have found what was causing the exception.
"spark.app.name" seems to be required in sparkProperties to
successfully submit the job. At least once I include the app name, my
job is submitted to the Spark cluster successfully.
Silly mistake, but the error message is not very helpful :)
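For anyone hitting the same error, here is a minimal sketch of the payload that works for me, built in Python so it is easy to dump and pipe to curl. The app name "SparkPi" is my own choice; everything else mirrors my original request (note the argument is a string, since appArgs is a list of strings in the protocol):

```python
import json

# CreateSubmissionRequest payload for the hidden REST API.
# "spark.app.name" is the property that was missing from my original request.
payload = {
    "action": "CreateSubmissionRequest",
    "appArgs": ["100"],  # appArgs entries are strings
    "appResource": "/opt/spark-2.1.0/examples/jars/spark-examples_2.11-2.1.0.jar",
    "clientSparkVersion": "2.1.0",
    "environmentVariables": {"SPARK_ENV_LOADED": "1"},
    "mainClass": "org.apache.spark.examples.SparkPi",
    "sparkProperties": {
        "spark.app.name": "SparkPi",  # required, or validation fails
        "spark.master": "spark://kristinn:6066",
        "spark.submit.deployMode": "cluster",
        "spark.jars": "/opt/spark-2.1.0/examples/jars/spark-examples_2.11-2.1.0.jar",
    },
}

print(json.dumps(payload, indent=2))
```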
Best,
Kristinn R.
On Fri, Mar 3, 2017 at 1:22 PM, Kristinn Rúnarsson
<krist...@activitystream.com> wrote:
Hi,
I am trying to submit Spark jobs via the "hidden" REST API
(http://spark-cluster-ip:6066/v1/submissions/...), but I am getting an
ErrorResponse and I can't find what I am doing wrong.
I have been following the instructions from this blog post:
http://arturmkrtchyan.com/apache-spark-hidden-rest-api
When I try to send a CreateSubmissionRequest action I get the
following ErrorResponse:
{
  "action" : "ErrorResponse",
  "message" : "Malformed request: org.apache.spark.deploy.rest.SubmitRestProtocolException: Validation of message CreateSubmissionRequest failed!\n\torg.apache.spark.deploy.rest.SubmitRestProtocolMessage.validate(SubmitRestProtocolMessage.scala:70)\n\torg.apache.spark.deploy.rest.SubmitRequestServlet.doPost(RestSubmissionServer.scala:272)\n\tjavax.servlet.http.HttpServlet.service(HttpServlet.java:707)\n\tjavax.servlet.http.HttpServlet.service(HttpServlet.java:790)\n\torg.spark_project.jetty.servlet.ServletHolder.handle(ServletHolder.java:812)\n\torg.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:587)\n\torg.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)\n\torg.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)\n\torg.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)\n\torg.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\torg.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)\n\torg.spark_project.jetty.server.Server.handle(Server.java:499)\n\torg.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:311)\n\torg.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)\n\torg.spark_project.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)\n\torg.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)\n\torg.spark_project.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)\n\tjava.lang.Thread.run(Thread.java:745)",
  "serverSparkVersion" : "2.1.0"
}
This is what my request looks like:

curl -X POST http://kristinn:6066/v1/submissions/create \
  --header "Content-Type:application/json;charset=UTF-8" \
  --data '{
    "action": "CreateSubmissionRequest",
    "appArgs": [ 100 ],
    "appResource": "/opt/spark-2.1.0/examples/jars/spark-examples_2.11-2.1.0.jar",
    "clientSparkVersion": "2.1.0",
    "environmentVariables": {
      "SPARK_ENV_LOADED": "1"
    },
    "mainClass": "org.apache.spark.examples.SparkPi",
    "sparkProperties": {
      "spark.master": "spark://kristinn:6066",
      "spark.submit.deployMode": "cluster",
      "spark.jars": "/opt/spark-2.1.0/examples/jars/spark-examples_2.11-2.1.0.jar"
    }
  }'
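In case it helps with debugging, the same request can also be sent without curl. A minimal sketch using only the Python standard library; the URL and headers mirror the curl command above, and build_submission/submit are just hypothetical helper names for this post:

```python
import json
import urllib.request

def build_submission(url, payload):
    """Build the POST request for the hidden REST API (does not send it)."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json;charset=UTF-8"},
        method="POST",
    )

def submit(url, payload):
    """Send a CreateSubmissionRequest and return the parsed JSON response."""
    with urllib.request.urlopen(build_submission(url, payload)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Usage (needs a running master with the REST gateway enabled):
# submit("http://kristinn:6066/v1/submissions/create", payload)
```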
I cannot see what is causing this by looking at the source code.
Is something wrong with my request, or does anyone have a solution for
this issue?
Best,
Kristinn Rúnarsson.