From: Daniel Siegmann [mailto:daniel.siegm...@velos.io]
Sent: Thursday, October 16, 2014 7:15 AM
To: Mohammed Guller
Cc: user@spark.apache.org
Subject: Re: Play framework

We execute Spark jobs from a Play application, but we don't use spark-submit. I don't know if you really want to use spark-submit, but if not you can just create a SparkContext programmatically in your app. In development I typically run Spark locally. Creating the Spark context is pretty trivial:
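[The code was cut off here. A minimal sketch of a programmatically created, locally-run context under Spark 1.x; the app name is a placeholder and "local[*]" is the usual development master:]

    import org.apache.spark.{SparkConf, SparkContext}

    // Run Spark in-process, using all available local cores.
    val conf = new SparkConf()
      .setAppName("my-play-app")   // placeholder name
      .setMaster("local[*]")

    val sc = new SparkContext(conf)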
From: Surendranauth Hiraman [mailto:suren.hira...@velos.io]
Sent: Thursday, October 16, 2014 12:42 PM
To: Mohammed Guller
Cc: Daniel Siegmann; user@spark.apache.org
Subject: Re: Play framework
Mohammed,

Jumping in for Daniel: we actually address the configuration issue by pulling values from [...]
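[The message is truncated here, but the approach it describes, reading Spark settings from configuration rather than hard-coding them, might look like the sketch below. Play apps use Typesafe Config, so the values can live in conf/application.conf; the key names here are made up for illustration:]

    import com.typesafe.config.ConfigFactory
    import org.apache.spark.{SparkConf, SparkContext}

    // Load conf/application.conf (plus system properties), as Play does.
    val config = ConfigFactory.load()

    // Hypothetical keys; use whatever names your application.conf defines.
    val conf = new SparkConf()
      .setAppName(config.getString("spark.app-name"))
      .setMaster(config.getString("spark.master"))

    val sc = new SparkContext(conf)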
We integrated Spark into Play and use Spark SQL extensively, on an EC2 Spark cluster on Hadoop HDFS 1.2.1 and Tachyon 0.4.

Step 1: Create a Play Scala application as usual.
Step 2: In Build.sbt, put all your Spark dependencies. What works for us is Play 2.2.3, Scala 2.10.4, and Spark 1.1; see the sketch below.
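[A sketch of what that dependency section could look like for those versions; the exact module list is an assumption, so add or drop Spark modules to match what the app actually uses:]

    // Build.sbt (Play 2.2.x / sbt 0.13)
    scalaVersion := "2.10.4"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.1.0",
      "org.apache.spark" %% "spark-sql"  % "1.1.0"
    )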
Hi,

Below are the links for a simple Play + Spark SQL example:
http://blog.knoldus.com/2014/07/14/play-with-spark-building-apache-spark-with-play-framework-part-3/
https://github.com/knoldus/Play-Spark-Scala

Manu

On Thu, Oct 16, 2014 at 1:00 PM, Mohammed Guller moham...@glassbeam.com wrote:
From: Mohammed Guller
Sent: Thursday, October 16, 2014 4:00 PM
To: US Office Admin; Surendranauth Hiraman
Cc: Daniel Siegmann; user@spark.apache.org
Subject: RE: Play framework

Thanks, Suren and Raju.

Raju, if I remember correctly, the Play package command just creates a jar for your app. That jar file will not include other dependencies [...]
To: Mohammed Guller; Surendranauth Hiraman
Cc: Daniel Siegmann; user@spark.apache.org
Subject: Re: Play framework

The remaining dependencies (the Spark libraries) are available to the context from the sparkhome. I have installed Spark so that all the slaves have the same sparkhome. The code looks like this:
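[The snippet is cut off after "val conf = new". A sketch of the likely shape, with the cluster URL, sparkhome path, and jar path as placeholders; setJars ships the jar produced by play package to the workers, while the Spark libraries themselves come from the sparkhome on each slave:]

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("play-spark-app")                        // placeholder
      .setMaster("spark://master-host:7077")               // placeholder cluster URL
      .setSparkHome("/path/to/spark")                      // same sparkhome on every slave
      .setJars(Seq("target/scala-2.10/app_2.10-1.0.jar"))  // jar from `play package`

    val sc = new SparkContext(conf)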
To: Mohammed Guller
Cc: US Office Admin; Surendranauth Hiraman; Daniel Siegmann;
user@spark.apache.org
Subject: Re: Play framework
In our case, the Play libraries are not required to run the Spark jobs, so they are available only on the master, and Play runs as a regular Scala application. I can't think [...]
Hi -

Has anybody figured out how to integrate a Play application with Spark and run it on a Spark cluster using the spark-submit script? I have seen some blogs about creating a simple Play app and running it locally on a dev machine with the sbt run command. However, those steps don't work for [...]
Hi,

I am trying to connect to Spark from the Play framework and am getting the following Akka error...

[ERROR] [08/16/2014 17:12:05.249] [spark-akka.actor.default-dispatcher-3] [ActorSystem(spark)] Uncaught fatal error from thread [spark-akka.actor.default-dispatcher-3] shutting down ActorSystem [spark]