We execute Spark jobs from a Play application but we don't use
spark-submit. I don't know if you really want to use spark-submit, but if
not you can just create a SparkContext programmatically in your app.
In development I typically run Spark locally. Creating the Spark context is
pretty trivial:
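The snippet itself is cut off in the archive. A minimal sketch of what creating a local SparkContext programmatically might look like — the app name is a placeholder, not the poster's actual code:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object DevSpark {
  def main(args: Array[String]): Unit = {
    // Run Spark in-process for development; "local[*]" uses all cores.
    val conf = new SparkConf()
      .setAppName("play-spark-dev") // placeholder app name
      .setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Quick smoke test: sum 1..10 on the local context.
    val total = sc.parallelize(1 to 10).reduce(_ + _)
    println(total) // 55
    sc.stop()
  }
}
```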
*From:* Daniel Siegmann [mailto:daniel.siegm...@velos.io]
*Sent:* Thursday, October 16, 2014 7:15 AM
*To:* Mohammed Guller
*Cc:* user@spark.apache.org
*Subject:* Re: Play framework
From: Surendranauth Hiraman suren.hira...@velos.io
Sent: Thursday, October 16, 2014 12:42 PM
To: Mohammed Guller
Cc: Daniel Siegmann; user@spark.apache.org
Subject: Re: Play framework
Mohammed,
Jumping in for Daniel, we actually address the configuration issue by pulling
values from
*Cc:* user@spark.apache.org
*Subject:* Re: Play framework
We integrated Spark into Play and use SparkSQL extensively on an EC2
Spark cluster, on Hadoop HDFS 1.2.1 and Tachyon 0.4.
Step 1: Create a Play Scala application as usual.
Step 2: In Build.sbt, put all your Spark dependencies. What works for us is
Play
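The dependency list is truncated in the archive. As a rough sketch of what such a Build.sbt entry might look like — the version numbers are illustrative assumptions for the Spark 1.x era, not the poster's actual list:

```scala
// Build.sbt -- illustrative Spark dependencies for a Play app.
// Version numbers are assumptions, not the poster's actual list.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.1.0",
  "org.apache.spark" %% "spark-sql"  % "1.1.0"
)
```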
To: Mohammed Guller
Cc: US Office Admin; Surendranauth Hiraman; Daniel Siegmann;
user@spark.apache.org
Subject: Re: Play framework
Hi,
Below is the link for a simple Play + SparkSQL example -
http://blog.knoldus.com/2014/07/14/play-with-spark-building-apache-spark-with-play-framework-part-3
Sent: Thursday, October 16, 2014 4:00 PM
To: US Office Admin; Surendranauth Hiraman
Cc: Daniel Siegmann; user@spark.apache.org
Subject: RE: Play framework
Thanks, Suren and Raju.
Raju – if I remember correctly, the Play package command just creates a jar for
your app. That jar file will not include other
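For context on this point: `play package` (like plain `sbt package`) jars only your own classes, not your dependencies. One common way to bundle dependencies into a single jar is the sbt-assembly plugin — a sketch, where the plugin version is an assumption:

```scala
// project/assembly.sbt -- add sbt-assembly (version is illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
```

Spark itself is typically marked `"provided"` in Build.sbt so the assembled jar relies on the cluster's Spark installation rather than shipping its own copy.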
To: …; Surendranauth Hiraman
Cc: Daniel Siegmann; user@spark.apache.org
Subject: Re: Play framework
The remaining dependencies (the Spark libraries) are available to the context from
the sparkhome. I have installed Spark so that all the slaves have the same
sparkhome. The code looks like this:
val conf = new
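The snippet is truncated in the archive. A hedged reconstruction of what such a configuration might look like — the master URL, sparkhome path, and jar name are placeholders, not the poster's actual values:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Placeholders throughout: the master URL, sparkhome path, and jar
// name are illustrative, not the original poster's values.
val conf = new SparkConf()
  .setAppName("playSparkApp")
  .setMaster("spark://master-host:7077")         // cluster master (placeholder)
  .setSparkHome("/opt/spark")                    // same path on every slave
  .setJars(Seq("target/scala-2.10/playapp.jar")) // only the app jar; Spark comes from sparkhome
val sc = new SparkContext(conf)
```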
To: Mohammed Guller
Cc: US Office Admin; Surendranauth Hiraman; Daniel Siegmann;
user@spark.apache.org
Subject: Re: Play framework
In our case, the Play libraries are not required to run Spark jobs. Hence they are
available only on the master, and Play runs as a regular Scala application. I can't
think