You're welcome, Davies. Just to clarify, I didn't write the post (not sure
if my earlier post gave that impression; apologies if so), although I agree
it's great :-).

-sujit


On Wed, Jul 8, 2015 at 10:36 AM, Davies Liu <dav...@databricks.com> wrote:

> Great post, thanks for sharing with us!
>
> On Wed, Jul 8, 2015 at 9:59 AM, Sujit Pal <sujitatgt...@gmail.com> wrote:
> > Hi Julian,
> >
> > I recently built a Python+Spark application to do search relevance
> > analytics. I use spark-submit to submit PySpark jobs to a Spark cluster
> > on EC2 (so I don't use the PySpark shell, hopefully that's what you are
> > looking for). Can't share the code, but the basic approach is covered in
> > this blog post - scroll down to the section "Writing a Spark Application".
> >
> > https://districtdatalabs.silvrback.com/getting-started-with-spark-in-python
> >
> > Hope this helps,
> >
> > -sujit
> >
> >
> > On Wed, Jul 8, 2015 at 7:46 AM, Julian <julian+sp...@magnetic.com>
> > wrote:
> >>
> >> Hey.
> >>
> >> Is there a resource that has written up what the necessary steps are for
> >> running PySpark without using the PySpark shell?
> >>
> >> I can reverse engineer (by following the tracebacks and reading the
> >> shell source) what the relevant Java imports are, but I would assume
> >> someone has attempted this before and published something I can either
> >> follow or install? If not, I have something that pretty much works and
> >> can publish it, but I'm not a heavy Spark user, so there may be some
> >> things I've left out that I haven't hit because of how little of
> >> PySpark I'm playing with.
> >>
> >> Thanks,
> >> Julian
> >>
> >>
> >>
> >> --
> >> View this message in context:
> >> http://apache-spark-user-list.1001560.n3.nabble.com/PySpark-without-PySpark-tp23719.html
> >> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> >>
> >> ---------------------------------------------------------------------
> >> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> >> For additional commands, e-mail: user-h...@spark.apache.org
> >>
> >
>
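For reference, the standalone-script approach Sujit describes (a plain Python
file handed to spark-submit, rather than code typed into the PySpark shell)
might look roughly like this minimal sketch. The file name, app name, and
input path are illustrative assumptions, not taken from the thread:

```python
# wordcount.py -- a minimal standalone PySpark application (sketch).

def word_pairs(line):
    # Pure helper: split a line into (word, 1) pairs for the reduceByKey step.
    return [(word, 1) for word in line.split()]

def main():
    # Import inside main() so the pure helper above can be tested
    # on a machine without Spark installed.
    from pyspark import SparkConf, SparkContext

    conf = SparkConf().setAppName("WordCount")
    sc = SparkContext(conf=conf)

    counts = (sc.textFile("input.txt")
                .flatMap(word_pairs)
                .reduceByKey(lambda a, b: a + b))

    for word, count in counts.take(10):
        print(word, count)

    sc.stop()

if __name__ == "__main__":
    try:
        main()
    except ImportError:
        # pyspark isn't importable here; this script is meant to be run
        # via spark-submit on a machine where Spark is available.
        pass
```

It is then submitted from the command line instead of the shell, e.g.
`spark-submit --master local[*] wordcount.py` for local testing, or with a
`spark://<master-host>:7077` master URL against a cluster.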
