If you prefer the py.test framework, I just wrote a blog post with some
examples:
Unit testing Apache Spark with py.test
https://engblog.nextdoor.com/unit-testing-apache-spark-with-py-test-3b8970dc013b
On Fri, Feb 5, 2016 at 11:43 AM, Steve Annessa wrote:
Thanks for all of the responses.
I do have an afterAll that stops the sc.
While looking over Holden's readme, I noticed she mentions "Make sure to
disable parallel execution." That was what I was missing; I added the
following to my build.sbt:
```
parallelExecution in Test := false
```
Now all of my tests pass.
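For reference, the surrounding test settings might look like the sketch below. Forking the test JVM is a common companion setting for Spark test suites; it is an assumption here, not something mentioned in this thread:

```
// build.sbt -- minimal sketch of test settings for a Spark project
parallelExecution in Test := false  // one suite at a time, so SparkContexts don't collide
fork in Test := true                // run tests in a separate JVM from sbt itself
```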
Hi Steve,
Have you looked at the spark-testing-base package by Holden? It’s really useful
for unit testing Spark apps as it handles all the bootstrapping for you.
https://github.com/holdenk/spark-testing-base
DataFrame examples are here:
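As a rough sketch of what spark-testing-base buys you (assuming ScalaTest and spark-testing-base on the test classpath): mixing in SharedSparkContext gives the suite a managed `sc`, so you never create or stop a SparkContext yourself:

```scala
import org.scalatest.FunSuite
import com.holdenkarau.spark.testing.SharedSparkContext

// The trait provides and tears down `sc` across the suite.
class WordCountSpec extends FunSuite with SharedSparkContext {
  test("counting words") {
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(w => (w, 1))
      .reduceByKey(_ + _)
      .collectAsMap()
    assert(counts("a") === 2)
  }
}
```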
Hi Steve,
Have you cleaned up your SparkContext (sc.stop()) in an afterAll()? The
error suggests you are creating more than one SparkContext.
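If you manage the context by hand rather than via a library, a minimal sketch of that pattern with ScalaTest (suite and test names here are hypothetical) looks like this:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

class MySparkSpec extends FunSuite with BeforeAndAfterAll {
  @transient private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    // One SparkContext for the whole suite.
    sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("test"))
  }

  override def afterAll(): Unit = {
    // Stop it so the next suite doesn't hit "multiple SparkContexts".
    if (sc != null) sc.stop()
  }

  test("example") {
    assert(sc.parallelize(1 to 10).sum() === 55)
  }
}
```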
On Fri, Feb 5, 2016 at 10:04 AM, Holden Karau wrote:
Thanks for recommending spark-testing-base :) Just wanted to add: if anyone
has feature requests for Spark testing, please get in touch (or open an issue
on GitHub) :)
On Thu, Feb 4, 2016 at 8:25 PM, Silvio Fiorito <silvio.fior...@granturing.com> wrote: