Hey Holden,

It would be helpful if you could outline the set of features you'd imagine
being part of Spark in a short doc. I didn't see a README on the existing
repo, so it's hard to know exactly what is being proposed.

As a general point of process, we've typically avoided merging modules into
Spark that can exist outside of the project. A testing utility package that
is based on Spark's public APIs seems like a really useful thing for the
community, but it also seems like a good fit for a standalone package. At
least, that's the first question that comes to mind after taking a look at
the project.
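For reference, the kind of utility in question, built only on Spark's public
APIs, might look roughly like the sketch below: a shared-SparkContext trait
that ScalaTest suites can mix in. The trait and its name here are illustrative
only, not spark-testing-base's actual API.

  import org.apache.spark.{SparkConf, SparkContext}
  import org.scalatest.{BeforeAndAfterAll, Suite}

  // Hypothetical sketch of a shared-SparkContext test helper; not the
  // real spark-testing-base API.
  trait SharedLocalSparkContext extends BeforeAndAfterAll { self: Suite =>
    @transient private var _sc: SparkContext = _
    def sc: SparkContext = _sc

    override def beforeAll(): Unit = {
      super.beforeAll()
      val conf = new SparkConf().setMaster("local[2]").setAppName(suiteName)
      _sc = new SparkContext(conf)
    }

    override def afterAll(): Unit = {
      try {
        if (_sc != null) _sc.stop()  // free the local SparkContext after the suite
      } finally {
        super.afterAll()
      }
    }
  }

A test suite would mix this in and use sc directly in its test bodies, which
is exactly the sort of thing that can work fine outside the main repo.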

In any case, getting some high-level view of the functionality you imagine
would make it easier to give more detailed feedback.

- Patrick

On Tue, Oct 6, 2015 at 3:12 PM, Holden Karau <hol...@pigscanfly.ca> wrote:

> Hi Spark Devs,
>
> So this has been brought up a few times before, and generally on the user
> list people get directed to use spark-testing-base. I'd like to start
> moving some of spark-testing-base's functionality into Spark so that people
> don't need a separate library for what is (hopefully :p) a very common
> requirement across all Spark projects.
>
> To that end, I was wondering what people's thoughts are on where this should
> live inside of Spark. I was thinking it could either be a separate testing
> project (like sql or similar), or we could just put the bits that enable
> testing inside each relevant project.
>
> I was also thinking it probably makes sense to only move the unit testing
> parts at the start and leave things like integration testing in a separate
> testing project, since that could vary depending on the user's environment.
>
> What are people's thoughts?
>
> Cheers,
>
> Holden :)
>
