I think that @holdenk's *spark-testing-base* project publishes some of
these test classes as well as some helper classes for testing streaming
jobs: https://github.com/holdenk/spark-testing-base
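
For reference, the trait under discussion really is small. A minimal sketch of a shared-context base suite, assuming Spark and ScalaTest are on the test classpath (names here are illustrative, not the exact upstream code):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, Suite}

// Mix this into a test suite to get one SparkContext per suite,
// started before the first test and stopped after the last one.
trait SharedSparkContext extends BeforeAndAfterAll { self: Suite =>

  @transient private var _sc: SparkContext = _
  def sc: SparkContext = _sc

  override def beforeAll(): Unit = {
    super.beforeAll()
    val conf = new SparkConf()
      .setMaster("local[2]")          // local mode, 2 threads
      .setAppName(getClass.getSimpleName)
    _sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    try {
      // Stop the context so the next suite can create its own;
      // Spark dislikes multiple live contexts in one JVM.
      if (_sc != null) _sc.stop()
      _sc = null
    } finally {
      super.afterAll()
    }
  }
}
```

A suite would then just extend it, e.g. `class MyJobSuite extends AnyFunSuite with SharedSparkContext`, and use `sc` directly in tests.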

On Thu, May 21, 2015 at 10:39 PM, Reynold Xin <r...@databricks.com> wrote:

> It is just 15 lines of code to copy, isn't it?
>
> On Thu, May 21, 2015 at 7:46 PM, Nathan Kronenfeld <
> nkronenfeld@uncharted.software> wrote:
>
>> see discussions about Spark not really liking multiple contexts in the
>>> same JVM
>>>
>>
>> Speaking of this - is there a standard way of writing unit tests that
>> require a SparkContext?
>>
>> We've ended up copying the code of SharedSparkContext into our own
>> testing hierarchy, but it occurs to me that someone would have published a
>> test jar by now if that were the best way.
>>
>>           -Nathan
>>
>>
>
