Thanks Amit, it worked perfectly!

On Wed, May 18, 2016 at 12:33 PM, Amit Sela <[email protected]> wrote:

> You can pass the system property
> "dataflow.spark.test.reuseSparkContext=true" to reuse the context. See:
> https://github.com/apache/incubator-beam/blob/d627266d8d39ff0ec94dc9f3f84893c1026abde7/runners/spark/src/main/java/org/apache/beam/runners/spark/translation/SparkContextFactory.java#L35
>
> On Wed, May 18, 2016 at 1:28 PM Ismaël Mejía <[email protected]> wrote:
>
>> Hello,
>>
>> I am trying to run a set of tests that use the Spark runner. I build the
>> Pipeline in a setUp method and then reuse it in different tests; however,
>> when it is invoked for the second time it throws an exception:
>>
>> java.lang.RuntimeException: org.apache.spark.SparkException: Only one
>> SparkContext may be running in this JVM (see SPARK-2243). To ignore this
>> error, set spark.driver.allowMultipleContexts = true. The currently running
>> SparkContext was created at:
>>
>> Do you know how I can pass such a variable to the runner, or whether I can
>> work around this issue in another way?
>>
>> -Ismael
>>
>
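[Editor's note] For readers hitting the same error, a minimal sketch of Amit's suggestion: set the system property before any test builds a pipeline, so the Spark runner's SparkContextFactory reuses a single context instead of creating a new one per test. The property name comes from the SparkContextFactory source linked above; the JUnit-style setup method shown here is illustrative. The property can equally be passed on the command line (e.g. `-Ddataflow.spark.test.reuseSparkContext=true`).

```java
// Sketch: enable SparkContext reuse for tests, assuming the
// "dataflow.spark.test.reuseSparkContext" property from the linked
// SparkContextFactory. The test class and method names are hypothetical.
public class SparkRunnerTestSetup {

    // In JUnit this would typically be annotated with @BeforeClass so it
    // runs once, before any test creates a pipeline.
    public static void enableContextReuse() {
        System.setProperty("dataflow.spark.test.reuseSparkContext", "true");
    }

    public static void main(String[] args) {
        enableContextReuse();
        // Confirm the flag is visible to code that reads it later.
        System.out.println(
            System.getProperty("dataflow.spark.test.reuseSparkContext"));
    }
}
```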
