Amit Sela commented on BEAM-769:

From my experience, those failures happen when a "bad" context is reused - 
since the context is reused in Spark tests, if it failed and closed for some 
reason, the tests will keep reusing that closed context, and that's how it 
ends up failing with the NullPointerException.
I fixed this while working on BEAM-259; this is the fix: 
I could patch this up as part of this ticket, since BEAM-259 is still in 
review, but it will have to wait until tomorrow morning (IDT).
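For context, a minimal sketch of the kind of guard described above - recreating the shared test context when it has already been stopped instead of handing back the dead instance. The class and method names here are illustrative only; this is not the actual BEAM-259 patch:
{code}
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

// Hypothetical test-scoped provider: refuses to reuse a JavaSparkContext
// that was stopped by an earlier (failed) test, which otherwise leads to
// NullPointerExceptions in later tests.
public final class TestSparkContextProvider {

  private static JavaSparkContext context;

  public static synchronized JavaSparkContext getOrCreate(SparkConf conf) {
    // SparkContext#isStopped reports whether the context has been shut down;
    // if so, create a fresh one rather than reusing the closed context.
    if (context == null || context.sc().isStopped()) {
      context = new JavaSparkContext(conf);
    }
    return context;
  }

  public static synchronized void stop() {
    if (context != null) {
      context.stop();
      context = null;
    }
  }
}
{code}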

> Spark translation tests failing with NullPointerException
> ---------------------------------------------------------
>                 Key: BEAM-769
>                 URL: https://issues.apache.org/jira/browse/BEAM-769
>             Project: Beam
>          Issue Type: Bug
>          Components: runner-spark
>    Affects Versions: Not applicable
>            Reporter: Daniel Halperin
>            Assignee: Amit Sela
> https://builds.apache.org/job/beam_PreCommit_MavenVerify/4071/
> {code}
> org.apache.beam.runners.spark.translation.streaming.EmptyStreamAssertionTest.testFixedWindows
> org.apache.beam.runners.spark.translation.streaming.FlattenStreamingTest.testFlattenUnbounded
> org.apache.beam.runners.spark.translation.streaming.KafkaStreamingTest.testRun
> org.apache.beam.runners.spark.translation.streaming.SimpleStreamingWordCountTest.testFixedWindows
> {code}
> Amit, can you please take a look?
