@mxm @aromanenko-dev thanks! 

I added tests for Flink. Because the Spark runner invokes the code in its 
run() method and creates a SparkContext there (but does not seem to dispose of 
it later), I had trouble writing similar tests for Spark, so I didn't include 
them in this PR. Maybe we can do that later. When I ran similar tests, I got 
the following error: 

> Cannot reuse spark context with different spark master URL. Existing: 
> local[1], requested: spark://localhost:7077. 

I tried to reuse the context (with a `ReuseContextRule.java` class) and also 
tried to stop the context, but all of this led me nowhere. The code I used is 
here: 
https://github.com/apache/beam/compare/master...lgajowy:spark-integration-tests-2?expand=1#diff-9336eb87a4aea9ba0f254a1318f1fc90
 

@mxm @aromanenko-dev could you take a look again? 

[ Full content available at: https://github.com/apache/beam/pull/6244 ]