[ https://issues.apache.org/jira/browse/BEAM-1802?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15940274#comment-15940274 ]
ASF GitHub Bot commented on BEAM-1802:
--------------------------------------
GitHub user aviemzur opened a pull request:
https://github.com/apache/beam/pull/2313
[BEAM-1802] Call stop in SparkPipelineResult#waitUntilFinish
Be sure to do all of the following to help us incorporate your contribution quickly and easily:
- [ ] Make sure the PR title is formatted like: `[BEAM-<Jira issue #>] Description of pull request`
- [ ] Make sure tests pass via `mvn clean verify`. (Even better, enable Travis-CI on your fork and ensure the whole test matrix passes).
- [ ] Replace `<Jira issue #>` in the title with the actual Jira issue number, if there is one.
- [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.txt).
---
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/aviemzur/beam finally-stop-spark-pipelines
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/beam/pull/2313.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #2313
----
commit a3384206a04f327f8ca2098ac0ab078486633096
Author: Aviem Zur <[email protected]>
Date: 2017-03-24T12:12:52Z
[BEAM-1802] Call stop in SparkPipelineResult#waitUntilFinish
----
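For context, a minimal sketch of the idea behind the change (this is not the actual patch; the class name, `awaitTermination`, and `stopSparkContext` below are placeholders, not the real SparkPipelineResult internals): waitUntilFinish should release the runner's SparkContext in a finally block, so a later pipeline in the same JVM can create a fresh context.

    // Illustrative sketch only, not the actual Beam patch.
    abstract class SparkPipelineResultSketch {

      // Placeholder for whatever blocks on the pipeline's completion.
      protected abstract void awaitTermination(long timeoutMillis) throws Exception;

      // Placeholder for releasing the underlying JavaSparkContext.
      protected abstract void stopSparkContext();

      public String waitUntilFinish(long timeoutMillis) {
        try {
          awaitTermination(timeoutMillis);
          return "DONE";
        } catch (Exception e) {
          return "FAILED";
        } finally {
          // The change, conceptually: stop the context whether the run succeeded
          // or failed, so sequential pipelines do not trip
          // "Only one SparkContext may be running in this JVM" (SPARK-2243).
          stopSparkContext();
        }
      }
    }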
> Spark Runner does not shut down correctly when executing multiple pipelines in sequence
> ----------------------------------------------------------------------------------------
>
> Key: BEAM-1802
> URL: https://issues.apache.org/jira/browse/BEAM-1802
> Project: Beam
> Issue Type: Bug
> Components: runner-spark
> Reporter: Ismaël Mejía
> Assignee: Aviem Zur
>
> I found this while running the Nexmark queries in sequence in local mode. I
> had the correct configuration but it didn't seem to work.
> 17/03/24 12:07:49 WARN org.apache.spark.SparkContext: Multiple running SparkContexts detected in the same JVM!
> org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
> org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
> org.apache.beam.runners.spark.translation.SparkContextFactory.createSparkContext(SparkContextFactory.java:100)
> org.apache.beam.runners.spark.translation.SparkContextFactory.getSparkContext(SparkContextFactory.java:69)
> org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:206)
> org.apache.beam.runners.spark.SparkRunner.run(SparkRunner.java:91)
> org.apache.beam.sdk.Pipeline.run(Pipeline.java:266)
> org.apache.beam.integration.nexmark.NexmarkRunner.run(NexmarkRunner.java:1233)
> org.apache.beam.integration.nexmark.NexmarkDriver.runAll(NexmarkDriver.java:69)
> org.apache.beam.integration.nexmark.drivers.NexmarkSparkDriver.main(NexmarkSparkDriver.java:46)
> at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2257)
> at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$1.apply(SparkContext.scala:2239)
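The scenario in the report can be sketched roughly as below: two Beam pipelines run back to back in one JVM on the Spark runner. This is only an illustration of the failure mode, not the Nexmark driver code; the trivial Create transforms are assumptions, and whether it reproduces depends on the runner version and options. If the first run's SparkContext is never stopped, the second run fails with the SparkException quoted above.

    import org.apache.beam.runners.spark.SparkPipelineOptions;
    import org.apache.beam.runners.spark.SparkRunner;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class SequentialPipelines {
      public static void main(String[] args) {
        SparkPipelineOptions options = PipelineOptionsFactory.as(SparkPipelineOptions.class);
        options.setRunner(SparkRunner.class);

        // First pipeline: runs to completion, but (before this fix) the runner's
        // JavaSparkContext was left running after waitUntilFinish() returned.
        Pipeline first = Pipeline.create(options);
        first.apply(Create.of(1, 2, 3));
        first.run().waitUntilFinish();

        // Second pipeline in the same JVM: creating another SparkContext here is
        // what triggers "Only one SparkContext may be running in this JVM" (SPARK-2243).
        Pipeline second = Pipeline.create(options);
        second.apply(Create.of(4, 5, 6));
        second.run().waitUntilFinish();
      }
    }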
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)