Github user JoshRosen commented on the pull request:
https://github.com/apache/spark/pull/4136#issuecomment-71293352
@ankurdave The exception from the new unit test sounds suspiciously similar
to https://issues.apache.org/jira/browse/SPARK-4133. Your new test creates a
new `sc` local variable and never stops it, so if that test runs first, its
leaked context will keep running and interfere with contexts created in the
other tests.
Our unit tests set `spark.driver.allowMultipleContexts=true` to disable the
multiple-contexts check, because some SparkSQL tests could not pass without
it, so this problem can be hard to notice. If you have a `unit-tests.log`
file, though, I'd take a look to see whether it contains any warning messages
about multiple contexts.
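
For reference, here's a minimal sketch of how that check can be disabled for
a single context via `SparkConf` (the master, app name, and surrounding setup
are placeholders; the shared test config may set this flag elsewhere, e.g.
through the build's system properties):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local")
  .setAppName("example")
  // "true" suppresses the check that normally rejects a second context
  .set("spark.driver.allowMultipleContexts", "true")
val sc = new SparkContext(conf)
```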
I'd check whether those failures still persist after properly cleaning up
the SparkContext created in your new test.
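
For what it's worth, a minimal sketch of the cleanup pattern (the suite and
test names are hypothetical; the point is the `try`/`finally` around
`sc.stop()`):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.FunSuite

class ExampleSuite extends FunSuite {
  test("example that cleans up its context") {
    val sc = new SparkContext(
      new SparkConf().setMaster("local").setAppName("ExampleSuite"))
    try {
      // ... exercise the code under test here ...
      assert(sc.parallelize(1 to 10).count() === 10)
    } finally {
      sc.stop() // runs even if the assertion fails, so the context isn't leaked
    }
  }
}
```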