GitHub user budde commented on the issue:
https://github.com/apache/spark/pull/16942
Tests appear to be failing due to the following error:
```
[info] Exception encountered when attempting to run a suite with class name: org.apache.spark.sql.streaming.FileStreamSourceSuite *** ABORTED *** (0 milliseconds)
[info] org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.sql.execution.SQLExecutionSuite$$anonfun$3.apply(SQLExecutionSuite.scala:107)
...
```
I don't think anything in this PR should've changed the behavior of core SQL tests, but I'll look into this.
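
For reference, this error usually means an earlier suite (here SQLExecutionSuite, per the stack trace) left its SparkContext running when a later suite tried to create a new one in the same JVM. Below is a minimal sketch of the lifecycle pattern that avoids it; the suite name and configuration are hypothetical and not taken from this PR:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.{BeforeAndAfterAll, FunSuite}

// Hypothetical suite illustrating the stop-in-afterAll pattern so the next
// suite in the same JVM can create its own SparkContext.
class ExampleContextLifecycleSuite extends FunSuite with BeforeAndAfterAll {

  private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    super.beforeAll()
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("ExampleContextLifecycleSuite")
    sc = new SparkContext(conf)
  }

  override def afterAll(): Unit = {
    try {
      if (sc != null) {
        sc.stop() // release the JVM-wide singleton before the next suite runs
      }
    } finally {
      super.afterAll()
    }
  }

  test("context is usable") {
    assert(sc.parallelize(1 to 10).count() === 10)
  }
}
```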