In several previous versions of Spark (and, I believe, the current version, 1.0.0, as well), we have noticed that it does not seem possible to have both a "local" SparkContext and a SparkContext connected to a cluster, whether via the Spark standalone cluster manager or via YARN (see the sketch below). Is this a known issue? If not, I would be happy to file a bug report describing the unexpected behavior and how to reproduce it. If so, are there plans to fix it? Perhaps there is a Jira issue or other related discussion someone could point me to.
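
For concreteness, here is a minimal sketch of the kind of usage I have in mind (the master URL and app names below are placeholders, not taken from an actual failing run):

import org.apache.spark.{SparkConf, SparkContext}

// First context: local mode, e.g. for tests or quick sanity checks.
val localSc = new SparkContext(
  new SparkConf().setMaster("local[*]").setAppName("local-app"))

// Second context in the same JVM, pointed at a cluster. The master URL
// is a placeholder for a real standalone master; for YARN one would
// use "yarn-client" instead.
val clusterSc = new SparkContext(
  new SparkConf().setMaster("spark://master-host:7077").setAppName("cluster-app"))

Creating the second context alongside the first is where the unexpected behavior shows up.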

Thanks,
Philip
