Hi Ashic,

Yes, there is one - the local-cluster[N, cores, memory] master URL - which
simulates a Spark cluster locally with N workers, each with the given number
of cores and amount of memory (in MB).

https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L2478
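For reference, a minimal sketch of how one might use it (the 2 workers / 1
core / 1024 MB values are illustrative, and local-cluster mode typically
expects a Spark distribution available, e.g. via SPARK_HOME, to launch the
worker JVMs):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .master("local-cluster[2, 1, 1024]") // 2 workers, 1 core each, 1024 MB each
  .appName("local-cluster-demo")
  .getOrCreate()

// Unlike local[N], tasks are serialized and shipped to separate worker JVMs,
// so "Task not serializable" errors surface much like on a real cluster.
spark.sparkContext.parallelize(1 to 10).map(_ * 2).collect()

spark.stop()
```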

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Wed, Aug 10, 2016 at 10:24 AM, Ashic Mahtab <as...@live.com> wrote:
> Hi,
> Is there a way to simulate "networked" spark when running local (i.e.
> master=local[4])? Ideally, some setting that'll ensure any "Task not
> serializable" errors are caught during local testing? I seem to vaguely
> remember something, but am having trouble pinpointing it.
>
> Cheers,
> Ashic.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org
