Qing Yang, Andy's answer to your direct question is correct.

At the same time, depending on your context, you may be able to apply a
pattern where you turn the single Spark application into a service, and
multiple clients of that service can indeed share access to the same RDDs.
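
For concreteness, here is a minimal sketch of that pattern in Scala, under
the assumption that all clients go through one long-lived driver: a single
SparkContext is wrapped in a service object that caches RDDs under names,
and clients look RDDs up by name instead of creating their own contexts.
The names RddService and getOrCreate are purely illustrative, not an
existing Spark API.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.rdd.RDD
    import scala.collection.concurrent.TrieMap

    // Hypothetical service wrapper: one SparkContext, shared named RDDs.
    class RddService(sc: SparkContext) {
      // Thread-safe registry of named, cached RDDs shared by all clients.
      private val registry = TrieMap.empty[String, RDD[_]]

      // Return the RDD registered under `name`, building and caching it on first use.
      def getOrCreate[T](name: String)(build: SparkContext => RDD[T]): RDD[T] =
        registry.getOrElseUpdate(name, build(sc).cache()).asInstanceOf[RDD[T]]
    }

    object RddServiceExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("rdd-service").setMaster("local[*]"))
        val service = new RddService(sc)

        // Two "clients" (here just two call sites; in practice requests coming
        // in over REST, Thrift, etc.) resolve the same cached RDD by name.
        val a = service.getOrCreate("numbers")(ctx => ctx.parallelize(1 to 1000))
        val b = service.getOrCreate("numbers")(ctx => ctx.parallelize(1 to 1000))

        println(a eq b)            // true: both clients hold the same RDD instance
        println(b.count() == 1000) // computed once, reused from the cache afterwards

        sc.stop()
      }
    }

The sharing here all happens inside one driver and one SparkContext;
separate applications with their own contexts still cannot see each other's
RDDs, which is where Andy's point about Tachyon comes in.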

Several groups have built apps based on this pattern, and we will also show
something with this behavior at the upcoming Spark Summit (multiple users
collaborating on named DDFs with the same underlying RDDs).

Sent while mobile. Pls excuse typos etc.
On May 18, 2014 9:40 AM, "Andy Konwinski" <andykonwin...@gmail.com> wrote:

> RDDs cannot currently be shared across multiple SparkContexts without using
> something like the Tachyon project (which is a separate project/codebase).
>
> Andy
> On May 16, 2014 2:14 PM, "qingyang li" <liqingyang1...@gmail.com> wrote:
