From my rough high-level view, there is nothing stopping us from adding
broadcast variables to Spark Connect; we essentially have to lift them to
the Spark Session. This would be no different from what we're doing for
artifact management or what we've done for job cancellation.
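
To make this concrete, here is a minimal sketch in PySpark. The first part is
the classic SparkContext-based broadcast that a Connect client cannot use
today (the use case of shipping model weights to executors mentioned below);
the session-level call in the trailing comment is purely hypothetical and only
illustrates what lifting broadcasts to the Spark Session might look like,
analogous to the session-level artifact APIs:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Classic Spark: broadcast through the SparkContext. A Spark Connect
    # session does not expose spark.sparkContext, so this pattern breaks there.
    sc = spark.sparkContext
    weights = {"w0": 0.1, "w1": -0.3}   # e.g. model weights to ship to executors
    bc_weights = sc.broadcast(weights)

    rdd = sc.parallelize(range(4))
    print(rdd.map(lambda x: x * bc_weights.value["w1"]).collect())

    # Hypothetical (not an existing API): a session-level equivalent that a
    # Connect client could call, in the spirit described above:
    #   bc_weights = spark.broadcast(weights)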

If you're interested in working on this, I'm happy to guide a bit.

Martin



On Mon, Oct 14, 2024 at 7:08 PM Deependra Patel <pateldeependr...@gmail.com>
wrote:

> Hi,
> I see that SparkContext methods are not supported in Spark Connect. There
> are many common use cases, e.g. broadcasting machine learning model weights
> to all executors so they don't have to be fetched individually.
>
> This makes migrating workloads to Spark Connect tougher. I know there are
> plans to add more and more functionality to Spark Connect, e.g.
> Spark MLlib.
>
> My question is: is there an ETA for supporting broadcasts in the Spark
> Connect API as well? Or will it not be supported because of how Spark
> Connect is designed (separate JVMs, security, etc.)? Depending on the
> answers and the effort involved, I could also consider implementing it
> myself.
>
> Regards,
> Deependra
>
