Hi,
My use case: I have a long-running process (an orchestrator) with multiple
tasks, and some of those tasks require extra Spark dependencies. It seems
that once the Spark context has been started it is no longer possible to
update `spark.jars.packages`; is that correct? I have reported this at
https://issues.apache.org/jira/browse/SPARK-38438, together with a
workaround (a "hard reset" of the cluster). Does anyone have a better
solution?
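
For context, here is a minimal sketch of the in-process part of what I mean
by a reset: stopping the existing session and building a new one with the
extra packages. The app name and the Maven coordinate below are just example
placeholders, not the actual setup:

    from pyspark.sql import SparkSession

    # Initial session started by the orchestrator, without the extra dependency.
    spark = SparkSession.builder.appName("orchestrator").getOrCreate()

    # Later, a task needs an extra package. Changing spark.jars.packages on the
    # running session has no effect, so the session has to be torn down first.
    spark.stop()

    # Recreate the session with the additional package declared up front.
    spark = (
        SparkSession.builder
        .appName("orchestrator")
        # Example coordinate only; in practice it comes from the task's requirements.
        .config("spark.jars.packages", "org.apache.spark:spark-avro_2.12:3.2.1")
        .getOrCreate()
    )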
Cheers - Rafal
