Hi,

My use case is that I have a long-running process (an orchestrator) with multiple tasks, some of which require extra Spark dependencies. It seems that once the Spark context has started, it's no longer possible to update `spark.jars.packages`? I have reported an issue at https://issues.apache.org/jira/browse/SPARK-38438, together with a workaround (a "hard reset of the cluster", sketched below). I wonder if anyone has a solution for this?

Cheers - Rafal
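For context, a minimal sketch of the kind of "hard reset" workaround I mean is below. It leans on PySpark internals (`SparkContext._gateway` / `SparkContext._jvm`), which are private and may differ between versions, and the package coordinate is just an example; `spark.jars.packages` is only honoured when the backing JVM is launched, so merely calling `spark.stop()` isn't enough.

```python
from pyspark import SparkContext
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Stop the session/context. On its own this leaves the py4j gateway
# (and thus the JVM started with the old --packages) alive, so a new
# session would silently reuse the stale classpath.
spark.stop()

# "Hard reset": shut down the py4j gateway and clear PySpark's cached
# references so the next SparkContext launches a fresh JVM.
# NOTE: these are private PySpark internals, not a public API.
if SparkContext._gateway is not None:
    SparkContext._gateway.shutdown()
SparkContext._gateway = None
SparkContext._jvm = None

# Recreate the session with the updated packages (example coordinate).
spark = (
    SparkSession.builder
    .config("spark.jars.packages", "org.apache.spark:spark-avro_2.12:3.2.1")
    .getOrCreate()
)
```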