Hi.

Can I add jars to the Spark executor classpath in a running context?
Basically, if I have a running Spark session and I edit spark.jars in
the middle of the code, will it pick up the changes?

If not, is there any way to add new dependent jars to a running Spark
context?
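
Roughly what we're attempting, as a sketch (the jar path below is just a
placeholder). SparkContext.addJar is the closest API we've found, but
it's not clear to us whether it affects executors that are already
running:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().getOrCreate()

// What we tried first: changing spark.jars mid-session. As far as we
// can tell, that config is only read when the session/executors start,
// so editing it later doesn't seem to redistribute anything.

// addJar ships the jar to executors for tasks submitted afterwards;
// does it also cover the driver classpath, or already-running tasks?
spark.sparkContext.addJar("/path/to/extra-dependency.jar")  // placeholder path
```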

We’re using Livy to keep the session up.

Thanks.
