Github user vanzin commented on the issue:

    https://github.com/apache/spark/pull/15009
  
    Sorry I haven't had the time to fully review this, but I wanted to make 
some comments about the SPARK-11035 part of this bug.
    
    The difficulty of that bug is not the implementation; yes, that part is just a matter of using reflection. The hard part is the semantics of what it means to launch multiple applications in the same JVM. And this goes beyond just what happens when you create multiple SparkContexts at the same time (which will result in an exception).
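
    To make the concurrent case concrete, here is a minimal sketch (the object name is just for illustration); the second constructor call below throws a SparkException along the lines of "Only one SparkContext may be running in this JVM" while the first context is still active:

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}

    object TwoContexts {
      def main(args: Array[String]): Unit = {
        val first = new SparkContext(
          new SparkConf().setMaster("local[*]").setAppName("first"))

        // While `first` is still active, this throws a SparkException
        // because only one SparkContext may be running per JVM.
        val second = new SparkContext(
          new SparkConf().setMaster("local[*]").setAppName("second"))
      }
    }
    ```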
    
    For example, because SparkSubmit propagates configs to the actual class 
being run using system properties, you can't launch multiple applications 
without risking the configs getting mixed up.
    
    For instance, if you launch an app with `spark.dynamicAllocation.enabled=false` and, after it finishes, launch a second app without explicitly setting that option, the second app "inherits" the config from the first one, because the value is still set in the system properties.
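
    For concreteness, here is a minimal sketch of that leak (the object name is just for illustration). It relies on the fact that SparkConf's default constructor loads every system property starting with `spark.`:

    ```scala
    import org.apache.spark.SparkConf

    object ConfigLeak {
      def main(args: Array[String]): Unit = {
        // First launch: SparkSubmit copies the submitted configs into
        // JVM system properties before invoking the user class.
        sys.props("spark.dynamicAllocation.enabled") = "false"
        // ... first app runs and finishes ...

        // Second launch in the same JVM, which never set that option:
        // new SparkConf() (loadDefaults = true) reads every system
        // property starting with "spark.", so the stale value leaks in.
        val conf = new SparkConf()
        println(conf.get("spark.dynamicAllocation.enabled")) // prints "false"
      }
    }
    ```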
    
    So in my view, to properly implement SPARK-11035 you first have to solve that problem; otherwise it's too easy for people to run into weird issues.

