Github user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/969#issuecomment-46037439
  
    Ok, I haven't had time to look at that in depth. Unfortunately the configs 
and related things have become a bit of a mess because everything changed to 
spark-submit and switched to using SparkConf all at once.
    I think at one point not all of the configs were being set in SparkConf; 
some were still just system properties, and some of those were set after the 
SparkConf was created. I think that may be why it was done that way, but I need 
to look at it in more depth.
    
    I don't really see how that fixes this issue, though. That may forward the 
configs on, but in yarn-cluster mode nothing is done with the 
spark.yarn.dist.* configs without witgo's change, i.e. it doesn't populate 
archives or files in any way even though the configs are there. You need some 
code to tell it to do so.
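    Just to illustrate what I mean, here is a minimal sketch (not the actual 
code from the PR; the object, method, and parameter names are made up) of 
reading spark.yarn.dist.files / spark.yarn.dist.archives out of SparkConf and 
merging them into the comma-separated lists that yarn-cluster mode actually 
distributes:

```scala
import org.apache.spark.SparkConf

object DistConfSketch {
  // Hypothetical helper: merge spark.yarn.dist.* values into whatever
  // --files / --archives values were already supplied on the command line.
  def mergeDistConfigs(
      conf: SparkConf,
      cliFiles: Option[String],
      cliArchives: Option[String]): (Option[String], Option[String]) = {
    def merge(a: Option[String], b: Option[String]): Option[String] =
      (a.toSeq ++ b.toSeq).reduceOption(_ + "," + _)

    val files = merge(cliFiles, conf.getOption("spark.yarn.dist.files"))
    val archives = merge(cliArchives, conf.getOption("spark.yarn.dist.archives"))
    (files, archives)
  }

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf(false)
      .set("spark.yarn.dist.files", "hdfs:///tmp/data.txt")
      .set("spark.yarn.dist.archives", "hdfs:///tmp/deps.zip")
    // Without something along these lines, yarn-cluster mode never looks at
    // the spark.yarn.dist.* values even though they are present in the conf.
    println(mergeDistConfigs(conf, None, Some("local-archive.zip")))
  }
}
```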

