Github user ryan-williams commented on the pull request:

    https://github.com/apache/spark/pull/3918#issuecomment-68963588
  
    Makes sense. Looking at the POMs again, it seems I need to revise my 
mental model a little further.
    
    Step 3 above (`yarn.version` assigned to value of `hadoop.version`) happens 
[right 
alongside](https://github.com/apache/spark/blob/bb38ebb1abd26b57525d7d29703fd449e40cd6de/pom.xml#L122-L124)
 Step 1 (`hadoop.version` given a default of `1.0.4`).
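    
    Per the comment above, the linked lines declare both properties together, 
roughly like this (a paraphrase of the linked POM region, not an exact quote):
    
    ```xml
    <properties>
      <!-- default when no profile or -D flag overrides it -->
      <hadoop.version>1.0.4</hadoop.version>
      <!-- yarn.version tracks whatever hadoop.version resolves to -->
      <yarn.version>${hadoop.version}</yarn.version>
    </properties>
    ```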
    
    Based on that, it seems like the process must be more like:
    
    1. set properties based on `-Dhadoop.version=...`-style system args.
    1. set _any remaining unset properties_ based on `<profile>` declarations.
    1. set _further remaining unset properties_ based on the defaults in 
`<properties>`.
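    
    As a sanity check on that ordering, here is a toy model of the three 
steps (this is my own sketch of how the precedence would behave, not Maven's 
actual resolution code):
    
    ```python
    def resolve_properties(cli_props, profile_props, default_props):
        """Toy model: -D system args win over active-profile properties,
        which win over <properties> defaults."""
        resolved = dict(default_props)   # step 3: defaults fill last
        resolved.update(profile_props)   # step 2: profiles override defaults
        resolved.update(cli_props)       # step 1: -Dfoo=bar overrides all
        return resolved

    # e.g. -Dhadoop.version=2.4.0 should beat both a profile and the default
    props = resolve_properties(
        cli_props={"hadoop.version": "2.4.0"},
        profile_props={"hadoop.version": "2.2.0"},
        default_props={"hadoop.version": "1.0.4"},
    )
    ```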
    
    Given that ordering, I can see a few ways Maven might parse child POMs 
and fold in the values it finds there at different steps, any of which could 
produce the wrong behavior we're seeing here.
    
    In any case, I'll close this out and get back to you on the JIRA, unless 
you have any other ruminations to offer about Maven-prop-value-resolution 
arcana. Thanks!


