Github user ryan-williams commented on the pull request:
https://github.com/apache/spark/pull/3918#issuecomment-68962022
Interesting. So the profiles in the parent POM will override the default
`hadoop.version` there (and `yarn.version` is set from `hadoop.version` only
_after_ the profiles have had a chance to override it), but they will not
override defaults that are set in child POMs? So the order is roughly:
1. set `hadoop.version` to `1.0.4` by default (parent POM)
2. let a specific profile override this default (parent POM)
3. set `yarn.version <- hadoop.version`
4. let child POMs further override `{hadoop,yarn}.version`
5. go fetch appropriate dependencies based on the last values of `{hadoop,yarn}.version`
?
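
To make that ordering concrete, here's a minimal sketch of the parent-POM layout being described (the `hadoop-2.4` profile id and `2.4.0` value are illustrative, not copied from Spark's actual pom.xml):

```xml
<!-- Parent POM sketch (illustrative only; not Spark's actual pom.xml). -->
<properties>
  <!-- (1) default, used when no profile overrides it -->
  <hadoop.version>1.0.4</hadoop.version>
  <!-- (3) interpolated lazily, so it picks up any profile override of hadoop.version -->
  <yarn.version>${hadoop.version}</yarn.version>
</properties>

<profiles>
  <profile>
    <!-- (2) activating this profile (e.g. -Phadoop-2.4) overrides the default -->
    <id>hadoop-2.4</id>
    <properties>
      <hadoop.version>2.4.0</hadoop.version>
    </properties>
  </profile>
</profiles>
```

If so, the reason step 2 can take effect "before" step 3, even though both properties are declared together, would be that Maven interpolates `${hadoop.version}` lazily, after profile activation; the resolved values can be checked with something like `mvn -Phadoop-2.4 help:evaluate -Dexpression=yarn.version`.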