srowen commented on pull request #29843: URL: https://github.com/apache/spark/pull/29843#issuecomment-699102799
Why does hadoop-3.2 have to become the default? I'm missing that. (It may be a fine idea.)

I see, so some dependencies can _only_ be activated in hadoop-2.7 and not hadoop-3.2, or it fails? OK, that's harder, because you can't un-set them in the hadoop-3.2 profile. We can simply declare that you must now set one of the two profiles to build - that is also valid IMHO.
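To illustrate the constraint (a minimal sketch with made-up profile contents and artifact names, not Spark's actual pom.xml): a Maven profile can only add dependencies, and nothing another profile does can subtract them, so the only lever is which profile is activated.

```xml
<!-- Illustrative sketch only; profile bodies and artifact names are invented. -->
<profiles>
  <profile>
    <id>hadoop-2.7</id>
    <dependencies>
      <!-- A dependency only the hadoop-2.7 build needs. Once declared here,
           no other profile can remove it; it can only be left inactive by
           not activating this profile. -->
      <dependency>
        <groupId>example.group</groupId>
        <artifactId>legacy-only-dep</artifactId>
        <version>1.0</version>
      </dependency>
    </dependencies>
  </profile>
  <profile>
    <id>hadoop-3.2</id>
    <!-- Making this the default via activeByDefault is brittle: Maven
         deactivates activeByDefault profiles as soon as any other profile
         in the same POM is activated with -P. -->
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
  </profile>
</profiles>
```

Requiring an explicit choice would then just mean always passing one of the profiles, e.g. `./build/mvn -Phadoop-3.2 -DskipTests package`.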
