sunchao commented on pull request #34100:
URL: https://github.com/apache/spark/pull/34100#issuecomment-927345684


   There are two ways to fix this:
   1. Move `spark.yarn.isHadoopProvided` to the Spark parent pom, so that 
`-Phadoop-3.2` can become the default profile in the YARN module's pom. I don't 
see any side effect from this; ideally the property could be made more general, 
e.g. `spark.isHadoopProvided`.
   2. Move `hadoop-client-runtime.artifact` out of the `-Phadoop-3.2` profile. 
This should fix the build issue, but someone using `-Phadoop-provided` to 
test Hadoop 3.2 could still hit a failure.
   
   I'm inclined toward option 1) here, but let me know if you have any thoughts 
on this.
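
   A rough sketch of what option 1) could look like, assuming standard Maven profile conventions; the exact property name and profile wiring in Spark's poms are illustrative, not the actual diff:

   ```xml
   <!-- In the Spark parent pom: define the property once, with a more
        general name (hypothetical) instead of spark.yarn.isHadoopProvided. -->
   <properties>
     <spark.isHadoopProvided>false</spark.isHadoopProvided>
   </properties>

   <profiles>
     <!-- -Phadoop-provided flips the property for all modules. -->
     <profile>
       <id>hadoop-provided</id>
       <properties>
         <spark.isHadoopProvided>true</spark.isHadoopProvided>
       </properties>
     </profile>
   </profiles>

   <!-- In resource-managers/yarn/pom.xml: with the property no longer tied
        to this module, hadoop-3.2 can be made active by default. -->
   <profiles>
     <profile>
       <id>hadoop-3.2</id>
       <activation>
         <activeByDefault>true</activeByDefault>
       </activation>
     </profile>
   </profiles>
   ```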


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
