Re: How to build Spark with my own version of Hadoop?
As you know, the Hadoop version and related settings live in the Spark build files; IIRC the top-level pom.xml has all the Maven properties for dependency versions. So I think if you just build Hadoop locally (i.e. give it a version like 2.2.1234-SNAPSHOT and `mvn install` it), you should be able to change the corresponding version property in Spark's build to point at your local snapshot.
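A rough sketch of that workflow, under the assumptions above (the 2.2.1234-SNAPSHOT version string is illustrative, and the exact property/profile names can vary between Spark releases, so check your top-level pom.xml):

```shell
# In your modified Hadoop tree: give it a distinctive version and
# install the artifacts into the local Maven repository (~/.m2).
# versions:set is one way to do this; editing the POMs by hand works too.
mvn versions:set -DnewVersion=2.2.1234-SNAPSHOT
mvn install -DskipTests

# In the Spark tree: build against that version by overriding the
# hadoop.version property defined in the top-level pom.xml.
mvn -Dhadoop.version=2.2.1234-SNAPSHOT -DskipTests clean package
```

Because the snapshot sits in your local ~/.m2 repository, Maven resolves it before going to the remote repositories, so Spark links against your modified Hadoop.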
How to build Spark with my own version of Hadoop?
Hi, I have modified some Hadoop code and want to build Spark against the modified version of Hadoop. Do I need to change the compilation dependency files? If so, how? Many thanks!