Hi,
I am just curious to know what the differences are between the prebuilt
packages for Hadoop 1, Hadoop 2, CDH, etc.
I am using a Spark standalone cluster and we don't use Hadoop at all.
Can we use any one of the pre-built packages, or do we have to run the
make-distribution.sh script from the source code?
Thanks,
--
The HDFS client keeps changing in ways that break compatibility, hence all
the build versions. If you don't use HDFS/YARN, then you can safely ignore
the Hadoop version and pick any of the prebuilt packages.
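If you did want to build your own distribution instead, a minimal sketch of
invoking make-distribution.sh from a Spark source checkout might look like
the following. The exact flags (e.g. --tgz, --hadoop) vary between Spark
releases, so treat this as an assumption and check the script's own help
output for your version:

```shell
# Assumed invocation for Spark 1.x-era source trees; flags may differ in
# your release -- run ./make-distribution.sh --help to confirm.
cd spark                       # root of the Spark source checkout
./make-distribution.sh --tgz --hadoop 2.4.0
# Produces a dist/ directory (and a .tgz archive) you can deploy to the
# standalone cluster nodes.
```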
Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi https://twitter.com/mayur_rustagi
On Tue, Jun 24, 2014 at 12:16 PM,