Hi,

I am a beginner with Apache Spark.

Can anyone let me know whether it is mandatory to build Spark against the Hadoop
version I am using, or whether I can use a pre-built package with my existing
HDFS root folder?
I am using Hadoop 2.5.0 and want to use Apache Spark 1.2.0 with it.
I can see a pre-built version for Hadoop 2.4 and later in the downloads section
of the Spark homepage.
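
For reference, this is roughly what I am hoping to run from the pre-built
package, pasted into spark-shell (where sc is already provided). The namenode
host, port, and file path below are just placeholders, not my real cluster
settings:

    // Rough sketch: read a file from my existing Hadoop 2.5.0 HDFS
    // using the pre-built Spark 1.2.0 distribution.
    // "namenode-host:8020" and the path are placeholder values.
    val lines = sc.textFile("hdfs://namenode-host:8020/user/siddharth/sample.txt")
    println("Line count: " + lines.count())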

Siddharth Ubale,
Synchronized Communications
#43, Velankani Tech Park, Block No. II,
3rd Floor, Electronic City Phase I,
Bangalore – 560 100
Tel: +91 80 3202 4060
Web: www.syncoms.com
London|Bangalore|Orlando

we innovate, plan, execute, and transform the business
