I've downloaded Spark 1.2.0 to my laptop. In the lib directory, it includes 
spark-assembly-1.2.0-hadoop2.4.0.jar

When I spin up a cluster using the ec2 scripts with 1.2.0 (and set 
--hadoop-major-version=2), I notice that the lib directory on the 
master/slaves contains an assembly built for hadoop2.0.0 (and, I believe, a 
Cloudera build).
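For reference, here is roughly the launch command I'm using. The 
--hadoop-major-version flag is the one mentioned above; the key pair, 
identity file, and cluster name are placeholders for my actual values.

```shell
# Sketch of the spark-ec2 launch (key pair / paths / cluster name are placeholders)
./ec2/spark-ec2 \
  --key-pair=my-keypair \
  --identity-file=~/.ssh/my-keypair.pem \
  --hadoop-major-version=2 \
  launch my-spark-cluster
```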

Is there a way I can force the cluster to install the same assembly that 
ships with the 1.2 download of Spark?

Thanks.

Darin.
