I think the 1.0 AMI only contains the prebuilt packages (i.e. just the
binaries) of Spark and not the source code. If you want to build Spark on
EC2, you can clone the GitHub repo and then use sbt.
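
For example, something like this should work on the instance (a minimal
sketch, assuming git and a JDK are already installed; the v1.0.1 tag is
just an example):

    git clone https://github.com/apache/spark.git
    cd spark
    git checkout v1.0.1    # or whichever release tag you want to build
    sbt/sbt assembly       # the repo ships its own sbt launcher under sbt/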

Thanks
Shivaram


On Mon, Jul 28, 2014 at 8:49 AM, redocpot <julien19890...@gmail.com> wrote:

> update:
>
> I just checked the Python launch script. When retrieving Spark, it refers
> to this script:
> https://github.com/mesos/spark-ec2/blob/v3/spark/init.sh
>
> where each version number is mapped to a tar file:
>
>     0.9.2)
>       if [[ "$HADOOP_MAJOR_VERSION" == "1" ]]; then
>         wget http://s3.amazonaws.com/spark-related-packages/spark-0.9.2-bin-hadoop1.tgz
>       else
>         wget http://s3.amazonaws.com/spark-related-packages/spark-0.9.2-bin-cdh4.tgz
>       fi
>       ;;
>     1.0.0)
>       if [[ "$HADOOP_MAJOR_VERSION" == "1" ]]; then
>         wget http://s3.amazonaws.com/spark-related-packages/spark-1.0.0-bin-hadoop1.tgz
>       else
>         wget http://s3.amazonaws.com/spark-related-packages/spark-1.0.0-bin-cdh4.tgz
>       fi
>       ;;
>     1.0.1)
>       if [[ "$HADOOP_MAJOR_VERSION" == "1" ]]; then
>         wget http://s3.amazonaws.com/spark-related-packages/spark-1.0.1-bin-hadoop1.tgz
>       else
>         wget http://s3.amazonaws.com/spark-related-packages/spark-1.0.1-bin-cdh4.tgz
>       fi
>       ;;
>
> I just checked the last three tar files. I found the sbt/ directory and
> many other directories (bagel, mllib, etc.) in the 0.9.2 tar file, but
> they are not in the 1.0.0 and 1.0.1 tar files.
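>
> (To reproduce the check, something like the following works; swap in any
> of the tar URLs from init.sh above:)
>
>     wget http://s3.amazonaws.com/spark-related-packages/spark-0.9.2-bin-hadoop1.tgz
>     # list the top-level entries bundled in the tarball
>     tar -tzf spark-0.9.2-bin-hadoop1.tgz | cut -d/ -f2 | sort -u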
>
> I am not sure the 1.0.X versions are mapped to the correct tar files.
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/sbt-directory-missed-tp10783p10784.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
