Spawned by this discussion
https://github.com/apache/spark/pull/1120#issuecomment-54305831.
See these 2 lines in spark_ec2.py:
- spark_ec2 L42: https://github.com/apache/spark/blob/6a72a36940311fcb3429bd34c8818bc7d513115c/ec2/spark_ec2.py#L42
- spark_ec2 L566
The spark-ec2 repository isn't a part of Mesos. Back in the day, Spark
was also hosted in the Mesos GitHub organization, so we put the scripts
that Spark used under the same organization.
FWIW, I don't think these scripts belong in the Spark repository. They
are helper scripts.
that's not a bad idea. it would also break the circular version
dependency that results in spark X's ec2 script installing spark X-1 by
default.
best,
matt
On 09/03/2014 01:17 PM, Shivaram Venkataraman wrote:
> The spark-ec2 repository isn't a part of Mesos. Back in the days, Spark
> used to be
Actually the circular dependency doesn't depend on the spark-ec2 scripts --
The scripts contain download links to many Spark versions and you can
configure which one should be used.
Shivaram
On Wed, Sep 3, 2014 at 10:22 AM, Matthew Farrellee m...@redhat.com wrote:
> that's not a bad idea. it
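Shivaram's point, that the scripts carry download links for many Spark versions and let you pick one, can be sketched roughly like this. This is a hypothetical illustration, not code from spark_ec2.py: the function name, the version set, and the fallback behavior are all assumptions; only the Apache archive URL layout is real.

```python
# Hypothetical sketch of version-to-download-URL resolution.
# VALID_SPARK_VERSIONS and the fallback policy are assumed, not
# copied from spark_ec2.py.
VALID_SPARK_VERSIONS = {"0.9.2", "1.0.2", "1.1.0"}  # illustrative subset
DEFAULT_SPARK_VERSION = "1.1.0"                     # illustrative default

def spark_download_url(version):
    """Return a prebuilt-package URL for the requested Spark version,
    falling back to a known-good default for unrecognized versions."""
    if version not in VALID_SPARK_VERSIONS:
        version = DEFAULT_SPARK_VERSION
    return ("http://archive.apache.org/dist/spark/"
            "spark-{v}/spark-{v}-bin-hadoop1.tgz".format(v=version))
```

A user-facing `--spark-version` flag would feed straight into a lookup like this, which is why the circular dependency doesn't actually depend on where the scripts live.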
oh, i see pwendell did a patch to the release branch to make the
release version == --spark-version default
best,
matt
On 09/03/2014 01:30 PM, Shivaram Venkataraman wrote:
> Actually the circular dependency doesn't depend on the spark-ec2 scripts
> -- The scripts contain download links to
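The fix matt refers to amounts to pinning the script's default to the version it ships with, so spark X's ec2 script installs spark X rather than X-1 unless the user overrides it. A minimal sketch, with an assumed version string and hypothetical names:

```python
# Hypothetical sketch of pinning the default to the enclosing release.
# The constant would be bumped by the release process on each branch.
SPARK_RELEASE_VERSION = "1.1.0"  # assumed; the version this script ships with

def resolved_spark_version(requested=None):
    """Honor an explicit --spark-version if given; otherwise default
    to the same version this script was released with."""
    return requested if requested else SPARK_RELEASE_VERSION
```

With this policy the default tracks the release branch automatically, which is what makes the "spark X installs spark X-1" circularity go away.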