I don't like the idea of removing Hadoop 1 unless it becomes a significant 
maintenance burden, which I don't think it is. You'd be surprised how 
many people use old software, even when the various companies behind it no 
longer support it.

With Hadoop 2 in particular, I may be misremembering, but I believe the 
experience on Windows is considerably worse because it requires shell 
scripts to set permissions, which you won't find if you just download Spark. 
That would be one reason to keep Hadoop 1 in the default build. But I could 
be wrong; it's been a while since I tried Windows.

Matei


> On Jun 12, 2015, at 11:21 AM, Sean Owen <so...@cloudera.com> wrote:
> 
> I don't imagine that can be guaranteed to be supported anyway... the
> 0.x branch has never necessarily worked with Spark, even if it might
> happen to. Is this really something you would veto for everyone
> because of your deployment?
> 
> On Fri, Jun 12, 2015 at 7:18 PM, Thomas Dudziak <tom...@gmail.com> wrote:
>> -1 to this, we use it with an old Hadoop version (well, a fork of an old
>> version, 0.23). That being said, if there were a nice developer API that
>> separates Spark from Hadoop (or rather, two APIs, one for scheduling and one
>> for HDFS), then we'd be happy to maintain our own plugins for those.
>> 
>> cheers,
>> Tom
>> 
> 
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
> 

