Hi Mark,

The Spark build components and install scripts still check that SCALA_HOME is defined.

I'll try dropping the SCALA_HOME check and see if they build and run
without issues, then follow up with a JIRA.
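For context, the sort of guard I expect to remove looks roughly like this (a
hypothetical sketch, not copied from the actual Bigtop scripts -- the function
name and path are made up):

```shell
# Hypothetical sketch of the kind of guard the install scripts carry:
# fail early when SCALA_HOME is unset.
require_scala_home() {
  if [ -z "${SCALA_HOME:-}" ]; then
    echo "SCALA_HOME is not set" >&2
    return 1
  fi
  echo "Using Scala at $SCALA_HOME"
}

# Since Spark >= 1.1.0 pulls Scala in via Maven, a guard like this
# should be safe to drop from the packaging scripts.
SCALA_HOME=/opt/scala
require_scala_home
```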

Youngwoo, you worked on the Spark RPMs most recently -- do you foresee any 
issues?

Thanks,
RJ

> On Sep 2, 2015, at 12:46 PM, Mark Grover <[email protected]> wrote:
> 
> Hey RJ,
> Actually, I don't think it does anymore.
> 
> A change that landed in Spark 1.1.0 removed that requirement:
> https://github.com/apache/spark/commit/d8c005d5371f81a2a06c5d27c7021e1ae43d7193
> 
>> On Wed, Sep 2, 2015 at 10:19 AM, RJ Nowling <[email protected]> wrote:
>> 
>> Hi all,
>> 
>> I noticed that building the Spark RPMs required SCALA_HOME to be set.
>> 
>> I'm confused about this since building Spark from source doesn't require
>> SCALA_HOME -- Scala libraries are automatically downloaded and packaged by
>> maven.
>> 
>> Does BigTop create a separate Scala package?  Or does Debian provide a
>> Scala package?
>> 
>> Thanks!
>> RJ
>> 
