At least a warning. I’ll do that when I get svn rights.
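
Something along these lines is what I have in mind (just a sketch in Scala; 
exactly where it would go in mahoutSparkContext and the message wording are 
illustrative, not a final patch):

{code}
// Sketch only: fail fast with a clear message when SPARK_HOME is missing,
// instead of letting the classpath query silently come back empty later on.
val sparkHome = sys.env.getOrElse("SPARK_HOME",
  throw new IllegalStateException(
    "SPARK_HOME is not set; point it at a Spark 0.9.1 install before creating a Mahout Spark context."))
{code}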

I think you only need the “HOME” env vars as an expediency. Hadoop deprecated 
HADOOP_HOME and I thought that was the way things were going. 

I put Spark on my path by setting a symlink at /usr/local/spark, so changing 
versions from 0.9.0 to 0.9.1 is just an “ln -s” away. We may be able to reliably 
detect the current home with something like “which”. If I get time I’ll look at 
how Hadoop does it. 
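
Roughly, the “which”-style detection I have in mind would look something like 
this (a sketch only; detectSparkHome is a made-up name, and it assumes 
spark-shell lives on the PATH under $SPARK_HOME/bin):

{code}
import java.nio.file.{Files, Path, Paths}

// Sketch: find spark-shell on the PATH and resolve symlinks
// (e.g. /usr/local/spark -> /usr/local/spark-0.9.1) to locate the live install.
def detectSparkHome(): Option[Path] = {
  val pathEntries = sys.env.getOrElse("PATH", "").split(java.io.File.pathSeparator)
  pathEntries.iterator
    .map(dir => Paths.get(dir, "spark-shell"))
    .find(p => Files.isExecutable(p))
    .map(_.toRealPath())          // follow the symlink chain
    .map(_.getParent.getParent)   // .../bin/spark-shell -> install root
}
{code}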

I don’t see a page in the dev section of the wiki for Spark setup so I'll start 
that too. 

On May 5, 2014, at 12:09 PM, Dmitriy Lyubimov <[email protected]> wrote:

It does seem, however, that not setting SPARK_HOME should be caught and
reported as an error, and currently it is not. If you can fix that, it would
be a viable fix.


On Mon, May 5, 2014 at 12:08 PM, Dmitriy Lyubimov <[email protected]> wrote:

> At this point it is just that we have MAHOUT_HOME and SPARK_HOME set up,
> and Spark must be a specific version (0.9.1).
> 
> If we knew how to determine Spark's classpath without knowing SPARK_HOME,
> we could drop the requirement for having SPARK_HOME, but so far I have not
> been able to see how. For one, a person may have 3 or 4 versions of Spark
> set up at the same time (wink-wink, yours truly is the case here), so I
> can't see how asking for SPARK_HOME can be waived.
> 
> 
> On Mon, May 5, 2014 at 11:37 AM, Pat Ferrel (JIRA) <[email protected]> wrote:
> 
>> 
>>    [
>> https://issues.apache.org/jira/browse/MAHOUT-1546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13989819#comment-13989819]
>> 
>> Pat Ferrel commented on MAHOUT-1546:
>> ------------------------------------
>> 
>> Scala works fine without SCALA_HOME. That doesn't apply here.
>> MAHOUT_HOME is set otherwise the fix wouldn't have worked.
>> 
>> Running "mahout -spark classpath" from bash returns nothing. In the bash
>> script it looks like SPARK_HOME is checked, but I haven't set it and Spark
>> itself seems to run fine without it. In fact the Spark install doesn't ask
>> you to set it. In any case, setting it seems to fix the problem.
>> 
>> Do we have a wiki page that describes setup for Spark? Maybe I'm a good
>> guinea pig to write or edit it.
>> 
>>> building spark context fails due to incorrect classpath query
>>> -------------------------------------------------------------
>>> 
>>>                Key: MAHOUT-1546
>>>                URL: https://issues.apache.org/jira/browse/MAHOUT-1546
>>>            Project: Mahout
>>>         Issue Type: Bug
>>>        Environment: Spark running locally
>>>           Reporter: Pat Ferrel
>>>           Assignee: Dmitriy Lyubimov
>>>           Priority: Critical
>>> 
>>> The classpath retrieval uses a "-spark" flag that returns nothing; the
>>> default "mahout classpath" seems to get all the needed jar paths, so
>>> commenting out the "-spark" makes it work for me. Not sure this is the
>>> best fix though.
>>> This is in def mahoutSparkContext(...)
>>> {code}
>>>        //val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
>>>        val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
>>> {code}
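
A sketch of how the result of that exec could be checked, so an empty
classpath is reported as an error rather than silently accepted (illustrative
only; it reuses the "-spark" flag from the snippet above, and the exec value
is an assumption about the surrounding code):

{code}
// Sketch: run the classpath query and fail loudly if it returns nothing,
// instead of constructing a Spark context with an empty classpath.
// (exec stands for the bin/mahout launcher; this definition is an assumption.)
val exec = new java.io.File(sys.env("MAHOUT_HOME"), "bin/mahout")
val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
p.waitFor()
val classpath = scala.io.Source.fromInputStream(p.getInputStream).mkString.trim
if (classpath.isEmpty)
  sys.error("`mahout -spark classpath` returned nothing; is SPARK_HOME set?")
{code}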
>> 
>> 
>> 
>> --
>> This message was sent by Atlassian JIRA
>> (v6.2#6252)
>> 
> 
> 
