Pat Ferrel created MAHOUT-1546:
----------------------------------

             Summary: building spark context fails due to incorrect classpath query
                 Key: MAHOUT-1546
                 URL: https://issues.apache.org/jira/browse/MAHOUT-1546
             Project: Mahout
          Issue Type: Bug
         Environment: Spark running locally
            Reporter: Pat Ferrel
            Assignee: Dmitriy Lyubimov
            Priority: Critical


The classpath retrieval uses a "-spark" flag that returns nothing; the default "mahout classpath" query seems to return all the needed jar paths, so commenting out the "-spark" flag makes it work for me. Not sure this is the best fix, though.

This is in def mahoutSparkContext(...)

{code:scala}
    //val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
    val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
{code}
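
For context, here is a minimal sketch of how the classpath query output might be consumed. The names (mahoutClasspathJars, mahoutHome) are illustrative only and not the actual Mahout code: run the default "mahout classpath" command, read its stdout, split on the path separator, and keep the jar entries.

{code:scala}
import java.io.File
import scala.io.Source

// Illustrative helper, not the Mahout implementation: query the classpath
// from the mahout script and return the jar paths it reports.
def mahoutClasspathJars(mahoutHome: String): Seq[String] = {
  val exec = new File(mahoutHome, "bin/mahout")
  // Default "mahout classpath" query; the "-spark" variant currently returns nothing.
  val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
  p.waitFor()
  val classpath = Source.fromInputStream(p.getInputStream).mkString.trim
  // Split the colon-separated classpath and keep only the jar entries,
  // which could then be handed to SparkConf.setJars when building the context.
  classpath.split(File.pathSeparator).filter(_.endsWith(".jar")).toSeq
}
{code}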


