[
https://issues.apache.org/jira/browse/MAHOUT-1546?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Pat Ferrel updated MAHOUT-1546:
-------------------------------
Description:
The classpath retrieval uses a "-spark" flag that returns nothing. The default
"mahout classpath" invocation seems to return all of the needed jar paths, so
commenting out "-spark" makes it work for me. Not sure this is the best fix, though.
This is in def mahoutSparkContext(...):
{code}
// original call: the "-spark" flag yields an empty classpath
//val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
// workaround: query the default classpath instead
val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
{code}
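For reference, a minimal sketch of how the output of the corrected "mahout classpath" call might be collected into jar paths before building the Spark context. The MAHOUT_HOME lookup, the Source-based stream reading, and the ".jar" filtering are illustrative assumptions only, not the actual body of mahoutSparkContext:
{code}
import scala.io.Source

// Illustrative only: locate the mahout launcher script (assumes MAHOUT_HOME is set).
val exec = new java.io.File(System.getenv("MAHOUT_HOME"), "bin/mahout")

// Run the default classpath query -- no "-spark" flag, per the workaround above.
val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
p.waitFor()

// Assume the script prints a single path-separated classpath line on stdout;
// keep only jar entries to hand to the Spark context as extra jars.
val classpath = Source.fromInputStream(p.getInputStream).mkString.trim
val jars = classpath.split(java.io.File.pathSeparator).filter(_.endsWith(".jar"))

jars.foreach(println)
{code}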
> building spark context fails due to incorrect classpath query
> -------------------------------------------------------------
>
> Key: MAHOUT-1546
> URL: https://issues.apache.org/jira/browse/MAHOUT-1546
> Project: Mahout
> Issue Type: Bug
> Environment: Spark running locally
> Reporter: Pat Ferrel
> Assignee: Dmitriy Lyubimov
> Priority: Critical
>
> The classpath retrieval uses a "-spark" flag that returns nothing. The
> default "mahout classpath" invocation seems to return all of the needed jar
> paths, so commenting out "-spark" makes it work for me. Not sure this is the
> best fix, though.
> This is in def mahoutSparkContext(...):
> {code}
> //val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
> val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
> {code}
--
This message was sent by Atlassian JIRA
(v6.2#6252)