[
https://issues.apache.org/jira/browse/MAHOUT-1643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14349655#comment-14349655
]
Andrew Palumbo edited comment on MAHOUT-1643 at 3/6/15 12:50 AM:
-----------------------------------------------------------------
Yeah, talking about the shell. Do we want to process CLI args for the Spark
configuration here, i.e.:
{code}
$ bin/mahout spark-shell -D:k=n
{code}
or should I just close this and we'll just go off of MAHOUT_OPTS?
was (Author: andrew_palumbo):
Yeah, talking about the shell. Do we want to process CLI args here, i.e.:
{code}
$ bin/mahout spark-shell -D:k=n
{code}
or should I just close this and we'll just go off of MAHOUT_OPTS?
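For reference, a minimal sketch of the "just parse the args array" route: collect the -D:k=v tokens, optionally fall back to the same tokens in MAHOUT_OPTS, and set them on the SparkConf before the shell starts. The helper names and the MAHOUT_OPTS handling below are illustrative assumptions, not existing Mahout code.
{code}
import org.apache.spark.SparkConf

// Hypothetical helper (not existing Mahout code):
// collect "-D:key=value" tokens from an argument list.
def sparkOptsFrom(args: Seq[String]): Map[String, String] =
  args.collect {
    case arg if arg.startsWith("-D:") && arg.contains("=") =>
      val Array(k, v) = arg.stripPrefix("-D:").split("=", 2)
      k -> v
  }.toMap

// Optional fallback: the same "-D:k=v" tokens taken from MAHOUT_OPTS
// (assumes MAHOUT_OPTS would carry them in that form).
def sparkOptsFromEnv: Map[String, String] =
  sys.env.get("MAHOUT_OPTS")
    .map(opts => sparkOptsFrom(opts.trim.split("\\s+")))
    .getOrElse(Map.empty)

// Apply env settings first so explicit CLI args win.
def buildConf(args: Array[String]): SparkConf = {
  val conf = new SparkConf()
  (sparkOptsFromEnv ++ sparkOptsFrom(args)).foreach { case (k, v) => conf.set(k, v) }
  conf
}
{code}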
> CLI arguments are not being processed in spark-shell
> ----------------------------------------------------
>
> Key: MAHOUT-1643
> URL: https://issues.apache.org/jira/browse/MAHOUT-1643
> Project: Mahout
> Issue Type: Bug
> Components: CLI, spark
> Affects Versions: 1.0
> Environment: spark spark-shell
> Reporter: Andrew Palumbo
> Labels: DSL, scala, spark, spark-shell
> Fix For: 1.0
>
>
> The CLI arguments are not being processed in spark-shell. Most importantly,
> the Spark options are not being passed to the Spark configuration via:
> {code}
> $ mahout spark-shell -D:k=n
> {code}
> The arguments are preserved through {code}$ bin/mahout{code}. There should
> be a relatively easy fix, either by using the MahoutOptionParser, Scopt, or by
> simply parsing the args array.
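If the Scopt route mentioned in the description were taken instead, a rough sketch could look like the following. The ShellConfig case class and the option names are hypothetical, and note that scopt's map option takes the form -D k1=v1,k2=v2 rather than the -D:k=v form shown above.
{code}
import scopt.OptionParser

object ShellArgSketch {
  // Hypothetical config holder, not an existing Mahout class.
  case class ShellConfig(sparkOpts: Map[String, String] = Map.empty)

  def main(args: Array[String]): Unit = {
    val parser = new OptionParser[ShellConfig]("mahout spark-shell") {
      // Parses -D k1=v1,k2=v2 into a Map; it would not accept the -D:k=v form as-is.
      opt[Map[String, String]]('D', "define")
        .valueName("k1=v1,k2=v2...")
        .action((kvs, c) => c.copy(sparkOpts = c.sparkOpts ++ kvs))
        .text("Spark configuration overrides")
    }

    parser.parse(args, ShellConfig()).foreach { config =>
      // config.sparkOpts would then be set on the SparkConf before starting the shell.
      config.sparkOpts.foreach { case (k, v) => println(s"spark conf: $k=$v") }
    }
  }
}
{code}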