Nope, not afaik.
MAHOUT_OPTS is the place to set that (if we are talking about the shell).
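Something like this should work, assuming a standard Mahout source checkout (the property name spark.akka.frameSize and the value 128 are just examples from this thread, not recommendations):

```shell
# Spark properties must be set before the context is created; export them
# via MAHOUT_OPTS so bin/mahout picks them up at launch time.
export MAHOUT_OPTS="-Dspark.akka.frameSize=128"
# then launch the shell from the Mahout checkout:
# bin/mahout spark-shell
```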

On Thu, Mar 5, 2015 at 3:50 PM, Andrew Palumbo (JIRA) <[email protected]> wrote:
>
>     [ 
> https://issues.apache.org/jira/browse/MAHOUT-1643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14349640#comment-14349640
>  ]
>
> Andrew Palumbo commented on MAHOUT-1643:
> ----------------------------------------
>
> I don't think you can change things like
> {code}spark.akka.frameSize{code} once the context has been launched, can you?
>
>> CLI arguments are not being processed in spark-shell
>> ----------------------------------------------------
>>
>>                 Key: MAHOUT-1643
>>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1643
>>             Project: Mahout
>>          Issue Type: Bug
>>          Components: CLI, spark
>>    Affects Versions: 1.0
>>         Environment: spark spark-shell
>>            Reporter: Andrew Palumbo
>>              Labels: DSL, scala, spark, spark-shell
>>             Fix For: 1.0
>>
>>
>> The CLI arguments are not being processed in spark-shell.  Most importantly, 
>> the Spark options are not being passed to the Spark configuration via:
>> {code}
>> $ mahout spark-shell -D:k=n
>> {code}
>> The arguments are preserved through {code}$ bin/mahout{code}. There should 
>> be a relatively easy fix, either by using the MahoutOptionParser, Scopt, or 
>> by simply parsing the args array.
>
>
>
> --
> This message was sent by Atlassian JIRA
> (v6.3.4#6332)
