[ 
https://issues.apache.org/jira/browse/MAHOUT-1489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13965062#comment-13965062
 ] 

Saikat Kanjilal commented on MAHOUT-1489:
-----------------------------------------

Here's what I see when I run the unit tests:

[INFO] --- scalatest-maven-plugin:1.0-M2:test (test) @ mahout-shell ---
WARNING: -p has been deprecated and will be reused for a different (but still very cool) purpose in ScalaTest 2.0. Please change all uses of -p to -R.
Discovery starting.
Discovery completed in 632 milliseconds.
Run starting. Expected test count is: 9
DiscoverySuite:
MahoutShellSuite:
2014-04-09 23:35:51.462 java[583:b07] Unable to load realm info from SCDynamicStore

ls: /Users/skanjila/code/java/mahout-scala-spark-shell/shell/assembly/target/scala-2.10.3/spark-assembly*hadoop*.jar: No such file or directory
(the "ls" error above appears 10 times in the log, once per launch attempt)
0 [sparkMaster-akka.actor.default-dispatcher-4] ERROR org.apache.spark.deploy.master.Master  - Application Spark shell with ID app-20140409233702-0000 failed 10 times, removing it
21 [spark-akka.actor.default-dispatcher-2] ERROR org.apache.spark.deploy.client.AppClient$ClientActor  - Master removed our application: FAILED; stopping client


My questions:
1) Given that I have repurposed the spark-repl unit tests, I am wondering whether we (i.e., the Mahout Spark shell) should carry the same requirements as spark-repl: a data store, plus the spark-assembly hadoop jar files, which essentially means a local Spark install/build is needed.
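
On that point, one way the launcher could fail fast, instead of emitting the repeated "ls: ... No such file or directory" errors in the log above, is to guard the jar lookup before starting the app. This is only a sketch; the function name, error text, and hint are my assumptions, not the actual script:

```shell
#!/usr/bin/env bash
# Sketch only: a guarded lookup for the spark-assembly jar, so the launcher
# fails once with a clear message rather than retrying 10 times. The
# function name and the hint wording are assumptions for illustration.

find_spark_assembly() {
  # $1 is a glob such as ".../spark-assembly*hadoop*.jar"; it is left
  # unquoted below on purpose so the shell expands it.
  matches=$(ls $1 2>/dev/null)
  if [ -z "$matches" ]; then
    echo "ERROR: no spark-assembly jar matches: $1" >&2
    echo "Hint: build Spark locally first, or point the script at an existing Spark install." >&2
    return 1
  fi
  # If several assemblies are present, use the first match.
  printf '%s\n' "$matches" | head -n 1
}
```

With a guard like this the master would never see the doomed application, so the "failed 10 times, removing it" sequence would be replaced by a single actionable error.
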
2) When I run the unit tests now I see something like this: 
- propagation of local properties
- simple foreach with accumulator
- external vars
- external classes
- external functions
- external functions that access vars
- broadcast vars
- interacting with files

I'm assuming this means it's bypassing all the unit tests; I'll investigate this further.

3) Earlier you mentioned the mahout.sh script. Should we merge the contents of that script with the one I have above and place the result in the bin sub-directory? More importantly, I need to understand how mahout.sh relates to computeClasspath.sh.
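
To make question 3 concrete, here is how I currently picture the split, purely as a sketch: computeClasspath.sh would do nothing but assemble the classpath, and bin/mahout.sh would source it and launch the shell. Every name below (the helper script's contract, the env var, the main class) is an assumption for discussion, not the real layout:

```shell
#!/usr/bin/env bash
# Hypothetical bin/mahout.sh: delegate classpath assembly to
# computeClasspath.sh, then launch the shell. All names here are
# assumptions for illustration.

MAHOUT_HOME="${MAHOUT_HOME:-.}"
CP_SCRIPT="$MAHOUT_HOME/bin/computeClasspath.sh"

if [ -f "$CP_SCRIPT" ]; then
  # The helper is expected to set CLASSPATH and nothing else.
  . "$CP_SCRIPT"
else
  # Fall back to whatever is already on CLASSPATH.
  CLASSPATH="${CLASSPATH:-}"
fi

# A real script would `exec java ...`; kept in a variable here so the
# command line is easy to inspect.
LAUNCH_CMD="java -cp $CLASSPATH org.apache.mahout.sparkbindings.shell.Main"
echo "$LAUNCH_CMD"
```

If that split is roughly right, merging my script into bin/mahout.sh and keeping computeClasspath.sh as the single place that knows about jar locations seems cleanest.
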

Eager to hear your thoughts so we can proceed quickly with the next steps :)

> Interactive Scala & Spark Bindings Shell & Script processor
> -----------------------------------------------------------
>
>                 Key: MAHOUT-1489
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1489
>             Project: Mahout
>          Issue Type: New Feature
>    Affects Versions: 1.0
>            Reporter: Saikat Kanjilal
>            Assignee: Dmitriy Lyubimov
>             Fix For: 1.0
>
>
> Build an interactive shell /scripting (just like spark shell). Something very 
> similar in R interactive/script runner mode.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
