hm. seems fine to me on trunk.


On Mon, Apr 28, 2014 at 2:12 PM, Dmitriy Lyubimov <[email protected]> wrote:

> I think the Mahout classpath includes artifacts from the local Maven repo
> once Mahout has been compiled, which is why one needs to actually compile
> Mahout to pull all dependencies in. Not doing so relies on the classpath
> taking jars from the Mahout assembly, but I did not make the corresponding
> classpath adjustments for the Mahout assembly.
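
A minimal sketch of the two classpath sources described above, under loudly stated assumptions: the directory names and the ClasspathSources object are illustrative only and are not the actual bin/mahout script logic. It simply prefers jars produced by a local Maven build and falls back to an assembly directory.

{code}
// Hypothetical sketch, not the actual bin/mahout logic; directory names are assumptions.
import java.io.File

object ClasspathSources {

  // List the jar files directly under a directory (empty if it does not exist).
  def jarsUnder(dir: File): Seq[File] =
    Option(dir.listFiles()).getOrElse(Array.empty[File])
      .filter(_.getName.endsWith(".jar")).toSeq

  def main(args: Array[String]): Unit = {
    val mahoutHome = new File(sys.env.getOrElse("MAHOUT_HOME", "."))

    // Jars produced by `mvn install -DskipTests` end up under each module's target/.
    val compiled = jarsUnder(new File(mahoutHome, "spark/target")) ++
                   jarsUnder(new File(mahoutHome, "math-scala/target"))

    // Fallback: jars bundled by a pre-built assembly (illustrative path).
    val assembly = jarsUnder(new File(mahoutHome, "assembly/target/dependency"))

    // Prefer the freshly compiled jars; otherwise rely on the assembly jars.
    val entries = if (compiled.nonEmpty) compiled else assembly
    println(entries.map(_.getAbsolutePath).mkString(File.pathSeparator))
  }
}
{code}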
>
>
>
>
> On Mon, Apr 28, 2014 at 2:09 PM, Dmitriy Lyubimov <[email protected]>wrote:
>
>> This has to be a problem with computing the Mahout classpath. Make sure
>> you actually compile Mahout with mvn install -DskipTests. Maybe the script
>> got destabilized. Let me check the trunk really quick.
>>
>>
>> On Mon, Apr 28, 2014 at 2:07 PM, Sebastian Schelter (JIRA) <
>> [email protected]> wrote:
>>
>>>
>>>     [
>>> https://issues.apache.org/jira/browse/MAHOUT-1489?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13983538#comment-13983538]
>>>
>>> Sebastian Schelter commented on MAHOUT-1489:
>>> --------------------------------------------
>>>
>>> I'm also running Ubuntu 12 LTS. I'm getting a NoClassDefFoundError:
>>>
>>> {code}
>>> java.lang.NoClassDefFoundError: org/apache/mahout/common/IOUtils
>>>         at org.apache.mahout.sparkbindings.package$.mahoutSparkContext(package.scala:131)
>>>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.createSparkContext(MahoutSparkILoop.scala:44)
>>>         at $iwC$$iwC.<init>(<console>:8)
>>>         at $iwC.<init>(<console>:14)
>>>         at <init>(<console>:16)
>>>         at .<init>(<console>:20)
>>>         at .<clinit>(<console>)
>>>         at .<init>(<console>:7)
>>>         at .<clinit>(<console>)
>>>         at $print(<console>)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>         at java.lang.reflect.Method.invoke(Method.java:606)
>>>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:772)
>>>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1040)
>>>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:609)
>>>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:640)
>>>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:604)
>>>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:793)
>>>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:838)
>>>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:750)
>>>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:119)
>>>         at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:118)
>>>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:258)
>>>         at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:118)
>>>         at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:53)
>>>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:908)
>>>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:140)
>>>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:53)
>>>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:102)
>>>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.postInitialization(MahoutSparkILoop.scala:20)
>>>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:925)
>>>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
>>>         at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:881)
>>>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:881)
>>>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:973)
>>>         at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:14)
>>>         at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
>>> Caused by: java.lang.ClassNotFoundException: org.apache.mahout.common.IOUtils
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>>>         ... 40 more
>>> {code}
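
As a quick way to see whether the class named in the trace is visible at all, and if so which jar provides it, one could run a small check on the same classpath the shell uses. This is a diagnostic sketch using only standard JVM class-loader APIs; the FindProvider object is hypothetical and not part of Mahout.

{code}
// Diagnostic sketch (not part of Mahout): report which classpath entry, if any,
// provides the class that triggers the NoClassDefFoundError above.
object FindProvider {
  def main(args: Array[String]): Unit = {
    val resource = "org/apache/mahout/common/IOUtils.class"
    Option(getClass.getClassLoader.getResource(resource)) match {
      case Some(url) => println(s"IOUtils is provided by: $url")
      case None      => println("IOUtils is not on the classpath -- the Mahout " +
                                "core jar was probably not built or not picked up.")
    }
  }
}
{code}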
>>>
>>> > Interactive Scala & Spark Bindings Shell & Script processor
>>> > -----------------------------------------------------------
>>> >
>>> >                 Key: MAHOUT-1489
>>> >                 URL: https://issues.apache.org/jira/browse/MAHOUT-1489
>>> >             Project: Mahout
>>> >          Issue Type: New Feature
>>> >    Affects Versions: 1.0
>>> >            Reporter: Saikat Kanjilal
>>> >            Assignee: Dmitriy Lyubimov
>>> >             Fix For: 1.0
>>> >
>>> >         Attachments: MAHOUT-1489.patch, MAHOUT-1489.patch.1,
>>> mahout-spark-shell-running-standalone.png
>>> >
>>> >
>>> > Build an interactive shell / script processor (just like the Spark
>>> shell), something very similar to R's interactive / script-runner mode.
>>>
>>>
>>>
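
The issue description above asks for a Spark-shell-like, R-like interactive environment. A sketch of the kind of session this could enable is below, under the assumption that the shell imports the R-like DSL of the Mahout Scala/Spark bindings (dense, drmParallelize, t, %*%, collect); the session is illustrative, not a verified transcript.

{code}
// Illustrative shell session (assumes the Mahout Scala bindings DSL is in scope).
val a = drmParallelize(dense((1, 2, 3), (3, 4, 5)))   // distributed row matrix
val ata = a.t %*% a                                   // transpose-times-self on the cluster
val inCore = ata.collect                              // bring the small result back in core
println(inCore)
{code}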
>>
>>
>
