Just spent the last 5 minutes cutting and pasting the tutorial on HEAD with 1.0.1
in local mode. Everything works without a problem in local mode. What was
used for the MASTER setting when this problem occurred?
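For context, a minimal sketch of how the Spark master is typically selected when launching the Mahout Spark shell of that era: the MASTER environment variable distinguishes local mode (everything in one JVM, so classpath problems can stay hidden) from a standalone cluster (where Mahout jars must actually ship to the workers). The host and port below are placeholders, not values from this thread.

```shell
# Local mode (what worked above): driver and executors share one JVM,
# so locally-resolved jars mask classpath/shipping problems.
export MASTER=local[2]
mahout spark-shell

# Standalone cluster: jars must be shipped to the worker backends,
# which is where a broken context creation would surface.
export MASTER=spark://master-host:7077   # placeholder host/port
mahout spark-shell
```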


On Thu, Aug 14, 2014 at 11:29 AM, Dmitriy Lyubimov <[email protected]>
wrote:

> For the same reason it may have broken Mahout context creation, so that
> Mahout jars are now not shipped to the backend properly.
>
>
> If the sole purpose of the exercise is to get the tutorial working, I'd
> suggest just rolling back to the commit level before Anand's change and the Spark
> 0.9.1 dependency; I am pretty sure it should work then. E.g. this one
> should be the last good commit (it requires Spark 0.9.1):
>
> commit 7a50a291b4598e9809f9acf609b92175ce7f953b
> Author: Dmitriy Lyubimov <[email protected]>
> Date:   Wed Aug 6 12:30:51 2014 -0700
>
>     MAHOUT-1597: A + 1.0 (fixes)
>
>
> (use
>
> git reset --hard 7a50a291
>
> to sync to this one)
>
>
>
> On Thu, Aug 14, 2014 at 11:20 AM, Dmitriy Lyubimov <[email protected]>
> wrote:
>
>> Not sure either at this point. I guess the PR from Anand renaming artifacts
>> created classpath problems, but somehow it did not manifest in
>> my local tests since my Maven repo holds the old artifacts as well.
>>
>>
>> On Thu, Aug 14, 2014 at 9:55 AM, Pat Ferrel <[email protected]>
>> wrote:
>>
>>> There are two problems here:
>>>
>>> 1) a bug in the mahout script. Just pushed your fix, thx. The jars got
>>> renamed, it seems.
>>>
>>> 2) not sure what’s happening with the array serializer — maybe Dmitriy
>>> has an idea?
>>>
>>>
>>> On Aug 14, 2014, at 8:13 AM, Andrea Abelli <[email protected]>
>>> wrote:
>>>
>>> Hi again,
>>>
>>> New version of Spark, new stack trace:
>>> http://pastebin.com/KPNZ3rYQ
>>>
>>> I'm going to have a look at it tomorrow.
>>>
>>> Good evening
>>> Andrea
>>>
>>>
>>
>
