Yes, asap.

To test this properly it has to run on a cluster, so I'm upgrading. When ready it
will just be a "mvn clean install" if you already have Spark 1.1.0 running.

I would have expected errors only in the CLI drivers, so if anyone else sees
runtime errors please let us know. Some errors are very hard to unit test since
the environment differs between local (unit test) and cluster execution.


On Oct 20, 2014, at 9:14 AM, Mahesh Balija <[email protected]> wrote:

Hi Pat,

Can you please give detailed steps for building Mahout against Spark 1.1.0?
I built against 1.1.0 but still had class-not-found errors, which is why I
reverted to Spark 1.0.2. Even on 1.0.2 the first few steps succeed, but I am
still facing issues running the Mahout spark-shell sample commands
(drmData throws some errors).

Best,
Mahesh.B.

On Mon, Oct 20, 2014 at 1:46 AM, peng <[email protected]> wrote:

> From my experience 1.1.0 is quite stable, plus it has some performance
> improvements that are totally worth the effort.
> 
> 
> On 10/19/2014 06:30 PM, Ted Dunning wrote:
> 
>> On Sun, Oct 19, 2014 at 1:49 PM, Pat Ferrel <[email protected]>
>> wrote:
>> 
>> Getting off the dubious Spark 1.0.1 version is turning out to be a bit of
>>> work. Does anyone object to upgrading our Spark dependency? I’m not sure
>>> if
>>> Mahout built for Spark 1.1.0 will run on 1.0.1 so it may mean upgrading
>>> your Spark cluster.
>>> 
>> 
>> It is going to have to happen sooner or later.
>> 
>> Sooner may actually be less total pain.
>> 
>> 
> 
