From your description, it seems the error is similar to the one in the
description of HADOOP-13866, where netty-all-4.0.2?.Final.jar (from hdfs or
Spark) appears ahead of the 4.1.1.Final jar (used by hbase) on the classpath.

In the pom.xml of Spark 1.6, you can see:
        <artifactId>netty-all</artifactId>
        <version>4.0.29.Final</version>

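If Spark's older netty-all is indeed shadowing HBase's, one common workaround is to exclude it from the Spark dependency and pin the 4.1.x version explicitly in your application's pom.xml. A sketch (the spark-core coordinates below are assumed; adjust artifactId/version to your build):

```xml
<!-- Sketch: exclude the netty-all 4.0.x pulled in transitively by Spark -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.6.3</version>
  <exclusions>
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Pin the 4.1.x netty-all that hbase expects -->
<dependency>
  <groupId>io.netty</groupId>
  <artifactId>netty-all</artifactId>
  <version>4.1.1.Final</version>
</dependency>
```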
Please check the classpath and confirm.
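For example, you can split the classpath and look at the order in which the netty jars appear (a sketch; the sample classpath below is made up, so substitute the real output of `hbase classpath` or your Spark driver/executor classpath):

```shell
# Sketch: split a classpath string on ':' and list netty jars in order.
# The jar that appears FIRST is the one the JVM will load classes from.
CP="/opt/spark/lib/netty-all-4.0.29.Final.jar:/opt/hbase/lib/netty-all-4.1.1.Final.jar"
echo "$CP" | tr ':' '\n' | grep -i netty
```

If a netty-all-4.0.x jar is printed before the 4.1.x one, that ordering would explain the missing-method error.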

Cheers

On Sat, Dec 10, 2016 at 3:08 AM, Yiannis Gkoufas <johngou...@gmail.com>
wrote:

> Hi there,
>
> I have been trying to do a bulk load in HBase 2.0.0-SNAPSHOT using Spark
> 1.6.3.
> I followed the example in BulkLoadSuite; the test works OK, but I cannot
> make it work on my cluster.
> The error I have been getting seems to come from a netty dependency
> conflict, because I get:
>
> no such method: ByteBuf.retainedDuplicate
>
> Would really appreciate any hints.
>
> Thanks!
>
