Re: Problem with jackson lib running on spark

2016-04-01 Thread Ted Yu
Thanks for sharing the workaround.

Perhaps send a PR to the Tranquility GitHub repo :-)

On Fri, Apr 1, 2016 at 12:50 PM, Marcelo Oikawa  wrote:

> Hi, list.
>
> Just to close the thread. Unfortunately, I didn't solve the jackson lib
> problem, but I found a workaround that works fine for me. Perhaps this helps
> someone else.
>
> The problem arose from this line, where I try to create the tranquilizer object
> (used to connect to Druid) using the *fromConfig* utility.
>
> tranquilizer = 
> DruidBeams.fromConfig(dataSourceConfig).buildTranquilizer(tranquilizerBuider);
>
> Instead, I built the tranquilizer object by object, as shown below:
>
> DruidDimensions dimensions = getDimensions(props);
> List<AggregatorFactory> aggregators = getAggregations(props);
>
> TimestampSpec timestampSpec = new TimestampSpec("timestamp", "auto", null);
> Timestamper<Map<String, Object>> timestamper = map -> new DateTime(map.get("timestamp"));
> DruidLocation druidLocation = DruidLocation.create("overlord", "druid:firehose:%s", dataSource);
> DruidRollup druidRollup = DruidRollup.create(dimensions, aggregators, QueryGranularity.ALL);
>
>
> ClusteredBeamTuning clusteredBeamTuning = ClusteredBeamTuning.builder()
>         .segmentGranularity(Granularity.FIVE_MINUTE)
>         .windowPeriod(new Period("PT60m"))
>         .partitions(1)
>         .replicants(1)
>         .build();
>
> tranquilizer = DruidBeams.builder(timestamper)
>.curator(buildCurator(props))
>.discoveryPath("/druid/discovery")
>.location(druidLocation)
>.timestampSpec(timestampSpec)
>.rollup(druidRollup)
>.tuning(clusteredBeamTuning)
>.buildTranquilizer();
>
> tranquilizer.start();
>
> That worked for me. Thank you Ted, Alonso and other users.
>
>
> On Thu, Mar 31, 2016 at 4:08 PM, Marcelo Oikawa <
> marcelo.oik...@webradar.com> wrote:
>
>>
>> Please exclude jackson-databind - that was where the AnnotationMap class
>>> comes from.
>>>
>>
>> I tried what you suggested but I'm getting the same error. It seems strange
>> because when I inspect the generated jar there is nothing related to
>> AnnotationMap, but there is a databind there.
>>
>>
>>
>>
>>>
>>> On Thu, Mar 31, 2016 at 11:37 AM, Marcelo Oikawa <
>>> marcelo.oik...@webradar.com> wrote:
>>>
 Hi, Alonso.

 As you can see, jackson-core is provided by several libraries; try to
> exclude it from spark-core, I think the older version is included within
> it.
>

 There is no more than one jackson-core provided by spark-core. There
 are jackson-core and jackson-core-asl, but those are different artifacts. BTW, I
 tried to exclude them but had no success. Same error:

 java.lang.IllegalAccessError: tried to access method
 com.fasterxml.jackson.databind.introspect.AnnotatedMember.getAllAnnotations()Lcom/fasterxml/jackson/databind/introspect/AnnotationMap;
 from class
 com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector
 at
 com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector.findInjectableValueId(GuiceAnnotationIntrospector.java:39)
 at
 com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findInjectableValueId(AnnotationIntrospectorPair.java:269)
 at
 com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._addDeserializerConstructors(BasicDeserializerFactory.java:433)
 ...

 I guess the problem is an incompatibility between the jackson artifacts that
 come from the tranquility dependency and those provided by spark, but I also
 tried to find the same jackson artifacts in different versions and there are
 none. What is missing?


 Use this guide to see how to do it:
>
>
> https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html
>
>
>
> Alonso Isidoro Roman.
>
> Mis citas preferidas (de hoy) :
> "Si depurar es el proceso de quitar los errores de software, entonces
> programar debe ser el proceso de introducirlos..."
>  -  Edsger Dijkstra
>
> My favorite quotes (today):
> "If debugging is the process of removing software bugs, then
> programming must be the process of putting ..."
>   - Edsger Dijkstra
>
> "If you pay peanuts you get monkeys"
>
>
> 2016-03-31 20:01 GMT+02:00 Marcelo Oikawa  >:
>
>> Hey, Alonso.
>>
>> here is the output:
>>
>> [INFO] spark-processor:spark-processor-druid:jar:1.0-SNAPSHOT
>> [INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.6.1:provided
>> [INFO] |  +- org.apache.spark:spark-core_2.10:jar:1.6.1:provided
>> [INFO] |  |  +- org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:provid

Re: Problem with jackson lib running on spark

2016-04-01 Thread Marcelo Oikawa
Hi, list.

Just to close the thread. Unfortunately, I didn't solve the jackson lib
problem, but I found a workaround that works fine for me. Perhaps this helps
someone else.

The problem arose from this line, where I try to create the tranquilizer object
(used to connect to Druid) using the *fromConfig* utility.

tranquilizer = 
DruidBeams.fromConfig(dataSourceConfig).buildTranquilizer(tranquilizerBuider);

Instead, I built the tranquilizer object by object, as shown below:

DruidDimensions dimensions = getDimensions(props);
List<AggregatorFactory> aggregators = getAggregations(props);

TimestampSpec timestampSpec = new TimestampSpec("timestamp", "auto", null);
Timestamper<Map<String, Object>> timestamper = map -> new DateTime(map.get("timestamp"));
DruidLocation druidLocation = DruidLocation.create("overlord", "druid:firehose:%s", dataSource);
DruidRollup druidRollup = DruidRollup.create(dimensions, aggregators, QueryGranularity.ALL);


ClusteredBeamTuning clusteredBeamTuning = ClusteredBeamTuning.builder()
        .segmentGranularity(Granularity.FIVE_MINUTE)
        .windowPeriod(new Period("PT60m"))
        .partitions(1)
        .replicants(1)
        .build();

tranquilizer = DruidBeams.builder(timestamper)
   .curator(buildCurator(props))
   .discoveryPath("/druid/discovery")
   .location(druidLocation)
   .timestampSpec(timestampSpec)
   .rollup(druidRollup)
   .tuning(clusteredBeamTuning)
   .buildTranquilizer();

tranquilizer.start();

That worked for me. Thank you Ted, Alonso and other users.


On Thu, Mar 31, 2016 at 4:08 PM, Marcelo Oikawa  wrote:

>
> Please exclude jackson-databind - that was where the AnnotationMap class
>> comes from.
>>
>
> I tried what you suggested but I'm getting the same error. It seems strange because
> when I inspect the generated jar there is nothing related to AnnotationMap, but
> there is a databind there.
>
>
>
>
>>
>> On Thu, Mar 31, 2016 at 11:37 AM, Marcelo Oikawa <
>> marcelo.oik...@webradar.com> wrote:
>>
>>> Hi, Alonso.
>>>
>>> As you can see, jackson-core is provided by several libraries; try to
 exclude it from spark-core, I think the older version is included within
 it.

>>>
>>> There is no more than one jackson-core provided by spark-core. There are
>>> jackson-core and jackson-core-asl, but those are different artifacts. BTW, I
>>> tried to exclude them but had no success. Same error:
>>>
>>> java.lang.IllegalAccessError: tried to access method
>>> com.fasterxml.jackson.databind.introspect.AnnotatedMember.getAllAnnotations()Lcom/fasterxml/jackson/databind/introspect/AnnotationMap;
>>> from class
>>> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector
>>> at
>>> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector.findInjectableValueId(GuiceAnnotationIntrospector.java:39)
>>> at
>>> com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findInjectableValueId(AnnotationIntrospectorPair.java:269)
>>> at
>>> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._addDeserializerConstructors(BasicDeserializerFactory.java:433)
>>> ...
>>>
>>> I guess the problem is an incompatibility between the jackson artifacts that
>>> come from the tranquility dependency and those provided by spark, but I also
>>> tried to find the same jackson artifacts in different versions and there are
>>> none. What is missing?
>>>
>>>
>>> Use this guide to see how to do it:


 https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html



 Alonso Isidoro Roman.

 Mis citas preferidas (de hoy) :
 "Si depurar es el proceso de quitar los errores de software, entonces
 programar debe ser el proceso de introducirlos..."
  -  Edsger Dijkstra

 My favorite quotes (today):
 "If debugging is the process of removing software bugs, then
 programming must be the process of putting ..."
   - Edsger Dijkstra

 "If you pay peanuts you get monkeys"


 2016-03-31 20:01 GMT+02:00 Marcelo Oikawa 
 :

> Hey, Alonso.
>
> here is the output:
>
> [INFO] spark-processor:spark-processor-druid:jar:1.0-SNAPSHOT
> [INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.6.1:provided
> [INFO] |  +- org.apache.spark:spark-core_2.10:jar:1.6.1:provided
> [INFO] |  |  +- org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:provided
> [INFO] |  |  |  +- org.apache.avro:avro-ipc:jar:1.7.7:provided
> [INFO] |  |  |  \- org.apache.avro:avro-ipc:jar:tests:1.7.7:provided
> [INFO] |  |  +- com.twitter:chill_2.10:jar:0.5.0:provided
> [INFO] |  |  |  \- com.esotericsoftware.kryo:kryo:jar:2.21:provided
> [INFO] |  |  | +-
> com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:provided
> [

Re: Problem with jackson lib running on spark

2016-03-31 Thread Marcelo Oikawa
> Please exclude jackson-databind - that was where the AnnotationMap class
> comes from.
>

I tried what you suggested but I'm getting the same error. It seems strange because
when I inspect the generated jar there is nothing related to AnnotationMap, but
there is a databind there.




>
> On Thu, Mar 31, 2016 at 11:37 AM, Marcelo Oikawa <
> marcelo.oik...@webradar.com> wrote:
>
>> Hi, Alonso.
>>
>> As you can see, jackson-core is provided by several libraries; try to
>>> exclude it from spark-core, I think the older version is included within
>>> it.
>>>
>>
>> There is no more than one jackson-core provided by spark-core. There are
>> jackson-core and jackson-core-asl, but those are different artifacts. BTW, I
>> tried to exclude them but had no success. Same error:
>>
>> java.lang.IllegalAccessError: tried to access method
>> com.fasterxml.jackson.databind.introspect.AnnotatedMember.getAllAnnotations()Lcom/fasterxml/jackson/databind/introspect/AnnotationMap;
>> from class
>> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector
>> at
>> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector.findInjectableValueId(GuiceAnnotationIntrospector.java:39)
>> at
>> com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findInjectableValueId(AnnotationIntrospectorPair.java:269)
>> at
>> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._addDeserializerConstructors(BasicDeserializerFactory.java:433)
>> ...
>>
>> I guess the problem is an incompatibility between the jackson artifacts that
>> come from the tranquility dependency and those provided by spark, but I also
>> tried to find the same jackson artifacts in different versions and there are
>> none. What is missing?
>>
>>
>> Use this guide to see how to do it:
>>>
>>>
>>> https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html
>>>
>>>
>>>
>>> Alonso Isidoro Roman.
>>>
>>> Mis citas preferidas (de hoy) :
>>> "Si depurar es el proceso de quitar los errores de software, entonces
>>> programar debe ser el proceso de introducirlos..."
>>>  -  Edsger Dijkstra
>>>
>>> My favorite quotes (today):
>>> "If debugging is the process of removing software bugs, then programming
>>> must be the process of putting ..."
>>>   - Edsger Dijkstra
>>>
>>> "If you pay peanuts you get monkeys"
>>>
>>>
>>> 2016-03-31 20:01 GMT+02:00 Marcelo Oikawa :
>>>
 Hey, Alonso.

 here is the output:

 [INFO] spark-processor:spark-processor-druid:jar:1.0-SNAPSHOT
 [INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.6.1:provided
 [INFO] |  +- org.apache.spark:spark-core_2.10:jar:1.6.1:provided
 [INFO] |  |  +- org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:provided
 [INFO] |  |  |  +- org.apache.avro:avro-ipc:jar:1.7.7:provided
 [INFO] |  |  |  \- org.apache.avro:avro-ipc:jar:tests:1.7.7:provided
 [INFO] |  |  +- com.twitter:chill_2.10:jar:0.5.0:provided
 [INFO] |  |  |  \- com.esotericsoftware.kryo:kryo:jar:2.21:provided
 [INFO] |  |  | +-
 com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:provided
 [INFO] |  |  | +-
 com.esotericsoftware.minlog:minlog:jar:1.2:provided
 [INFO] |  |  | \- org.objenesis:objenesis:jar:1.2:provided
 [INFO] |  |  +- com.twitter:chill-java:jar:0.5.0:provided
 [INFO] |  |  +- org.apache.xbean:xbean-asm5-shaded:jar:4.4:provided
 [INFO] |  |  +- org.apache.hadoop:hadoop-client:jar:2.2.0:provided
 [INFO] |  |  |  +- org.apache.hadoop:hadoop-common:jar:2.2.0:provided
 [INFO] |  |  |  |  +- org.apache.commons:commons-math:jar:2.1:provided
 [INFO] |  |  |  |  +- xmlenc:xmlenc:jar:0.52:provided
 [INFO] |  |  |  |  +-
 commons-configuration:commons-configuration:jar:1.6:provided
 [INFO] |  |  |  |  |  +-
 commons-digester:commons-digester:jar:1.8:provided
 [INFO] |  |  |  |  |  |  \-
 commons-beanutils:commons-beanutils:jar:1.7.0:provided
 [INFO] |  |  |  |  |  \-
 commons-beanutils:commons-beanutils-core:jar:1.8.0:provided
 [INFO] |  |  |  |  \- org.apache.hadoop:hadoop-auth:jar:2.2.0:provided
 [INFO] |  |  |  +- org.apache.hadoop:hadoop-hdfs:jar:2.2.0:provided
 [INFO] |  |  |  |  \- org.mortbay.jetty:jetty-util:jar:6.1.26:provided
 [INFO] |  |  |  +-
 org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.2.0:provided
 [INFO] |  |  |  |  +-
 org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.2.0:provided
 [INFO] |  |  |  |  |  +-
 org.apache.hadoop:hadoop-yarn-client:jar:2.2.0:provided
 [INFO] |  |  |  |  |  |  +-
 com.sun.jersey.jersey-test-framework:jersey-test-framework-grizzly2:jar:1.9:provided
 [INFO] |  |  |  |  |  |  |  +-
 com.sun.jersey.jersey-test-framework:jersey-test-framework-core:jar:1.9:provided
 [INFO] |  |  |  |  |  |  |  |  \-
 com.sun.jersey:jersey-client:jar:1.9:provided
 [INFO] |  |  |  |  |  |  |  \-
 com.sun.jersey:jersey-grizzly2:jar:1.9:provided
 [INFO] |  |  |  |

Re: Problem with jackson lib running on spark

2016-03-31 Thread Ted Yu
Please exclude jackson-databind - that was where the AnnotationMap class
comes from.
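
For reference, a minimal sketch of what that exclusion could look like in the
POM, assuming for illustration that the extra jackson-databind is pulled in
through the tranquility-core_2.10 dependency from the original question
(adjust the parent dependency to wherever the duplicate actually comes from):

<dependency>
    <groupId>io.druid</groupId>
    <artifactId>tranquility-core_2.10</artifactId>
    <version>0.7.4</version>
    <exclusions>
        <exclusion>
            <!-- assumption: this copy of jackson-databind is the one clashing with Spark's -->
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
        </exclusion>
    </exclusions>
</dependency>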

On Thu, Mar 31, 2016 at 11:37 AM, Marcelo Oikawa <
marcelo.oik...@webradar.com> wrote:

> Hi, Alonso.
>
> As you can see, jackson-core is provided by several libraries; try to
>> exclude it from spark-core, I think the older version is included within
>> it.
>>
>
> There is no more than one jackson-core provided by spark-core. There are
> jackson-core and jackson-core-asl, but those are different artifacts. BTW, I
> tried to exclude them but had no success. Same error:
>
> java.lang.IllegalAccessError: tried to access method
> com.fasterxml.jackson.databind.introspect.AnnotatedMember.getAllAnnotations()Lcom/fasterxml/jackson/databind/introspect/AnnotationMap;
> from class
> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector
> at
> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector.findInjectableValueId(GuiceAnnotationIntrospector.java:39)
> at
> com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findInjectableValueId(AnnotationIntrospectorPair.java:269)
> at
> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._addDeserializerConstructors(BasicDeserializerFactory.java:433)
> ...
>
> I guess the problem is an incompatibility between the jackson artifacts that come
> from the tranquility dependency and those provided by spark, but I also tried to find
> the same jackson artifacts in different versions and there are none. What is
> missing?
>
>
> Use this guide to see how to do it:
>>
>>
>> https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html
>>
>>
>>
>> Alonso Isidoro Roman.
>>
>> Mis citas preferidas (de hoy) :
>> "Si depurar es el proceso de quitar los errores de software, entonces
>> programar debe ser el proceso de introducirlos..."
>>  -  Edsger Dijkstra
>>
>> My favorite quotes (today):
>> "If debugging is the process of removing software bugs, then programming
>> must be the process of putting ..."
>>   - Edsger Dijkstra
>>
>> "If you pay peanuts you get monkeys"
>>
>>
>> 2016-03-31 20:01 GMT+02:00 Marcelo Oikawa :
>>
>>> Hey, Alonso.
>>>
>>> here is the output:
>>>
>>> [INFO] spark-processor:spark-processor-druid:jar:1.0-SNAPSHOT
>>> [INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.6.1:provided
>>> [INFO] |  +- org.apache.spark:spark-core_2.10:jar:1.6.1:provided
>>> [INFO] |  |  +- org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:provided
>>> [INFO] |  |  |  +- org.apache.avro:avro-ipc:jar:1.7.7:provided
>>> [INFO] |  |  |  \- org.apache.avro:avro-ipc:jar:tests:1.7.7:provided
>>> [INFO] |  |  +- com.twitter:chill_2.10:jar:0.5.0:provided
>>> [INFO] |  |  |  \- com.esotericsoftware.kryo:kryo:jar:2.21:provided
>>> [INFO] |  |  | +-
>>> com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:provided
>>> [INFO] |  |  | +- com.esotericsoftware.minlog:minlog:jar:1.2:provided
>>> [INFO] |  |  | \- org.objenesis:objenesis:jar:1.2:provided
>>> [INFO] |  |  +- com.twitter:chill-java:jar:0.5.0:provided
>>> [INFO] |  |  +- org.apache.xbean:xbean-asm5-shaded:jar:4.4:provided
>>> [INFO] |  |  +- org.apache.hadoop:hadoop-client:jar:2.2.0:provided
>>> [INFO] |  |  |  +- org.apache.hadoop:hadoop-common:jar:2.2.0:provided
>>> [INFO] |  |  |  |  +- org.apache.commons:commons-math:jar:2.1:provided
>>> [INFO] |  |  |  |  +- xmlenc:xmlenc:jar:0.52:provided
>>> [INFO] |  |  |  |  +-
>>> commons-configuration:commons-configuration:jar:1.6:provided
>>> [INFO] |  |  |  |  |  +-
>>> commons-digester:commons-digester:jar:1.8:provided
>>> [INFO] |  |  |  |  |  |  \-
>>> commons-beanutils:commons-beanutils:jar:1.7.0:provided
>>> [INFO] |  |  |  |  |  \-
>>> commons-beanutils:commons-beanutils-core:jar:1.8.0:provided
>>> [INFO] |  |  |  |  \- org.apache.hadoop:hadoop-auth:jar:2.2.0:provided
>>> [INFO] |  |  |  +- org.apache.hadoop:hadoop-hdfs:jar:2.2.0:provided
>>> [INFO] |  |  |  |  \- org.mortbay.jetty:jetty-util:jar:6.1.26:provided
>>> [INFO] |  |  |  +-
>>> org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.2.0:provided
>>> [INFO] |  |  |  |  +-
>>> org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.2.0:provided
>>> [INFO] |  |  |  |  |  +-
>>> org.apache.hadoop:hadoop-yarn-client:jar:2.2.0:provided
>>> [INFO] |  |  |  |  |  |  +-
>>> com.sun.jersey.jersey-test-framework:jersey-test-framework-grizzly2:jar:1.9:provided
>>> [INFO] |  |  |  |  |  |  |  +-
>>> com.sun.jersey.jersey-test-framework:jersey-test-framework-core:jar:1.9:provided
>>> [INFO] |  |  |  |  |  |  |  |  \-
>>> com.sun.jersey:jersey-client:jar:1.9:provided
>>> [INFO] |  |  |  |  |  |  |  \-
>>> com.sun.jersey:jersey-grizzly2:jar:1.9:provided
>>> [INFO] |  |  |  |  |  |  | +-
>>> org.glassfish.grizzly:grizzly-http:jar:2.1.2:provided
>>> [INFO] |  |  |  |  |  |  | |  \-
>>> org.glassfish.grizzly:grizzly-framework:jar:2.1.2:provided
>>> [INFO] |  |  |  |  |  |  | | \-
>>> org.glassfish.gmbal:gmbal-api-only:jar:3.0.0-b023:provided
>>> [INFO] |  | 

Re: Problem with jackson lib running on spark

2016-03-31 Thread Marcelo Oikawa
Hi, Alonso.

As you can see, jackson-core is provided by several libraries; try to
> exclude it from spark-core, I think the older version is included within
> it.
>

There is no more than one jackson-core provided by spark-core. There are
jackson-core and jackson-core-asl, but those are different artifacts. BTW, I
tried to exclude them but had no success. Same error:

java.lang.IllegalAccessError: tried to access method
com.fasterxml.jackson.databind.introspect.AnnotatedMember.getAllAnnotations()Lcom/fasterxml/jackson/databind/introspect/AnnotationMap;
from class
com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector
at
com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector.findInjectableValueId(GuiceAnnotationIntrospector.java:39)
at
com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findInjectableValueId(AnnotationIntrospectorPair.java:269)
at
com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._addDeserializerConstructors(BasicDeserializerFactory.java:433)
...

I guess the problem is an incompatibility between the jackson artifacts that come
from the tranquility dependency and those provided by spark, but I also tried to find
the same jackson artifacts in different versions and there are none. What is
missing?


Use this guide to see how to do it:
>
>
> https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html
>
>
>
> Alonso Isidoro Roman.
>
> Mis citas preferidas (de hoy) :
> "Si depurar es el proceso de quitar los errores de software, entonces
> programar debe ser el proceso de introducirlos..."
>  -  Edsger Dijkstra
>
> My favorite quotes (today):
> "If debugging is the process of removing software bugs, then programming
> must be the process of putting ..."
>   - Edsger Dijkstra
>
> "If you pay peanuts you get monkeys"
>
>
> 2016-03-31 20:01 GMT+02:00 Marcelo Oikawa :
>
>> Hey, Alonso.
>>
>> here is the output:
>>
>> [INFO] spark-processor:spark-processor-druid:jar:1.0-SNAPSHOT
>> [INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.6.1:provided
>> [INFO] |  +- org.apache.spark:spark-core_2.10:jar:1.6.1:provided
>> [INFO] |  |  +- org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:provided
>> [INFO] |  |  |  +- org.apache.avro:avro-ipc:jar:1.7.7:provided
>> [INFO] |  |  |  \- org.apache.avro:avro-ipc:jar:tests:1.7.7:provided
>> [INFO] |  |  +- com.twitter:chill_2.10:jar:0.5.0:provided
>> [INFO] |  |  |  \- com.esotericsoftware.kryo:kryo:jar:2.21:provided
>> [INFO] |  |  | +-
>> com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:provided
>> [INFO] |  |  | +- com.esotericsoftware.minlog:minlog:jar:1.2:provided
>> [INFO] |  |  | \- org.objenesis:objenesis:jar:1.2:provided
>> [INFO] |  |  +- com.twitter:chill-java:jar:0.5.0:provided
>> [INFO] |  |  +- org.apache.xbean:xbean-asm5-shaded:jar:4.4:provided
>> [INFO] |  |  +- org.apache.hadoop:hadoop-client:jar:2.2.0:provided
>> [INFO] |  |  |  +- org.apache.hadoop:hadoop-common:jar:2.2.0:provided
>> [INFO] |  |  |  |  +- org.apache.commons:commons-math:jar:2.1:provided
>> [INFO] |  |  |  |  +- xmlenc:xmlenc:jar:0.52:provided
>> [INFO] |  |  |  |  +-
>> commons-configuration:commons-configuration:jar:1.6:provided
>> [INFO] |  |  |  |  |  +-
>> commons-digester:commons-digester:jar:1.8:provided
>> [INFO] |  |  |  |  |  |  \-
>> commons-beanutils:commons-beanutils:jar:1.7.0:provided
>> [INFO] |  |  |  |  |  \-
>> commons-beanutils:commons-beanutils-core:jar:1.8.0:provided
>> [INFO] |  |  |  |  \- org.apache.hadoop:hadoop-auth:jar:2.2.0:provided
>> [INFO] |  |  |  +- org.apache.hadoop:hadoop-hdfs:jar:2.2.0:provided
>> [INFO] |  |  |  |  \- org.mortbay.jetty:jetty-util:jar:6.1.26:provided
>> [INFO] |  |  |  +-
>> org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.2.0:provided
>> [INFO] |  |  |  |  +-
>> org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.2.0:provided
>> [INFO] |  |  |  |  |  +-
>> org.apache.hadoop:hadoop-yarn-client:jar:2.2.0:provided
>> [INFO] |  |  |  |  |  |  +-
>> com.sun.jersey.jersey-test-framework:jersey-test-framework-grizzly2:jar:1.9:provided
>> [INFO] |  |  |  |  |  |  |  +-
>> com.sun.jersey.jersey-test-framework:jersey-test-framework-core:jar:1.9:provided
>> [INFO] |  |  |  |  |  |  |  |  \-
>> com.sun.jersey:jersey-client:jar:1.9:provided
>> [INFO] |  |  |  |  |  |  |  \-
>> com.sun.jersey:jersey-grizzly2:jar:1.9:provided
>> [INFO] |  |  |  |  |  |  | +-
>> org.glassfish.grizzly:grizzly-http:jar:2.1.2:provided
>> [INFO] |  |  |  |  |  |  | |  \-
>> org.glassfish.grizzly:grizzly-framework:jar:2.1.2:provided
>> [INFO] |  |  |  |  |  |  | | \-
>> org.glassfish.gmbal:gmbal-api-only:jar:3.0.0-b023:provided
>> [INFO] |  |  |  |  |  |  | |\-
>> org.glassfish.external:management-api:jar:3.0.0-b012:provided
>> [INFO] |  |  |  |  |  |  | +-
>> org.glassfish.grizzly:grizzly-http-server:jar:2.1.2:provided
>> [INFO] |  |  |  |  |  |  | |  \-
>> org.glassfish.grizzly:grizzly-rcm:jar:2.1.2:provided
>> [INFO]

Re: Problem with jackson lib running on spark

2016-03-31 Thread Alonso Isidoro Roman
As you can see, jackson-core is provided by several libraries; try to
exclude it from spark-core, I think the older version is included within
it.

Use this guide to see how to do it:

https://maven.apache.org/guides/introduction/introduction-to-optional-and-excludes-dependencies.html
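
As a concrete (but unverified) sketch of that suggestion, the exclusion would
sit on whichever dependency drags spark-core in, e.g. the spark-streaming_2.10
dependency used in this project; whether jackson-core is really the artifact to
drop depends on the dependency tree quoted further down:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.1</version>
    <scope>provided</scope>
    <exclusions>
        <exclusion>
            <!-- assumption: drop the jackson-core that spark-core brings in transitively -->
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-core</artifactId>
        </exclusion>
    </exclusions>
</dependency>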



Alonso Isidoro Roman.

Mis citas preferidas (de hoy) :
"Si depurar es el proceso de quitar los errores de software, entonces
programar debe ser el proceso de introducirlos..."
 -  Edsger Dijkstra

My favorite quotes (today):
"If debugging is the process of removing software bugs, then programming
must be the process of putting ..."
  - Edsger Dijkstra

"If you pay peanuts you get monkeys"


2016-03-31 20:01 GMT+02:00 Marcelo Oikawa :

> Hey, Alonso.
>
> here is the output:
>
> [INFO] spark-processor:spark-processor-druid:jar:1.0-SNAPSHOT
> [INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.6.1:provided
> [INFO] |  +- org.apache.spark:spark-core_2.10:jar:1.6.1:provided
> [INFO] |  |  +- org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:provided
> [INFO] |  |  |  +- org.apache.avro:avro-ipc:jar:1.7.7:provided
> [INFO] |  |  |  \- org.apache.avro:avro-ipc:jar:tests:1.7.7:provided
> [INFO] |  |  +- com.twitter:chill_2.10:jar:0.5.0:provided
> [INFO] |  |  |  \- com.esotericsoftware.kryo:kryo:jar:2.21:provided
> [INFO] |  |  | +-
> com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:provided
> [INFO] |  |  | +- com.esotericsoftware.minlog:minlog:jar:1.2:provided
> [INFO] |  |  | \- org.objenesis:objenesis:jar:1.2:provided
> [INFO] |  |  +- com.twitter:chill-java:jar:0.5.0:provided
> [INFO] |  |  +- org.apache.xbean:xbean-asm5-shaded:jar:4.4:provided
> [INFO] |  |  +- org.apache.hadoop:hadoop-client:jar:2.2.0:provided
> [INFO] |  |  |  +- org.apache.hadoop:hadoop-common:jar:2.2.0:provided
> [INFO] |  |  |  |  +- org.apache.commons:commons-math:jar:2.1:provided
> [INFO] |  |  |  |  +- xmlenc:xmlenc:jar:0.52:provided
> [INFO] |  |  |  |  +-
> commons-configuration:commons-configuration:jar:1.6:provided
> [INFO] |  |  |  |  |  +- commons-digester:commons-digester:jar:1.8:provided
> [INFO] |  |  |  |  |  |  \-
> commons-beanutils:commons-beanutils:jar:1.7.0:provided
> [INFO] |  |  |  |  |  \-
> commons-beanutils:commons-beanutils-core:jar:1.8.0:provided
> [INFO] |  |  |  |  \- org.apache.hadoop:hadoop-auth:jar:2.2.0:provided
> [INFO] |  |  |  +- org.apache.hadoop:hadoop-hdfs:jar:2.2.0:provided
> [INFO] |  |  |  |  \- org.mortbay.jetty:jetty-util:jar:6.1.26:provided
> [INFO] |  |  |  +-
> org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.2.0:provided
> [INFO] |  |  |  |  +-
> org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.2.0:provided
> [INFO] |  |  |  |  |  +-
> org.apache.hadoop:hadoop-yarn-client:jar:2.2.0:provided
> [INFO] |  |  |  |  |  |  +-
> com.sun.jersey.jersey-test-framework:jersey-test-framework-grizzly2:jar:1.9:provided
> [INFO] |  |  |  |  |  |  |  +-
> com.sun.jersey.jersey-test-framework:jersey-test-framework-core:jar:1.9:provided
> [INFO] |  |  |  |  |  |  |  |  \-
> com.sun.jersey:jersey-client:jar:1.9:provided
> [INFO] |  |  |  |  |  |  |  \-
> com.sun.jersey:jersey-grizzly2:jar:1.9:provided
> [INFO] |  |  |  |  |  |  | +-
> org.glassfish.grizzly:grizzly-http:jar:2.1.2:provided
> [INFO] |  |  |  |  |  |  | |  \-
> org.glassfish.grizzly:grizzly-framework:jar:2.1.2:provided
> [INFO] |  |  |  |  |  |  | | \-
> org.glassfish.gmbal:gmbal-api-only:jar:3.0.0-b023:provided
> [INFO] |  |  |  |  |  |  | |\-
> org.glassfish.external:management-api:jar:3.0.0-b012:provided
> [INFO] |  |  |  |  |  |  | +-
> org.glassfish.grizzly:grizzly-http-server:jar:2.1.2:provided
> [INFO] |  |  |  |  |  |  | |  \-
> org.glassfish.grizzly:grizzly-rcm:jar:2.1.2:provided
> [INFO] |  |  |  |  |  |  | +-
> org.glassfish.grizzly:grizzly-http-servlet:jar:2.1.2:provided
> [INFO] |  |  |  |  |  |  | \-
> org.glassfish:javax.servlet:jar:3.1:provided
> [INFO] |  |  |  |  |  |  \- com.sun.jersey:jersey-json:jar:1.9:provided
> [INFO] |  |  |  |  |  | +-
> org.codehaus.jettison:jettison:jar:1.1:provided
> [INFO] |  |  |  |  |  | |  \- stax:stax-api:jar:1.0.1:provided
> [INFO] |  |  |  |  |  | +-
> com.sun.xml.bind:jaxb-impl:jar:2.2.3-1:provided
> [INFO] |  |  |  |  |  | |  \-
> javax.xml.bind:jaxb-api:jar:2.2.2:provided
> [INFO] |  |  |  |  |  | | \-
> javax.activation:activation:jar:1.1:provided
> [INFO] |  |  |  |  |  | +-
> org.codehaus.jackson:jackson-jaxrs:jar:1.8.3:provided
> [INFO] |  |  |  |  |  | \-
> org.codehaus.jackson:jackson-xc:jar:1.8.3:provided
> [INFO] |  |  |  |  |  \-
> org.apache.hadoop:hadoop-yarn-server-common:jar:2.2.0:provided
> [INFO] |  |  |  |  \-
> org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.2.0:provided
> [INFO] |  |  |  +- org.apache.hadoop:hadoop-yarn-api:jar:2.2.0:provided
> [INFO] |  |  |  +-
> org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.2.0:provided
> [INFO] |  |  |  |  \-
> org.apache.hadoop:hadoop-y

Re: Problem with jackson lib running on spark

2016-03-31 Thread Marcelo Oikawa
Hey, Alonso.

here is the output:

[INFO] spark-processor:spark-processor-druid:jar:1.0-SNAPSHOT
[INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.6.1:provided
[INFO] |  +- org.apache.spark:spark-core_2.10:jar:1.6.1:provided
[INFO] |  |  +- org.apache.avro:avro-mapred:jar:hadoop2:1.7.7:provided
[INFO] |  |  |  +- org.apache.avro:avro-ipc:jar:1.7.7:provided
[INFO] |  |  |  \- org.apache.avro:avro-ipc:jar:tests:1.7.7:provided
[INFO] |  |  +- com.twitter:chill_2.10:jar:0.5.0:provided
[INFO] |  |  |  \- com.esotericsoftware.kryo:kryo:jar:2.21:provided
[INFO] |  |  | +-
com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:provided
[INFO] |  |  | +- com.esotericsoftware.minlog:minlog:jar:1.2:provided
[INFO] |  |  | \- org.objenesis:objenesis:jar:1.2:provided
[INFO] |  |  +- com.twitter:chill-java:jar:0.5.0:provided
[INFO] |  |  +- org.apache.xbean:xbean-asm5-shaded:jar:4.4:provided
[INFO] |  |  +- org.apache.hadoop:hadoop-client:jar:2.2.0:provided
[INFO] |  |  |  +- org.apache.hadoop:hadoop-common:jar:2.2.0:provided
[INFO] |  |  |  |  +- org.apache.commons:commons-math:jar:2.1:provided
[INFO] |  |  |  |  +- xmlenc:xmlenc:jar:0.52:provided
[INFO] |  |  |  |  +-
commons-configuration:commons-configuration:jar:1.6:provided
[INFO] |  |  |  |  |  +- commons-digester:commons-digester:jar:1.8:provided
[INFO] |  |  |  |  |  |  \-
commons-beanutils:commons-beanutils:jar:1.7.0:provided
[INFO] |  |  |  |  |  \-
commons-beanutils:commons-beanutils-core:jar:1.8.0:provided
[INFO] |  |  |  |  \- org.apache.hadoop:hadoop-auth:jar:2.2.0:provided
[INFO] |  |  |  +- org.apache.hadoop:hadoop-hdfs:jar:2.2.0:provided
[INFO] |  |  |  |  \- org.mortbay.jetty:jetty-util:jar:6.1.26:provided
[INFO] |  |  |  +-
org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.2.0:provided
[INFO] |  |  |  |  +-
org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.2.0:provided
[INFO] |  |  |  |  |  +-
org.apache.hadoop:hadoop-yarn-client:jar:2.2.0:provided
[INFO] |  |  |  |  |  |  +-
com.sun.jersey.jersey-test-framework:jersey-test-framework-grizzly2:jar:1.9:provided
[INFO] |  |  |  |  |  |  |  +-
com.sun.jersey.jersey-test-framework:jersey-test-framework-core:jar:1.9:provided
[INFO] |  |  |  |  |  |  |  |  \-
com.sun.jersey:jersey-client:jar:1.9:provided
[INFO] |  |  |  |  |  |  |  \-
com.sun.jersey:jersey-grizzly2:jar:1.9:provided
[INFO] |  |  |  |  |  |  | +-
org.glassfish.grizzly:grizzly-http:jar:2.1.2:provided
[INFO] |  |  |  |  |  |  | |  \-
org.glassfish.grizzly:grizzly-framework:jar:2.1.2:provided
[INFO] |  |  |  |  |  |  | | \-
org.glassfish.gmbal:gmbal-api-only:jar:3.0.0-b023:provided
[INFO] |  |  |  |  |  |  | |\-
org.glassfish.external:management-api:jar:3.0.0-b012:provided
[INFO] |  |  |  |  |  |  | +-
org.glassfish.grizzly:grizzly-http-server:jar:2.1.2:provided
[INFO] |  |  |  |  |  |  | |  \-
org.glassfish.grizzly:grizzly-rcm:jar:2.1.2:provided
[INFO] |  |  |  |  |  |  | +-
org.glassfish.grizzly:grizzly-http-servlet:jar:2.1.2:provided
[INFO] |  |  |  |  |  |  | \-
org.glassfish:javax.servlet:jar:3.1:provided
[INFO] |  |  |  |  |  |  \- com.sun.jersey:jersey-json:jar:1.9:provided
[INFO] |  |  |  |  |  | +-
org.codehaus.jettison:jettison:jar:1.1:provided
[INFO] |  |  |  |  |  | |  \- stax:stax-api:jar:1.0.1:provided
[INFO] |  |  |  |  |  | +-
com.sun.xml.bind:jaxb-impl:jar:2.2.3-1:provided
[INFO] |  |  |  |  |  | |  \- javax.xml.bind:jaxb-api:jar:2.2.2:provided
[INFO] |  |  |  |  |  | | \-
javax.activation:activation:jar:1.1:provided
[INFO] |  |  |  |  |  | +-
org.codehaus.jackson:jackson-jaxrs:jar:1.8.3:provided
[INFO] |  |  |  |  |  | \-
org.codehaus.jackson:jackson-xc:jar:1.8.3:provided
[INFO] |  |  |  |  |  \-
org.apache.hadoop:hadoop-yarn-server-common:jar:2.2.0:provided
[INFO] |  |  |  |  \-
org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.2.0:provided
[INFO] |  |  |  +- org.apache.hadoop:hadoop-yarn-api:jar:2.2.0:provided
[INFO] |  |  |  +-
org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.2.0:provided
[INFO] |  |  |  |  \-
org.apache.hadoop:hadoop-yarn-common:jar:2.2.0:provided
[INFO] |  |  |  +-
org.apache.hadoop:hadoop-mapreduce-client-jobclient:jar:2.2.0:provided
[INFO] |  |  |  \- org.apache.hadoop:hadoop-annotations:jar:2.2.0:provided
[INFO] |  |  +- org.apache.spark:spark-launcher_2.10:jar:1.6.1:provided
[INFO] |  |  +-
org.apache.spark:spark-network-common_2.10:jar:1.6.1:provided
[INFO] |  |  +-
org.apache.spark:spark-network-shuffle_2.10:jar:1.6.1:provided
[INFO] |  |  |  \- org.fusesource.leveldbjni:leveldbjni-all:jar:1.8:provided
[INFO] |  |  +- org.apache.spark:spark-unsafe_2.10:jar:1.6.1:provided
[INFO] |  |  +- net.java.dev.jets3t:jets3t:jar:0.7.1:provided
[INFO] |  |  |  \- commons-httpclient:commons-httpclient:jar:3.1:provided
[INFO] |  |  +- org.apache.curator:curator-recipes:jar:2.4.0:compile
[INFO] |  |  +-
org.eclipse.jetty.orbit:javax.servlet:jar:3.0.0.v201112011016:provided
[INFO] |  |  +- org.apache.commo

Re: Problem with jackson lib running on spark

2016-03-31 Thread Alonso Isidoro Roman
Run mvn dependency:tree and print the output here; I suspect that the jackson
library is included within more than one dependency.

Alonso Isidoro Roman.

Mis citas preferidas (de hoy) :
"Si depurar es el proceso de quitar los errores de software, entonces
programar debe ser el proceso de introducirlos..."
 -  Edsger Dijkstra

My favorite quotes (today):
"If debugging is the process of removing software bugs, then programming
must be the process of putting ..."
  - Edsger Dijkstra

"If you pay peanuts you get monkeys"


2016-03-31 19:21 GMT+02:00 Marcelo Oikawa :

> Hey, Ted
>
> 2.4.4
>>
>> Looks like Tranquility uses a different version of jackson.
>>
>> How do you build your jar?
>>
>
> I'm building a jar with dependencies using the maven assembly plugin.
> Below are all of the jackson dependencies:
>
> [INFO]
> com.fasterxml.jackson.module:jackson-module-scala_2.10:jar:2.4.5:compile
> [INFO]com.fasterxml.jackson.core:jackson-databind:jar:2.4.6:compile
> [INFO]
> com.fasterxml.jackson.datatype:jackson-datatype-joda:jar:2.4.6:compile
> [INFO]
> com.fasterxml.jackson.dataformat:jackson-dataformat-smile:jar:2.4.6:compile
> [INFO]
> com.fasterxml.jackson.jaxrs:jackson-jaxrs-smile-provider:jar:2.4.6:compile
> [INFO]com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:jar:2.4.6:compile
> [INFO]
> com.fasterxml.jackson.module:jackson-module-jaxb-annotations:jar:2.4.6:compile
> [INFO]com.fasterxml.jackson.core:jackson-core:jar:2.4.6:compile
> [INFO]
> com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:jar:2.4.6:compile
> [INFO]
> com.fasterxml.jackson.datatype:jackson-datatype-guava:jar:2.4.6:compile
> [INFO]com.fasterxml.jackson.core:jackson-annotations:jar:2.4.6:compile
> [INFO]org.json4s:json4s-jackson_2.10:jar:3.2.10:provided
> [INFO]org.codehaus.jackson:jackson-xc:jar:1.8.3:provided
> [INFO]org.codehaus.jackson:jackson-jaxrs:jar:1.8.3:provided
> [INFO]org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:compile
> [INFO]org.codehaus.jackson:jackson-core-asl:jar:1.9.13:compile
> [INFO]
> com.google.http-client:google-http-client-jackson2:jar:1.15.0-rc:compile
>
> As you can see, my jar requires fasterxml 2.4.6. In that case, what does
> spark do? Does it run my jar with my jackson lib (inside my jar) or does it
> use the jackson version (2.4.4) used by spark?
>
> Note that one of my dependencies is:
>
> <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-streaming_2.10</artifactId>
>     <version>1.6.1</version>
>     <scope>provided</scope>
> </dependency>
>
> and the jackson version 2.4.4 was not listed in maven dependencies...
>
>
>>
>> Consider using maven-shade-plugin to resolve the conflict if you use
>> maven.
>>
>> Cheers
>>
>> On Thu, Mar 31, 2016 at 9:50 AM, Marcelo Oikawa <
>> marcelo.oik...@webradar.com> wrote:
>>
>>> Hi, list.
>>>
>>> We are working on a spark application that sends messages to Druid. For
>>> that, we're using Tranquility core. In my local test, I'm using the
>>> "spark-1.6.1-bin-hadoop2.6" distribution and the following dependencies in
>>> my app:
>>>
>>> <dependency>
>>>     <groupId>org.apache.spark</groupId>
>>>     <artifactId>spark-streaming_2.10</artifactId>
>>>     <version>1.6.1</version>
>>>     <scope>provided</scope>
>>> </dependency>
>>> <dependency>
>>>     <groupId>io.druid</groupId>
>>>     <artifactId>tranquility-core_2.10</artifactId>
>>>     <version>0.7.4</version>
>>> </dependency>
>>>
>>> But I'm getting the error below when Tranquility tries to create the
>>> Tranquilizer object:
>>>
>>> tranquilizer = 
>>> DruidBeams.fromConfig(dataSourceConfig).buildTranquilizer(tranquilizerBuider);
>>>
>>> The stacktrace is down below:
>>>
>>> java.lang.IllegalAccessError: tried to access method
>>> com.fasterxml.jackson.databind.introspect.AnnotatedMember.getAllAnnotations()Lcom/fasterxml/jackson/databind/introspect/AnnotationMap;
>>> from class
>>> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector
>>> at
>>> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector.findInjectableValueId(GuiceAnnotationIntrospector.java:39)
>>> at
>>> com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findInjectableValueId(AnnotationIntrospectorPair.java:269)
>>> at
>>> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._addDeserializerConstructors(BasicDeserializerFactory.java:433)
>>> at
>>> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._constructDefaultValueInstantiator(BasicDeserializerFactory.java:325)
>>> at
>>> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory.findValueInstantiator(BasicDeserializerFactory.java:266)
>>> at
>>> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:266)
>>> at
>>> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:168)
>>> at
>>> com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:399)
>>> at
>>> com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:348)
>>> at
>>> com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:

Re: Problem with jackson lib running on spark

2016-03-31 Thread Marcelo Oikawa
Hey, Ted

2.4.4
>
> Looks like Tranquility uses a different version of jackson.
>
> How do you build your jar?
>

I'm building a jar with dependencies using the maven assembly plugin. Below
are all of the jackson dependencies:

[INFO]
com.fasterxml.jackson.module:jackson-module-scala_2.10:jar:2.4.5:compile
[INFO]com.fasterxml.jackson.core:jackson-databind:jar:2.4.6:compile
[INFO]
com.fasterxml.jackson.datatype:jackson-datatype-joda:jar:2.4.6:compile
[INFO]
com.fasterxml.jackson.dataformat:jackson-dataformat-smile:jar:2.4.6:compile
[INFO]
com.fasterxml.jackson.jaxrs:jackson-jaxrs-smile-provider:jar:2.4.6:compile
[INFO]com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:jar:2.4.6:compile
[INFO]
com.fasterxml.jackson.module:jackson-module-jaxb-annotations:jar:2.4.6:compile
[INFO]com.fasterxml.jackson.core:jackson-core:jar:2.4.6:compile
[INFO]
com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:jar:2.4.6:compile
[INFO]
com.fasterxml.jackson.datatype:jackson-datatype-guava:jar:2.4.6:compile
[INFO]com.fasterxml.jackson.core:jackson-annotations:jar:2.4.6:compile
[INFO]org.json4s:json4s-jackson_2.10:jar:3.2.10:provided
[INFO]org.codehaus.jackson:jackson-xc:jar:1.8.3:provided
[INFO]org.codehaus.jackson:jackson-jaxrs:jar:1.8.3:provided
[INFO]org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:compile
[INFO]org.codehaus.jackson:jackson-core-asl:jar:1.9.13:compile
[INFO]
com.google.http-client:google-http-client-jackson2:jar:1.15.0-rc:compile

As you can see, my jar requires fasterxml 2.4.6. In that case, what does
spark do? Does it run my jar with my jackson lib (inside my jar) or does it
use the jackson version (2.4.4) used by spark?

Note that one of my dependencies is:


<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.10</artifactId>
    <version>1.6.1</version>
    <scope>provided</scope>
</dependency>


and the jackson version 2.4.4 was not listed in maven dependencies...


>
> Consider using maven-shade-plugin to resolve the conflict if you use maven.
>
> Cheers
>
> On Thu, Mar 31, 2016 at 9:50 AM, Marcelo Oikawa <
> marcelo.oik...@webradar.com> wrote:
>
>> Hi, list.
>>
>> We are working on a spark application that sends messages to Druid. For
>> that, we're using Tranquility core. In my local test, I'm using the
>> "spark-1.6.1-bin-hadoop2.6" distribution and the following dependencies in
>> my app:
>>
>> <dependency>
>>     <groupId>org.apache.spark</groupId>
>>     <artifactId>spark-streaming_2.10</artifactId>
>>     <version>1.6.1</version>
>>     <scope>provided</scope>
>> </dependency>
>> <dependency>
>>     <groupId>io.druid</groupId>
>>     <artifactId>tranquility-core_2.10</artifactId>
>>     <version>0.7.4</version>
>> </dependency>
>>
>> But I'm getting the error below when Tranquility tries to create the
>> Tranquilizer object:
>>
>> tranquilizer = 
>> DruidBeams.fromConfig(dataSourceConfig).buildTranquilizer(tranquilizerBuider);
>>
>> The stacktrace is down below:
>>
>> java.lang.IllegalAccessError: tried to access method
>> com.fasterxml.jackson.databind.introspect.AnnotatedMember.getAllAnnotations()Lcom/fasterxml/jackson/databind/introspect/AnnotationMap;
>> from class
>> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector
>> at
>> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector.findInjectableValueId(GuiceAnnotationIntrospector.java:39)
>> at
>> com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findInjectableValueId(AnnotationIntrospectorPair.java:269)
>> at
>> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._addDeserializerConstructors(BasicDeserializerFactory.java:433)
>> at
>> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._constructDefaultValueInstantiator(BasicDeserializerFactory.java:325)
>> at
>> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory.findValueInstantiator(BasicDeserializerFactory.java:266)
>> at
>> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:266)
>> at
>> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:168)
>> at
>> com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:399)
>> at
>> com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:348)
>> at
>> com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:261)
>> at
>> com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:241)
>> at
>> com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:142)
>> at
>> com.fasterxml.jackson.databind.DeserializationContext.findContextualValueDeserializer(DeserializationContext.java:380)
>> at
>> com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.construct(PropertyBasedCreator.java:96)
>> at
>> com.fasterxml.jackson.databind.deser.BeanDeserializerBase.resolve(BeanDeserializerBase.java:413)
>> at
>> com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:292)

Re: Problem with jackson lib running on spark

2016-03-31 Thread Ted Yu
Spark 1.6.1 uses this version of jackson:

2.4.4

Looks like Tranquility uses a different version of jackson.

How do you build your jar?

Consider using maven-shade-plugin to resolve the conflict if you use maven.
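
For illustration only, a minimal maven-shade-plugin sketch that relocates the
bundled com.fasterxml.jackson classes so the fat jar cannot clash with the
jackson version shipped by Spark (the plugin version and the shaded package
name are just examples, not taken from this project):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>2.4.3</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <relocation>
                        <!-- relocate the jar's own jackson so Spark's 2.4.4 no longer interferes -->
                        <pattern>com.fasterxml.jackson</pattern>
                        <shadedPattern>shaded.com.fasterxml.jackson</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>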

Cheers

On Thu, Mar 31, 2016 at 9:50 AM, Marcelo Oikawa  wrote:

> Hi, list.
>
> We are working on a spark application that sends messages to Druid. For
> that, we're using Tranquility core. In my local test, I'm using the
> "spark-1.6.1-bin-hadoop2.6" distribution and the following dependencies in
> my app:
>
> <dependency>
>     <groupId>org.apache.spark</groupId>
>     <artifactId>spark-streaming_2.10</artifactId>
>     <version>1.6.1</version>
>     <scope>provided</scope>
> </dependency>
> <dependency>
>     <groupId>io.druid</groupId>
>     <artifactId>tranquility-core_2.10</artifactId>
>     <version>0.7.4</version>
> </dependency>
>
> But I'm getting the error below when Tranquility tries to create the
> Tranquilizer object:
>
> tranquilizer = 
> DruidBeams.fromConfig(dataSourceConfig).buildTranquilizer(tranquilizerBuider);
>
> The stacktrace is down below:
>
> java.lang.IllegalAccessError: tried to access method
> com.fasterxml.jackson.databind.introspect.AnnotatedMember.getAllAnnotations()Lcom/fasterxml/jackson/databind/introspect/AnnotationMap;
> from class
> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector
> at
> com.fasterxml.jackson.databind.introspect.GuiceAnnotationIntrospector.findInjectableValueId(GuiceAnnotationIntrospector.java:39)
> at
> com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findInjectableValueId(AnnotationIntrospectorPair.java:269)
> at
> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._addDeserializerConstructors(BasicDeserializerFactory.java:433)
> at
> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory._constructDefaultValueInstantiator(BasicDeserializerFactory.java:325)
> at
> com.fasterxml.jackson.databind.deser.BasicDeserializerFactory.findValueInstantiator(BasicDeserializerFactory.java:266)
> at
> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:266)
> at
> com.fasterxml.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:168)
> at
> com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:399)
> at
> com.fasterxml.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:348)
> at
> com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:261)
> at
> com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:241)
> at
> com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:142)
> at
> com.fasterxml.jackson.databind.DeserializationContext.findContextualValueDeserializer(DeserializationContext.java:380)
> at
> com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.construct(PropertyBasedCreator.java:96)
> at
> com.fasterxml.jackson.databind.deser.BeanDeserializerBase.resolve(BeanDeserializerBase.java:413)
> at
> com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:292)
> at
> com.fasterxml.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:241)
> at
> com.fasterxml.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:142)
> at
> com.fasterxml.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:394)
> at
> com.fasterxml.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3169)
> at
> com.fasterxml.jackson.databind.ObjectMapper._convert(ObjectMapper.java:2767)
> at
> com.fasterxml.jackson.databind.ObjectMapper.convertValue(ObjectMapper.java:2700)
> at
> com.metamx.tranquility.druid.DruidBeams$.fromConfigInternal(DruidBeams.scala:192)
> at
> com.metamx.tranquility.druid.DruidBeams$.fromConfig(DruidBeams.scala:119)
> at com.metamx.tranquility.druid.DruidBeams.fromConfig(DruidBeams.scala)
>
> Has someone faced that problem too?
>
> I know that it's related to a jackson lib conflict, but could anyone please
> shed some light? I created a jar with dependencies; when I submit a job
> to spark, it runs with just the libraries inside the jar, right?
> Where is the conflict between the jackson libraries?
>