Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-12 Thread Maninda Edirisooriya
Hi Rasika,

A new version of the orbit bundle was released based on [1]. After the analytics
team releases the related analytics components, we just have to update the p2
repo with the latest analytics versions. Then the issue will be fixed.

[1] https://github.com/wso2/orbit/pull/242

Thanks.


*Maninda Edirisooriya*
Senior Software Engineer

*WSO2, Inc.*lean.enterprise.middleware.

*Blog* : http://maninda.blogspot.com/
*E-mail* : mani...@wso2.com
*Skype* : @manindae
*Twitter* : @maninda

Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-12 Thread Rasika Perera
Hi Waruna/Maninda,

Any update on this?

Thanks,
Rasika

Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-12 Thread Isuru Perera
Did anyone try Niranda's suggestion to upgrade the Jackson version in Spark?

Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-05 Thread Niranda Perera
Hi Ruwan,

Did we try upgrading the Jackson version in Spark? I'm hoping that there
are no API changes in Jackson 2.8.3.

We have done a similar exercise for Guava and the Hadoop client.

Best

Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-04 Thread Ruwan Yatawara
Hi Niranda,

Are you referring to Spark core? If so, it is bound to the json4s-jackson
bundle.

If we change the Jackson version of metrics-json, we will have to make an
orbit bundle out of it. From the way I see it, metrics-json must have
included said version range in an attempt to make the bundle future-proof.
(The latest released version of jackson-core is 2.8.3 [1].)

Given that we have to push out a release in a week's time, changing the
Jackson version of Spark is not a feasible option.

Therefore, I am +1 for changing the Jackson version range of metrics-json to
[2.4.0,2.5.0).

[1] -
https://mvnrepository.com/artifact/com.fasterxml.jackson.core/jackson-core
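
As a rough sketch of what pinning that range in an orbit bundle could look
like with the Felix maven-bundle-plugin (illustrative only; the packages and
ranges below are assumptions, not the actual orbit configuration in the repo):

<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- keep metrics-json's Jackson import on the range Spark is wired to -->
      <Import-Package>
        com.fasterxml.jackson.core.*;version="[2.4.0,2.5.0)",
        com.fasterxml.jackson.databind.*;version="[2.4.0,2.5.0)",
        *
      </Import-Package>
    </instructions>
  </configuration>
</plugin>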

Thanks and Regards,

Ruwan Yatawara

Associate Technical Lead,
WSO2 Inc.

email : ruw...@wso2.com
mobile : +94 77 9110413
blog : http://ruwansrants.blogspot.com/
  https://500px.com/ruwan_ace
www: :http://wso2.com


Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-04 Thread Niranda Perera
Hi Maninda,

What are the two Jackson versions here?

Best

Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-04 Thread Maninda Edirisooriya
+ SameeraJ

As far as we have found, the issue is due to two versions of the Jackson
bundles existing in the IoT server pack. This was not the case in DAS,
because it is IoT that has the APIM dependencies which bring the newer
version of Jackson into the environment. As Spark uses the older version of
Jackson and Metrics uses the newer version, importing the Metrics bundle
into the Spark bundle fails at the OSGi level, because the packages exported
by Metrics use some Jackson packages.
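
To illustrate the coupling (this is just a generic usage sketch of
metrics-json, not WSO2 code): MetricsModule is itself a Jackson databind
Module, so whichever bundle imports com.codahale.metrics.json also needs a
Jackson wiring compatible with the one metrics-json resolved to.

import java.util.concurrent.TimeUnit;
import com.codahale.metrics.MetricRegistry;
import com.codahale.metrics.json.MetricsModule;
import com.fasterxml.jackson.databind.ObjectMapper;

public class MetricsJsonSketch {
    public static void main(String[] args) throws Exception {
        MetricRegistry registry = new MetricRegistry();
        registry.counter("requests").inc();

        // MetricsModule extends com.fasterxml.jackson.databind.Module, so the
        // Jackson types it was compiled against leak into the importer's class space.
        ObjectMapper mapper = new ObjectMapper()
                .registerModule(new MetricsModule(TimeUnit.SECONDS, TimeUnit.MILLISECONDS, false));
        System.out.println(mapper.writeValueAsString(registry));
    }
}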

This has several potential solutions, each with inherent issues.

1. Release a new version of the Metrics bundle with the same older Jackson
dependency. - Releasing against an older version of a dependency may be
unsuitable in the long term, and if APIM features start to import the
Metrics bundle in the future, the issue will reappear on that import.

2. Release a new version of Spark that works with the newer Jackson bundles.
- As the Spark bundle only functions correctly with Jackson 2.4.4 (the older
version) and does not work properly with later versions of Jackson, we will
not be able to easily release a new Spark version without fixing that issue.

3. Remove the DAS components from the IoT server and package them as a
separate IoT Analytics server. - Some customers may want to run DAS inside
IoT, and removing the DAS components would affect the user experience of a
WSO2 product evaluator who wants to run everything in a single server.

Please help to find the best approach.

Thanks.


*Maninda Edirisooriya*
Senior Software Engineer

*WSO2, Inc.*lean.enterprise.middleware.

*Blog* : http://maninda.blogspot.com/
*E-mail* : mani...@wso2.com
*Skype* : @manindae
*Twitter* : @maninda

Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-04 Thread Ruwan Yatawara
Hi Niranda,

Yes, this bundle is active. We found this Jackson related problem upon
further debugging.

Thanks and Regards,

Ruwan Yatawara

Associate Technical Lead,
WSO2 Inc.

email : ruw...@wso2.com
mobile : +94 77 9110413
blog : http://ruwansrants.blogspot.com/
  https://500px.com/ruwan_ace
www: :http://wso2.com


Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-04 Thread Niranda Perera
+ RuwanY

@Waruna, can you check if the com.codahale.metrics.json bundle is active or
not from the OSGI console?
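
For reference, checking that from the standard Equinox console would look
roughly like this (the bundle id will differ per pack):

osgi> ss metrics                            (bundle states; plain "ss" lists everything)
osgi> diag <bundle-id>                      (unresolved constraints, if any)
osgi> packages com.fasterxml.jackson.core   (which bundles export/import the package)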

Best

Re: [Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-04 Thread Waruna Jayaweera
[Looping Niranda,Anjana]

[Dev] [Analytics] NoClassDefFoundError:com.codahale.metrics.json.MetricsModule when installing latest analytics features in IOT server

2016-10-03 Thread Waruna Jayaweera
Hi,
After moving to the latest analytics version (1.2.8), we are getting a class
not found error [1].

This is due to a package import conflict between the Spark bundle and
io.dropwizard.metrics.json, which import different versions of the Jackson
packages. The IoT server packs multiple Jackson versions, 2.4.4 and 2.8.2.
The Spark bundle has a Jackson import range of [2.4.0,2.5.0), so it is wired
to jackson-core 2.4.4.
The io.dropwizard.metrics.json bundle has a Jackson import range of [2.4,3),
so it is wired to jackson-core 2.8.2.
Spark is also required to import io.dropwizard.metrics.json, but that fails
due to the two different versions of the Jackson packages in the Spark
bundle's class space.
So we need to either upgrade the Spark Jackson version range to [2.4,3) or
downgrade the Metrics Jackson version range to [2.4.0,2.5.0).
Appreciate any suggestions to fix the issue.
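
As a side note, a throwaway snippet like the one below (purely illustrative,
not part of the product) can confirm which Jackson a given class space
actually resolves to at runtime:

import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonWiringCheck {
    public static void main(String[] args) {
        // Prints the Jackson version visible from this class space, e.g. 2.4.4 or 2.8.2.
        System.out.println(new ObjectMapper().version());
        // Shows where jackson-databind was loaded from (may be null under some class loaders).
        System.out.println(ObjectMapper.class.getProtectionDomain().getCodeSource());
    }
}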

[1]
ERROR - AnalyticsComponent Error initializing analytics executor: Unable to
create analytics client. com/codahale/metrics/json/MetricsModule
org.wso2.carbon.analytics.datasource.commons.exception.AnalyticsException:
Unable to create analytics client. com/codahale/metrics/json/MetricsModule
at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.initializeSparkContext(SparkAnalyticsExecutor.java:321)
at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.initializeAnalyticsClientLocal(SparkAnalyticsExecutor.java:303)
at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.initializeAnalyticsClient(SparkAnalyticsExecutor.java:292)
at org.wso2.carbon.analytics.spark.core.internal.SparkAnalyticsExecutor.initializeSparkServer(SparkAnalyticsExecutor.java:180)
at org.wso2.carbon.analytics.spark.core.internal.AnalyticsComponent.activate(AnalyticsComponent.java:88)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.eclipse.equinox.internal.ds.model.ServiceComponent.activate(ServiceComponent.java:260)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.activate(ServiceComponentProp.java:146)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.build(ServiceComponentProp.java:345)
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponent(InstanceProcess.java:620)
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponents(InstanceProcess.java:197)
at org.eclipse.equinox.internal.ds.Resolver.getEligible(Resolver.java:343)
at org.eclipse.equinox.internal.ds.SCRManager.serviceChanged(SCRManager.java:222)
at org.eclipse.osgi.internal.serviceregistry.FilteredServiceListener.serviceChanged(FilteredServiceListener.java:107)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.dispatchEvent(BundleContextImpl.java:861)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:148)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEventPrivileged(ServiceRegistry.java:819)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEvent(ServiceRegistry.java:771)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistrationImpl.register(ServiceRegistrationImpl.java:130)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.registerService(ServiceRegistry.java:214)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.registerService(BundleContextImpl.java:433)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.registerService(BundleContextImpl.java:451)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.registerService(BundleContextImpl.java:950)
at org.wso2.carbon.analytics.dataservice.core.AnalyticsDataServiceComponent.activate(AnalyticsDataServiceComponent.java:72)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.eclipse.equinox.internal.ds.model.ServiceComponent.activate(ServiceComponent.java:260)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.activate(ServiceComponentProp.java:146)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.build(ServiceComponentProp.java:345)
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponent(InstanceProcess.java:620)
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponents(InstanceProcess.java:197)
at org.eclipse.equinox.internal.ds.Resolver.getEligible(Resolver.java:343)
at org.eclipse.equinox.internal.ds.SCRManager.serviceChanged(SCRManager.java:222)
at org.eclipse.osgi.internal.serviceregistry.FilteredServiceListener.serviceChanged(Fil