Re: Spark on Tomcat has exception IncompatibleClassChangeError: Implementing class

2015-07-26 Thread Zoran Jeremic
Yes, you're right. I didn't get it until now.

Thanks.

On Sun, Jul 26, 2015 at 7:36 AM, Ted Yu  wrote:

> bq. [INFO] \- org.apache.spark:spark-core_2.10:jar:1.4.0:compile
>
> I think the above notation means spark-core_2.10 is the last dependency.
>
> Cheers

Re: Spark on Tomcat has exception IncompatibleClassChangeError: Implementing class

2015-07-26 Thread Ted Yu
bq. [INFO] \- org.apache.spark:spark-core_2.10:jar:1.4.0:compile

I think the above notation means spark-core_2.10 is the last dependency.
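
For illustration, here is a hypothetical excerpt (the artifacts shown under
spark-core are made up): direct dependencies all start at the left margin,
with "+-" marking one that has further siblings below it and "\-" marking
the last one at its level, while transitive children are indented under
their parent.

[INFO] com.example:myapp:war:1.0-SNAPSHOT
[INFO] +- com.google.code.gson:gson:jar:2.2.2:compile
[INFO] \- org.apache.spark:spark-core_2.10:jar:1.4.0:compile
[INFO]    +- org.apache.hadoop:hadoop-client:jar:2.2.0:compile
[INFO]    \- org.scala-lang:scala-library:jar:2.10.4:compile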

Cheers


Re: Spark on Tomcat has exception IncompatibleClassChangeError: Implementing class

2015-07-23 Thread Zoran Jeremic
Hi Yana,

Sorry for the late response. I just saw your email. In the end I ended up with
the following pom: https://www.dropbox.com/s/19fldb9qnnfieck/pom.xml?dl=0
There were multiple problems I had to struggle with. One of them was that my
application had its REST layer implemented with JBoss Jersey, which conflicted
with the Sun Jersey embedded in the Spark libraries, so after several days of
trying to fix that I changed my REST implementation and switched to Sun Jersey.
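
(For anyone hitting the same conflict: a sketch of the other way out, keeping
JBoss Jersey and excluding the transitive Sun Jersey artifacts instead.
Whether those jars really arrive through spark-core should be confirmed with
mvn dependency:tree first, and the "*" wildcard needs Maven 3.2.1 or newer.)

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.4.0</version>
    <exclusions>
        <!-- drop whatever com.sun.jersey jars spark-core pulls in transitively -->
        <exclusion>
            <groupId>com.sun.jersey</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>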

You asked why spark-core shows under gson:

[INFO] +- com.google.code.gson:gson:jar:2.2.2:compile
[INFO] \- org.apache.spark:spark-core_2.10:jar:1.4.0:compile

It's not actually under gson. Both gson and spark-core are root-level
dependencies. Spark-core has child dependencies, so that's why it has
"\-" in front of it.

Zoran


Re: Spark on Tomcat has exception IncompatibleClassChangeError: Implementing class

2015-07-13 Thread Yana Kadiyska
Oh, this is very interesting -- can you explain your dependencies? I'm running
Tomcat 7 and ended up using spark-assembly from WEB-INF/lib and removing the
javax/servlet package from it...but it's a pain in the neck.
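
(The stripping step itself can be scripted, since a jar is just a zip archive;
the assembly file name below is only an example:)

zip -d spark-assembly-1.4.0-hadoop2.4.0.jar 'javax/servlet/*'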
If I'm reading your first message correctly, you use hadoop-common and
spark-core? Although I'm not sure why spark-core shows under gson:

[INFO] +- com.google.code.gson:gson:jar:2.2.2:compile
[INFO] \- org.apache.spark:spark-core_2.10:jar:1.4.0:compile


Do you fat-jar spark-core? Do you have spark-assembly anywhere in Tomcat's
runtime classpath? Curious what a minimal setup here is.


Thanks a lot (I'd love to see your pom if you have it on GitHub or somewhere
accessible).



Re: Spark on Tomcat has exception IncompatibleClassChangeError: Implementing class

2015-07-10 Thread Zoran Jeremic
It looks like there is no problem with Tomcat 8.


Re: Spark on Tomcat has exception IncompatibleClassChangeError: Implementing class

2015-07-10 Thread Zoran Jeremic
Hi Ted,

I'm running Tomcat 7 with Java:

java version "1.8.0_45"
Java(TM) SE Runtime Environment (build 1.8.0_45-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.45-b02, mixed mode)

Zoran



Re: Spark on Tomcat has exception IncompatibleClassChangeError: Implementing class

2015-07-10 Thread Ted Yu
What version of Java is Tomcat running?
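
(If you're not sure, a stock Tomcat install can report it; bin/version.sh
prints the JVM version the server is configured to run with:)

$CATALINA_HOME/bin/version.sh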

Thanks



> On Jul 10, 2015, at 10:09 AM, Zoran Jeremic  wrote:
> 
> Hi,
> 
> I've developed a Maven application that uses the mongo-hadoop connector to pull 
> data from MongoDB and process it using Apache Spark. The whole process runs 
> smoothly when I run it on an embedded Jetty server. However, if I deploy it to 
> Tomcat 7, it is always interrupted at the line of code that collects data from 
> a JavaPairRDD, with an exception that doesn't give me any clue what the 
> problem might be:
> 
> 15/07/09 20:42:05 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in 
> memory on localhost:58302 (size: 6.3 KB, free: 946.6 MB)
>  15/07/09 20:42:05 INFO spark.SparkContext: Created broadcast 0 from 
> newAPIHadoopRDD at SparkCategoryRecommender.java:106
> Exception in thread "Thread-6" java.lang.IncompatibleClassChangeError: 
> Implementing class
> at java.lang.ClassLoader.defineClass1(Native Method)
> at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> at 
> org.apache.catalina.loader.WebappClassLoader.findClassInternal(WebappClassLoader.java:2918)
> at 
> org.apache.catalina.loader.WebappClassLoader.findClass(WebappClassLoader.java:1174)
> at 
> org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1669)
> at 
> org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1547)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:264)
> at 
> org.apache.spark.mapreduce.SparkHadoopMapReduceUtil$class.firstAvailableClass(SparkHadoopMapReduceUtil.scala:74)
> at 
> org.apache.spark.mapreduce.SparkHadoopMapReduceUtil$class.newJobContext(SparkHadoopMapReduceUtil.scala:28)
> at org.apache.spark.rdd.NewHadoopRDD.newJobContext(NewHadoopRDD.scala:66)
> at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:94)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
> at scala.Option.getOrElse(Option.scala:120)
> at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
> at 
> org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
> at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
> at scala.Option.getOrElse(Option.scala:120)
> at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
> at org.apache.spark.SparkContext.runJob(SparkContext.scala:1779)
> at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:885)
> at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
> at 
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:109)
> at org.apache.spark.rdd.RDD.withScope(RDD.scala:286)
> at org.apache.spark.rdd.RDD.collect(RDD.scala:884)
> at 
> org.apache.spark.api.java.JavaRDDLike$class.collect(JavaRDDLike.scala:335)
> at 
> org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:47)
> at 
> com.warrantylife.iqmetrix.spark.SparkCategoryRecommender.buildDataModelOnProductid(SparkCategoryRecommender.java:149)
> at 
> com.warrantylife.iqmetrix.spark.SparkCategoryRecommender.computeAndStoreRecommendationsForProductsInCategory(SparkCategoryRecommender.java:179)
> at 
> com.warrantylife.recommendation.impl.CategoryRecommendationManagerImpl.computeRecommendationsForCategory(CategoryRecommendationManagerImpl.java:105)
> at 
> com.warrantylife.testdata.TestDataGeneratorService$1.run(TestDataGeneratorService.java:55)
> at java.lang.Thread.run(Thread.java:745)
> 
> 
> I guess there is some library conflict happening on Tomcat, but I have no 
> idea where to look for the problem.
> 
> This is my whole dependency tree:
> 
> [INFO] --- maven-dependency-plugin:2.1:tree (default-cli) @ 
> warranty-analytics ---
> [INFO] org.warrantylife:warranty-analytics:war:0.1.0-SNAPSHOT
> [INFO] +- javax.servlet:javax.servlet-api:jar:3.0.1:provided
> [INFO] +- com.sun.jersey:jersey-client:jar:1.19:compile
> [INFO] +- com.sun.jersey:jersey-server:jar:1.19:compile
> [INFO] +- com.sun.jersey:jersey-core:jar:1.19:compile
> [INFO] |  \- javax.ws.rs:jsr311-api:jar:1.1.1:compile
> [INFO] +- com.sun.jersey:jersey-servlet:jar:1.19:compile
> [INFO] +- org.apache.hadoop:hadoop-common:jar:2.4.1:compile
> [INFO] |  +- org.apache.hadoop:hadoop-annotations:jar:2.4.1:compile
> [INFO] |  |  \- jdk.tools:jdk.tools:jar:1.6:system
> [INFO] |  +- com.google.guava:guava:jar:11.0.2:compile
> [INFO] |  +- commons-cli:commons-cli:jar:1.2:compile
> [INFO] |  +- xmlenc:xmlenc:jar:0.52:compile
> [INFO] |  +- commons-httpclient:commons-httpclient:jar:3.1:compile
> [INFO] |  +- commons-codec:commons-codec:jar:1.4:compile
> [INFO] |  +- commons-io:commons-io:jar: