Re: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-18 Thread Steve Loughran

On 16 May 2015, at 04:39, Anton Brazhnyk anton.brazh...@genesys.com wrote:

For me it wouldn’t help, I guess, because those newer classes would still be 
loaded by a different classloader.
What did work for me with 1.3.1 was removing those classes from Spark’s jar 
completely, so they get loaded from the external Guava (the version I prefer) and 
by the classloader I expect.


Note that Hadoop <= 2.6.0 won't work with Guava >= 17.0; see HADOOP-11032.

FWIW Guava is a version nightmare across the Hadoop stack; almost as bad as 
protobuf.jar. With Hadoop 2.7+, Hadoop will run on later Guava versions; it'll 
just continue to ship an older one to avoid breaking apps that expect it.
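
A quick way to see which Guava versions your own build pulls in from the stack, 
assuming a Maven build (this is the standard maven-dependency-plugin goal, not 
something from this thread):

    mvn dependency:tree -Dincludes=com.google.guava:guava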


Re: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-15 Thread Marcelo Vanzin
On Fri, May 15, 2015 at 2:35 PM, Thomas Dudziak tom...@gmail.com wrote:

 I've just been through this exact case with shaded guava in our Mesos
 setup and that is how it behaves there (with Spark 1.3.1).


If that's the case, it's a bug in the Mesos backend, since the spark.*
options should behave exactly the same as SPARK_CLASSPATH. It would be nice
to know whether that is also the case in 1.4 (I took a quick look at the
related code and it seems correct), but I don't have Mesos around to test.





-- 
Marcelo


RE: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-15 Thread Anton Brazhnyk
For me it wouldn’t help, I guess, because those newer classes would still be 
loaded by a different classloader.
What did work for me with 1.3.1 was removing those classes from Spark’s jar 
completely, so they get loaded from the external Guava (the version I prefer) and 
by the classloader I expect.
That’s why I proposed putting them into a separate Maven artifact, where they 
could simply be excluded in the build of any app that depends on Spark.
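
A minimal sketch of that workaround, using the jar name from later in this 
thread (the external Guava version and the verification step are illustrative, 
not from the thread):

    # Strip the unshaded Guava classes from the Spark jar that ships them,
    # so they resolve from an external Guava on the classpath instead:
    zip -d spark-network-common_2.10-1.3.1.jar 'com/google/common/*'
    # Verify nothing from the Guava package is left behind:
    unzip -l spark-network-common_2.10-1.3.1.jar | grep 'com/google/common' || echo clean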

From: Marcelo Vanzin [mailto:van...@cloudera.com]
Sent: Friday, May 15, 2015 11:55 AM
To: Anton Brazhnyk
Cc: user@spark.apache.org
Subject: Re: Spark's Guava pieces cause exceptions in non-trivial deployments

Ah, I see. Yeah, it sucks that Spark has to expose Optional (and the things it 
depends on), but removing that would break the public API, so...
One last thing you could try is to add your newer Guava jar to 
spark.driver.extraClassPath and spark.executor.extraClassPath. Those 
settings will place your jars before Spark's in the classpath, so you'd 
actually be using the newer versions of the conflicting classes everywhere.
It does require manually distributing the Guava jar to the same location on all 
nodes in the cluster, though.
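
A sketch of that suggestion as spark-submit flags (the Guava path and the app 
coordinates are placeholders; the jar must sit at the same path on every node):

    spark-submit \
      --conf spark.driver.extraClassPath=/opt/libs/guava-16.0.1.jar \
      --conf spark.executor.extraClassPath=/opt/libs/guava-16.0.1.jar \
      --class my.app.Main myapp.jar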

If that doesn't work, Thomas's suggestion of shading Guava in your app can be 
used as a last resort.


--
Marcelo


Re: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-15 Thread Thomas Dudziak
This is still a problem in 1.3. Optional is both used in several shaded
classes within Guava (e.g. the Immutable* classes) and itself uses shaded
classes (e.g. AbstractIterator). This causes problems in application code.
The only reliable way we've found around this is to shade Guava ourselves
for application code and thus avoid the problem altogether.
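
A sketch of how to verify that such shading took effect, assuming the build 
relocated com.google.common to a private prefix (e.g. via the Maven shade 
plugin's relocation feature or sbt-assembly's ShadeRule.rename; the jar and 
prefix names below are made up):

    # No class should remain under the original Guava package in the app jar:
    unzip -l myapp-shaded.jar | grep 'com/google/common' && echo 'shading incomplete'
    # The relocated copies should show up under the private prefix instead:
    unzip -l myapp-shaded.jar | grep 'myshaded/guava' | head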

cheers,
Tom




Re: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-15 Thread Marcelo Vanzin
On Fri, May 15, 2015 at 11:56 AM, Thomas Dudziak tom...@gmail.com wrote:

 Actually the extraClassPath settings put the extra jars at the end of the
 classpath so they won't help. Only the deprecated SPARK_CLASSPATH puts them
 at the front.


That's definitely not the case for YARN:
https://github.com/apache/spark/blob/master/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala#L1013

And it's been like that for as long as I can remember.

I'm almost sure that's also the case for standalone, at least in master /
1.4, since I touched a lot of that code recently.

It would be really weird if those options worked differently from
SPARK_CLASSPATH, since they were meant to replace it.
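
One way to settle the ordering question empirically rather than by reading the 
code (these are standard JVM/Spark flags; the app coordinates are placeholders):

    # -verbose:class makes a JVM log which jar each class is loaded from.
    # The driver's output shows up here; the executors' ends up in their
    # stderr logs on the cluster.
    spark-submit \
      --driver-java-options -verbose:class \
      --conf spark.executor.extraJavaOptions=-verbose:class \
      --class my.app.Main myapp.jar 2>&1 | grep 'com.google.common'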


-- 
Marcelo


Re: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-15 Thread Thomas Dudziak
Actually the extraClassPath settings put the extra jars at the end of the
classpath so they won't help. Only the deprecated SPARK_CLASSPATH puts them
at the front.

cheers,
Tom




Re: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-15 Thread Thomas Dudziak
I've just been through this exact case with shaded guava in our Mesos setup
and that is how it behaves there (with Spark 1.3.1).

cheers,
Tom




Re: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-14 Thread Marcelo Vanzin
What version of Spark are you using?

The bug you mention is only about the Optional class (and a handful of
others, but none of the classes you're having problems with). All other
Guava classes should be shaded since Spark 1.2, so you should be able to
use your own version of Guava with no problems (aside from the Optional
classes).
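
One way to check exactly which Guava classes a given Spark build still ships 
un-relocated (the assembly jar name is whatever your distribution uses; this 
command is a sketch, not from the thread):

    # On 1.2+, only Optional and the handful of classes it needs should
    # show up under the original package:
    unzip -l spark-assembly-1.3.1-hadoop2.6.0.jar | grep 'com/google/common'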

Also, Spark 1.3 added some improvements to how shading is done, so if
you're using 1.2 I'd recommend trying 1.3 before declaring defeat.


On Thu, May 14, 2015 at 4:52 PM, Anton Brazhnyk anton.brazh...@genesys.com
wrote:

  Greetings,



 I have a relatively complex application with Spark, Jetty and Guava (16)
 that do not fit together.

 The exception happens when some components use a “mix” of Guava classes
 (including Spark’s pieces) loaded by different classloaders:

 java.lang.LinkageError: loader constraint violation: when resolving method
 com.google.common.collect.Iterables.transform(Ljava/lang/Iterable;Lcom/google/common/base/Function;)Ljava/lang/Iterable;
 the class loader (instance of org/eclipse/jetty/webapp/WebAppClassLoader)
 of the current class, org/apache/cassandra/db/ColumnFamilyStore, and the
 class loader (instance of java/net/URLClassLoader) for resolved class,
 com/google/common/collect/Iterables, have different Class objects for the
 type com/google/common/base/Function used in the signature



 According to https://issues.apache.org/jira/browse/SPARK-4819 it’s not
 going to be fixed at least until Spark 2.0, but maybe some workaround is
 possible?

 Those classes are pretty simple and unlikely to change significantly in
 Guava, so any “external” Guava can provide them.



 So, could such problems be fixed if those Spark pieces of Guava were in a
 separate jar that could be excluded from the mix (substituted by an
 “external” Guava)?





 Thanks,

 Anton




-- 
Marcelo


RE: Spark's Guava pieces cause exceptions in non-trivial deployments

2015-05-14 Thread Anton Brazhnyk
The problem is with 1.3.1: it has the Function class (mentioned in the 
exception) in spark-network-common_2.10-1.3.1.jar.
Our current resolution is actually a backport to 1.2.2, which is working fine.

