FYI, I filed SPARK-3647 to track the fix (some people internally have
run into this as well).

On Mon, Sep 22, 2014 at 1:28 PM, Cody Koeninger <c...@koeninger.org> wrote:
> We've worked around it for the time being by excluding guava from transitive
> dependencies in the job assembly and pinning the same guava 14 version that
> spark is using.  Obviously things break whenever a guava 15 / 16 feature is
> used at runtime, so a long-term solution is needed.
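>
> In build.sbt terms, the workaround looks roughly like this (a sketch only;
> the connector and driver versions below are illustrative, not necessarily
> the ones we build against):
>
> ```scala
> // Sketch: exclude guava from the transitive deps and pin Spark's guava 14.
> libraryDependencies ++= Seq(
>   ("com.datastax.spark" %% "spark-cassandra-connector" % "1.1.0")
>     .exclude("com.google.guava", "guava"),  // wants guava 15.0
>   ("com.datastax.cassandra" % "cassandra-driver-core" % "2.1.1")
>     .exclude("com.google.guava", "guava"),  // wants guava 16.0.1
>   "com.google.guava" % "guava" % "14.0.1"   // the version Spark ships
> )
> ```
>
> Any connector/driver code that actually calls a guava 15/16-only API will
> still fail at runtime, which is why this is only a stopgap.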
>
> On Mon, Sep 22, 2014 at 3:13 PM, Marcelo Vanzin <van...@cloudera.com> wrote:
>>
>> Hmmm, a quick look at the code indicates this should work for
>> executors, but not for the driver... (maybe this deserves a bug
>> report, if there isn't one already?)
>>
>> If it's feasible for you, you could remove the Optional.class file
>> from the Spark assembly you're using.
>>
>> On Mon, Sep 22, 2014 at 12:46 PM, Cody Koeninger <c...@koeninger.org>
>> wrote:
>> > We're using Mesos, is there a reasonable expectation that
>> > spark.files.userClassPathFirst will actually work?
>> >
>> > On Mon, Sep 22, 2014 at 1:42 PM, Marcelo Vanzin <van...@cloudera.com>
>> > wrote:
>> >>
>> >> Hi Cody,
>> >>
>> >> I'm still writing a test to make sure I understood exactly what's
>> >> going on here, but from looking at the stack trace, it seems like the
>> >> newer Guava library is picking up the "Optional" class from the Spark
>> >> assembly.
>> >>
>> >> Could you try one of the options that put the user's classpath before
>> >> the Spark assembly? (spark.files.userClassPathFirst or
>> >> spark.yarn.user.classpath.first depending on which master you're
>> >> running)
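>> >>
>> >> For example (a sketch; which of these options applies, and whether it
>> >> takes effect on the driver as well as the executors, depends on your
>> >> Spark version and deploy mode):
>> >>
>> >> ```scala
>> >> // Sketch: ask Spark to load the user's jars before its own assembly.
>> >> val conf = new org.apache.spark.SparkConf()
>> >>   .setAppName("guava-16-job")
>> >>   .set("spark.files.userClassPathFirst", "true")   // executors
>> >>   .set("spark.yarn.user.classpath.first", "true")  // when on YARN
>> >> ```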
>> >>
>> >> People seem to have run into issues with those options in the past,
>> >> but if they work for you, then Guava should pick its own Optional
>> >> class (instead of the one shipped with Spark) and things should then
>> >> work.
>> >>
>> >> I'll investigate a way to fix it in Spark in the meantime.
>> >>
>> >>
>> >> On Fri, Sep 19, 2014 at 10:30 PM, Cody Koeninger <c...@koeninger.org>
>> >> wrote:
>> >> > After the recent spark project changes to guava shading, I'm seeing
>> >> > issues with the datastax spark cassandra connector (which depends on
>> >> > guava 15.0) and the datastax cql driver (which depends on guava
>> >> > 16.0.1).
>> >> >
>> >> > Building an assembly for a job (with spark marked as provided) that
>> >> > includes either guava 15.0 or 16.0.1 results in errors like the
>> >> > following:
>> >> >
>> >> > scala> session.close
>> >> >
>> >> > scala> s
>> >> > 14/09/20 04:56:35 ERROR Futures$CombinedFuture: input future failed.
>> >> > java.lang.IllegalAccessError: tried to access class org.spark-project.guava.common.base.Absent from class com.google.common.base.Optional
>> >> >         at com.google.common.base.Optional.absent(Optional.java:79)
>> >> >         at com.google.common.base.Optional.fromNullable(Optional.java:94)
>> >> >         at com.google.common.util.concurrent.Futures$CombinedFuture.setOneValue(Futures.java:1608)
>> >> >         at com.google.common.util.concurrent.Futures$CombinedFuture.access$400(Futures.java:1470)
>> >> >         at com.google.common.util.concurrent.Futures$CombinedFuture$2.run(Futures.java:1548)
>> >> >         at com.google.common.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:297)
>> >> >         at com.google.common.util.concurrent.ExecutionList.executeListener(ExecutionList.java:156)
>> >> >         at com.google.common.util.concurrent.ExecutionList.add(ExecutionList.java:101)
>> >> >         at com.google.common.util.concurrent.AbstractFuture.addListener(AbstractFuture.java:170)
>> >> >         at com.google.common.util.concurrent.Futures$CombinedFuture.init(Futures.java:1545)
>> >> >         at com.google.common.util.concurrent.Futures$CombinedFuture.<init>(Futures.java:1491)
>> >> >         at com.google.common.util.concurrent.Futures.listFuture(Futures.java:1640)
>> >> >         at com.google.common.util.concurrent.Futures.allAsList(Futures.java:983)
>> >> >         at com.datastax.driver.core.CloseFuture$Forwarding.<init>(CloseFuture.java:73)
>> >> >         at com.datastax.driver.core.HostConnectionPool.closeAsync(HostConnectionPool.java:398)
>> >> >         at com.datastax.driver.core.SessionManager.closeAsync(SessionManager.java:157)
>> >> >         at com.datastax.driver.core.SessionManager.close(SessionManager.java:172)
>> >> >         at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$destroySession(CassandraConnector.scala:180)
>> >> >         at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$5.apply(CassandraConnector.scala:151)
>> >> >         at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$5.apply(CassandraConnector.scala:151)
>> >> >         at com.datastax.spark.connector.cql.RefCountedCache.com$datastax$spark$connector$cql$RefCountedCache$$releaseImmediately(RefCountedCache.scala:86)
>> >> >         at com.datastax.spark.connector.cql.RefCountedCache$ReleaseTask.run(RefCountedCache.scala:26)
>> >> >         at com.datastax.spark.connector.cql.RefCountedCache$$anonfun$com$datastax$spark$connector$cql$RefCountedCache$$processPendingReleases$2.apply(RefCountedCache.scala:150)
>> >> >         at com.datastax.spark.connector.cql.RefCountedCache$$anonfun$com$datastax$spark$connector$cql$RefCountedCache$$processPendingReleases$2.apply(RefCountedCache.scala:147)
>> >> >         at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
>> >> >         at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>> >> >         at scala.collection.concurrent.TrieMapIterator.foreach(TrieMap.scala:922)
>> >> >         at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>> >> >         at scala.collection.concurrent.TrieMap.foreach(TrieMap.scala:632)
>> >> >         at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:771)
>> >> >         at com.datastax.spark.connector.cql.RefCountedCache.com$datastax$spark$connector$cql$RefCountedCache$$processPendingReleases(RefCountedCache.scala:147)
>> >> >         at com.datastax.spark.connector.cql.RefCountedCache$$anon$1.run(RefCountedCache.scala:157)
>> >> >         at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> >> >         at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:351)
>> >> >         at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:178)
>> >> >         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
>> >> >         at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>> >> >         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> >> >         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> >> >         at java.lang.Thread.run(Thread.java:722)
>> >>
>> >>
>> >>
>> >> --
>> >> Marcelo
>> >
>> >
>>
>>
>>
>> --
>> Marcelo
>
>



-- 
Marcelo
