I am intermittently running into Guava dependency issues across multiple
Spark projects. I have tried Maven shade/relocate, but it does not
resolve the issues.
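For reference, the shade relocation I attempted was roughly along these
lines (the plugin version and shaded package name below are from memory,
not copied from the actual pom):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.1.1</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- relocate Guava used by my code so it cannot clash with the
               Guava pulled in transitively by Spark/Hadoop -->
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>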

The current project is extremely simple: *no* additional dependencies
beyond Scala, Spark, and ScalaTest, yet the issues remain (and yes,
mvn clean was re-applied).
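For context, the dependency section is essentially just the following
(version numbers here are illustrative, not copied from the actual pom):

<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>2.11.12</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.3.0</version>
  </dependency>
  <dependency>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest_2.11</artifactId>
    <version>3.0.5</version>
    <scope>test</scope>
  </dependency>
</dependencies>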

Is there a reliable approach to handling Guava versioning within
Spark-dependent projects?
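For example, would explicitly pinning Guava via dependencyManagement
(something like the below; the version is chosen only for illustration)
be considered reliable, or does that just mask the underlying conflict?

<dependencyManagement>
  <dependencies>
    <!-- force a single Guava version across all transitive dependencies -->
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>14.0.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>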


[INFO] ------------------------------------------------------------------------
[INFO] Building ccapps_final 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ ccapps_final ---
18/05/07 10:24:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[WARNING]
java.lang.NoSuchMethodError: com.google.common.cache.CacheBuilder.refreshAfterWrite(JLjava/util/concurrent/TimeUnit;)Lcom/google/common/cache/CacheBuilder;
at org.apache.hadoop.security.Groups.<init>(Groups.java:96)
at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2424)
at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2424)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2424)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:295)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2516)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:918)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:910)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:910)
at luisvictorsteve.SlackSentiment$.getSpark(SlackSentiment.scala:10)
at luisvictorsteve.SlackSentiment$.main(SlackSentiment.scala:16)
at luisvictorsteve.SlackSentiment.main(SlackSentiment.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:282)
at java.lang.Thread.run(Thread.java:748)
[INFO] ------------------------------------------------------------------------
