Mikhail, I know Joseph was working on the job whose test is failing; I'll cc
him personally to see if he has any idea.  Does this work if you build it on
the cluster?  Since it's the weekend, you probably won't get a response until
Monday.  Let me know if anything is urgent and we can try to find a
workaround.
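
In the meantime, if you need the local build to go through, one thing that
might be worth trying (assuming the root cause really is just the snappy
native library failing to load on your Mac, which is what the
UnsatisfiedLinkError suggests) is to make the test SparkContext use a
compression codec that doesn't need native code. Rough sketch only; the
master and app name below are placeholders, not what the refinery tests
actually use:

    import org.apache.spark.{SparkConf, SparkContext}

    // Local-only workaround sketch: force the lz4 codec so Spark's broadcast
    // path never touches the snappy native library that fails to load here.
    val conf = new SparkConf()
      .setMaster("local[2]")                  // placeholder local master
      .setAppName("refinery-job-local-test")  // hypothetical app name
      .set("spark.io.compression.codec", "lz4")

    val sc = new SparkContext(conf)

That only sidesteps the codec rather than fixing the real problem, but it
might unblock you until Joseph can take a look.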

On Fri, Mar 31, 2017 at 4:45 PM, Mikhail Popov <[email protected]> wrote:

> Hello! Has anyone experienced issues `mvn package`-ing
> analytics/refinery/source on a local machine?
>
> Wikimedia Analytics Refinery Jobs fails for me as of "Add mediawiki history spark jobs to refinery-job" (https://gerrit.wikimedia.org/r/#/c/325312/).
>
> Here's my `mvn --version`:
>
> Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-10T08:41:47-08:00)
> Maven home: /usr/local/Cellar/maven/3.3.9/libexec
> Java version: 1.8.0_121, vendor: Oracle Corporation
> Java home: /Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/jre
> Default locale: en_US, platform encoding: UTF-8
> OS name: "mac os x", version: "10.12.4", arch: "x86_64", family: "mac"
>
> When I set HEAD to the commit prior to that one, everything succeeds. Any
> commit after it makes the Jobs tests fail with warnings and errors like:
>
> [INFO] Checking for multiple versions of scala
> [WARNING]  Expected all dependencies to require Scala version: 2.10.4
> [WARNING]  com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
> [WARNING]  org.spark-project.akka:akka-actor_2.10:2.2.3-shaded-protobuf requires scala version: 2.10.4
> [WARNING]  org.spark-project.akka:akka-remote_2.10:2.2.3-shaded-protobuf requires scala version: 2.10.4
> [WARNING]  org.spark-project.akka:akka-slf4j_2.10:2.2.3-shaded-protobuf requires scala version: 2.10.4
> [WARNING]  org.apache.spark:spark-core_2.10:1.6.0-cdh5.10.0 requires scala version: 2.10.4
> [WARNING]  org.json4s:json4s-jackson_2.10:3.2.10 requires scala version: 2.10.4
> [WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.4
> [WARNING]  org.json4s:json4s-ast_2.10:3.2.10 requires scala version: 2.10.4
> [WARNING]  org.json4s:json4s-core_2.10:3.2.10 requires scala version: 2.10.0
> [WARNING] Multiple versions of scala libraries detected!
>
> TestDenormalizedRevisionsBuilder:
> 17/03/31 13:38:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>
> populateDeleteTime
> java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at org.xerial.snappy.SnappyLoader.loadNativeLibrary(SnappyLoader.java:317)
> at org.xerial.snappy.SnappyLoader.load(SnappyLoader.java:219)
> at org.xerial.snappy.Snappy.<clinit>(Snappy.java:44)
> at org.apache.spark.io.SnappyCompressionCodec$.liftedTree1$1(CompressionCodec.scala:169)
> at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version$lzycompute(CompressionCodec.scala:168)
> at org.apache.spark.io.SnappyCompressionCodec$.org$apache$spark$io$SnappyCompressionCodec$$version(CompressionCodec.scala:168)
> at org.apache.spark.io.SnappyCompressionCodec.<init>(CompressionCodec.scala:152)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:72)
> at org.apache.spark.io.CompressionCodec$.createCodec(CompressionCodec.scala:65)
> at org.apache.spark.broadcast.TorrentBroadcast.org$apache$spark$broadcast$TorrentBroadcast$$setConf(TorrentBroadcast.scala:73)
> at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:80)
> at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
> at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:63)
> at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1334)
> at org.apache.spark.scheduler.DAGScheduler.submitMissingTasks(DAGScheduler.scala:1006)
> at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:921)
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$submitStage$4.apply(DAGScheduler.scala:924)
> at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$submitStage$4.apply(DAGScheduler.scala:923)
> at scala.collection.immutable.List.foreach(List.scala:318)
> at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$submitStage(DAGScheduler.scala:923)
> at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:861)
> at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1611)
> at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1603)
> at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1592)
> at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
> Caused by: java.lang.UnsatisfiedLinkError: no snappyjava in java.library.path
> at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
> at java.lang.Runtime.loadLibrary0(Runtime.java:870)
> at java.lang.System.loadLibrary(System.java:1122)
> at org.xerial.snappy.SnappyNativeLoader.loadLibrary(SnappyNativeLoader.java:52)
> ... 33 more
> - should put max rev ts when no page state match *** FAILED ***
>   org.apache.spark.SparkException: Job aborted due to stage failure: Task serialization failed: java.lang.reflect.InvocationTargetException
>
_______________________________________________
Analytics mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/analytics
