Looks like that's localized to the community module; I'm fine shipping this
release and fixing the packaging in a quick point release.
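
A quick way to see which poms declare or shade fastutil (a sketch, assuming
GNU grep and the unpacked source tree):

$ grep -rl --include=pom.xml fastutil .

If only the community/spark-cli-drivers side turns up, that supports keeping
the fix scoped to a point release.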

+1 binding

On Tue, Mar 5, 2019 at 10:54 AM Andrew Musselman <a...@apache.org> wrote:

> Hm, yeah, it's commented out in the spark-cli-drivers
> dependency-reduced.xml; trying with that uncommented now.
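>
> For a quick check of whether the class lands in any of the shipped jars (a
> sketch, assuming the unpacked binary layout with the lib/ directory the
> shell script puts on the CLASSPATH):
>
> $ for j in mahout-0.14.0/lib/*.jar; do jar tf "$j" | grep -q fastutil && echo "$j"; done
>
> If nothing prints, the shade config is still dropping it.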
>
> On Tue, Mar 5, 2019 at 10:34 AM Andrew Palumbo <ap....@outlook.com> wrote:
>
>> maybe we just need to add fastutil to the shell pom?
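>>
>> Worth checking first whether it's already on the shell module's resolved
>> classpath (a sketch, run from the shell module directory):
>>
>> $ mvn dependency:tree -Dincludes=it.unimi.dsi:fastutil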
>> ________________________________
>> From: Andrew Musselman <andrew.mussel...@gmail.com>
>> Sent: Monday, March 4, 2019 12:19 PM
>> To: Mahout Dev List
>> Subject: Re: 0.14.0 RC2
>>
>> Running the example in the README gives a class-not-found error:
>> "java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap"
>>
>> If that's just us still using something that's been removed, it's not a
>> deal-breaker for me as long as we fix it in a quick point release.
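>>
>> As a stopgap for anyone else testing, dropping a fastutil jar into lib/
>> should work, since the launcher adds lib/ to the CLASSPATH (a sketch;
>> 8.1.0 is just a guess at the version our poms want):
>>
>> $ wget https://repo1.maven.org/maven2/it/unimi/dsi/fastutil/fastutil/8.1.0/fastutil-8.1.0.jar
>> $ mv fastutil-8.1.0.jar mahout-0.14.0/lib/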
>>
>> Pending that being a simple fix, my vote is +1 binding; if Andy's not
>> back from vacation and his proxy works, that's +2 binding from me and Andy.
>>
>>
>> bob $ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
>> bob $ export MAHOUT_HOME=/home/akm/a/src/test/repository.apache.org/content/repositories/orgapachemahout-1052/org/apache/mahout/mahout/0.14.0
>> bob $ export SPARK_HOME=/home/akm/a/src/spark-2.1.0-bin-hadoop2.7
>> bob $ MASTER=local[2] mahout-0.14.0/bin/mahout spark-shell
>> Adding lib/ to CLASSPATH
>> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
>> Setting default log level to "WARN".
>> To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
>> 19/03/04 09:07:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> 19/03/04 09:07:44 WARN Utils: Your hostname, Bob resolves to a loopback address: 127.0.1.1; using 10.0.1.2 instead (on interface eno1)
>> 19/03/04 09:07:44 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
>> 19/03/04 09:07:53 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
>> Spark context Web UI available at http://10.0.1.2:4040
>> Spark context available as 'sc' (master = local[2], app id = local-1551719265339).
>> Spark session available as 'spark'.
>> Loading /home/akm/a/src/test/repository.apache.org/content/repositories/orgapachemahout-1052/org/apache/mahout/mahout/0.14.0/mahout-0.14.0/bin/load-shell.scala
>> .
>> ..
>> import org.apache.mahout.math._
>> import org.apache.mahout.math.scalabindings._
>> import org.apache.mahout.math.drm._
>> import org.apache.mahout.math.scalabindings.RLikeOps._
>> import org.apache.mahout.math.drm.RLikeDrmOps._
>> import org.apache.mahout.sparkbindings._
>> sdc: org.apache.mahout.sparkbindings.SparkDistributedContext = org.apache.mahout.sparkbindings.SparkDistributedContext@749ffdc7
>>
>>                  _                 _
>>  _ __ ___   __ _| |__   ___  _   _| |_
>> | '_ ` _ \ / _` | '_ \ / _ \| | | | __|
>> | | | | | | (_| | | | | (_) | |_| | |_
>> |_| |_| |_|\__,_|_| |_|\___/ \__,_|\__|  version 0.14.0
>>
>>
>>
>> That file does not exist
>>
>> Welcome to
>>       ____              __
>>      / __/__  ___ _____/ /__
>>     _\ \/ _ \/ _ `/ __/  '_/
>>    /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
>>       /_/
>>
>> Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_191)
>> Type in expressions to have them evaluated.
>> Type :help for more information.
>>
>> scala> :load /home/akm/a/src/test/repository.apache.org/content/repositories/orgapachemahout-1052/org/apache/mahout/mahout/0.14.0/mahout-0.14.0/examples/bin/SparseSparseDrmTimer.mscala
>> Loading /home/akm/a/src/test/repository.apache.org/content/repositories/orgapachemahout-1052/org/apache/mahout/mahout/0.14.0/mahout-0.14.0/examples/bin/SparseSparseDrmTimer.mscala
>> .
>> ..
>> timeSparseDRMMMul: (m: Int, n: Int, s: Int, para: Int, pctDense: Double, seed: Long)Long
>>
>> scala> timeSparseDRMMMul(1000,1000,1000,1,.02,1234L)
>> 19/03/04 09:13:13 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 1)
>> java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
>>     at org.apache.mahout.math.RandomAccessSparseVector.<init>(RandomAccessSparseVector.java:49)
>>     at org.apache.mahout.math.RandomAccessSparseVector.<init>(RandomAccessSparseVector.java:44)
>>     at org.apache.mahout.sparkbindings.SparkEngine$$anonfun$11$$anonfun$apply$2.apply(SparkEngine.scala:200)
>>     at org.apache.mahout.sparkbindings.SparkEngine$$anonfun$11$$anonfun$apply$2.apply(SparkEngine.scala:200)
>>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>>     at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
>>     at scala.collection.immutable.Range.foreach(Range.scala:160)
>>     at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
>>     at scala.collection.AbstractTraversable.map(Traversable.scala:104)
>>     at org.apache.mahout.sparkbindings.SparkEngine$$anonfun$11.apply(SparkEngine.scala:200)
>>     at org.apache.mahout.sparkbindings.SparkEngine$$anonfun$11.apply(SparkEngine.scala:195)
>>     at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
>>     at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
>>     at scala.collection.Iterator$class.isEmpty(Iterator.scala:330)
>>     at scala.collection.AbstractIterator.isEmpty(Iterator.scala:1336)
>>     at org.apache.mahout.sparkbindings.drm.package$$anonfun$blockify$1.apply(package.scala:55)
>>     at org.apache.mahout.sparkbindings.drm.package$$anonfun$blockify$1.apply(package.scala:53)
>>     at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
>>     at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
>>     at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>>     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
>>     at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
>>     at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>>     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
>>     at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
>>     at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>>     at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
>>     at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
>>     at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
>>     at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
>>     at org.apache.spark.scheduler.Task.run(Task.scala:99)
>>     at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>>     at java.lang.Thread.run(Thread.java:748)
>> 19/03/04 09:13:13 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 0)
>> java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
>>     [same stack trace as above]
>> Caused by: java.lang.ClassNotFoundException: it.unimi.dsi.fastutil.ints.Int2DoubleOpenHashMap
>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>     ... 35 more
>> 19/03/04 09:13:13 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 1, localhost, executor driver): java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
>>     [same stack trace as above]
>>
>> 19/03/04 09:13:13 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
>> 19/03/04 09:13:13 WARN TaskSetManager: Lost task 0.0 in stage 1.0 (TID 0, localhost, executor driver): java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
>>     [same stack trace and ClassNotFoundException cause as above]
>>
>> 19/03/04 09:13:13 ERROR TaskSetManager: Task 0 in stage 1.0 failed 1 times; aborting job
>> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 1, localhost, executor driver): java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
>>     [same stack trace as above]
>>
>> Driver stacktrace:
>>   at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1435)
>>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1423)
>>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1422)
>>   at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>>   at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
>>   at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1422)
>>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
>>   at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:802)
>>   at scala.Option.foreach(Option.scala:257)
>>   at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:802)
>>   at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1650)
>>   at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1605)
>>   at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1594)
>>   at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
>>   at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:628)
>>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
>>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
>>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
>>   at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
>>   at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:935)
>>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
>>   at org.apache.spark.rdd.RDD.collect(RDD.scala:934)
>>   at org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark.collect(CheckpointedDrmSpark.scala:128)
>>   at org.apache.mahout.math.drm.package$.drm2InCore(package.scala:98)
>>   at timeSparseDRMMMul(<console>:87)
>>   ... 60 elided
>> Caused by: java.lang.NoClassDefFoundError: it/unimi/dsi/fastutil/ints/Int2DoubleOpenHashMap
>>   [same stack trace as above]
>>
>> scala>
>>
>>
>> On Mon, Mar 4, 2019 at 7:27 AM Trevor Grant <trevor.d.gr...@gmail.com>
>> wrote:
>>
>> > +1 binding.
>> >
>> > Steps
>> > ```
>> > rm -rf ~/.m2/repository/org/apache/mahout
>> > wget https://repository.apache.org/content/repositories/orgapachemahout-1052/org/apache/mahout/mahout/0.14.0/mahout-0.14.0-source-release.zip
>> > unzip mahout-0.14.0-source-release.zip
>> > cd mahout-0.14.0
>> > mvn clean install
>> > ```
>> >
>> > yields:
>> > ```
>> > [INFO] ------------------------------------------------------------------------
>> > [INFO] Reactor Summary:
>> > [INFO]
>> > [INFO] Apache Mahout ...................................... SUCCESS [  2.033 s]
>> > [INFO] Mahout Core ........................................ SUCCESS [04:06 min]
>> > [INFO] Mahout Engine ...................................... SUCCESS [  0.122 s]
>> > [INFO] - Mahout HDFS Support .............................. SUCCESS [ 10.330 s]
>> > [INFO] - Mahout Spark Engine .............................. SUCCESS [01:36 min]
>> > [INFO] Mahout Community ................................... SUCCESS [  0.469 s]
>> > [INFO] ------------------------------------------------------------------------
>> > [INFO] BUILD SUCCESS
>> > [INFO] ------------------------------------------------------------------------
>> > [INFO] Total time: 05:56 min
>> > [INFO] Finished at: 2019-03-04T09:13:09-06:00
>> > [INFO] Final Memory: 77M/1085M
>> > [INFO] ------------------------------------------------------------------------
>> > ```
>> >
>> > Other info:
>> > $ java -version
>> > openjdk version "1.8.0_171"
>> > OpenJDK Runtime Environment (build 1.8.0_171-8u171-b11-0ubuntu0.17.10.1-b11)
>> > OpenJDK 64-Bit Server VM (build 25.171-b11, mixed mode)
>> >
>> > $ mvn -v
>> > Apache Maven 3.5.0
>> >
>> >
>> >
>> > On Fri, Mar 1, 2019 at 3:10 PM Andrew Musselman <a...@apache.org> wrote:
>> >
>> > > Just remembered md5 is deprecated per ASF release guidelines; sha1 sums are good too:
>> > >
>> > > bob $ sha1sum mahout-0.14.0.pom
>> > > dfd3e920e652302823279e01a0a5ab4c819cd54c  mahout-0.14.0.pom
>> > > bob $ cat mahout-0.14.0.pom.sha1
>> > > dfd3e920e652302823279e01a0a5ab4c819cd54c
>> > > bob $ sha1sum mahout-0.14.0-source-release.zip
>> > > 63daeccdfdd6fc4b2014ad4a35c30c54a08b4a2b  mahout-0.14.0-source-release.zip
>> > > bob $ cat mahout-0.14.0-source-release.zip.sha1
>> > > 63daeccdfdd6fc4b2014ad4a35c30c54a08b4a2b
>> > > bob $ sha1sum mahout-0.14.0-tests.jar
>> > > 83d3c495430e4082be4df418b846ab32573e154f  mahout-0.14.0-tests.jar
>> > > bob $ cat mahout-0.14.0-tests.jar.sha1
>> > > 83d3c495430e4082be4df418b846ab32573e154f
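>> > >
>> > > To check them all in one go (a sketch; assumes each .sha1 file holds just the bare hash, as above):
>> > >
>> > > for f in mahout-0.14.0*.sha1; do
>> > >   [ "$(cat "$f")" = "$(sha1sum "${f%.sha1}" | awk '{print $1}')" ] && echo "$f OK" || echo "$f MISMATCH"
>> > > done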
>> > >
>> > > On Fri, Mar 1, 2019 at 11:07 AM Andrew Musselman <a...@apache.org> wrote:
>> > >
>> > > > New build out at https://repository.apache.org/content/repositories/orgapachemahout-1052
>> > > >
>> > > > Builds, checksums are correct, and signatures are good; I'll be testing over the weekend.
>> > > >
>> > > > [INFO] Apache Mahout ...................................... SUCCESS [  1.217 s]
>> > > > [INFO] Mahout Core ........................................ SUCCESS [03:10 min]
>> > > > [INFO] Mahout Engine ...................................... SUCCESS [  0.080 s]
>> > > > [INFO] - Mahout HDFS Support .............................. SUCCESS [  4.711 s]
>> > > > [INFO] - Mahout Spark Engine .............................. SUCCESS [ 59.623 s]
>> > > > [INFO] Mahout Community ................................... SUCCESS [  0.376 s]
>> > > > [INFO] ------------------------------------------------------------------------
>> > > > [INFO] BUILD SUCCESS
>> > > > [INFO] ------------------------------------------------------------------------
>> > > > [INFO] Total time: 04:16 min
>> > > > [INFO] Finished at: 2019-03-01T10:59:01-08:00
>> > > > [INFO] Final Memory: 71M/1281M
>> > > > [INFO] ------------------------------------------------------------------------
>> > > >
>> > > >
>> > > > bob $ ls
>> > > > index.html         mahout-0.14.0.pom.asc
>> > > > mahout-0.14.0-source-release.zip
>> > > > mahout-0.14.0-source-release.zip.sha1  mahout-0.14.0-tests.jar.md5
>> > > > mahout-0.14.0      mahout-0.14.0.pom.md5
>> > > > mahout-0.14.0-source-release.zip.asc
>> > > > mahout-0.14.0-tests.jar                mahout-0.14.0-tests.jar.sha1
>> > > > mahout-0.14.0.pom  mahout-0.14.0.pom.sha1
>> > > > mahout-0.14.0-source-release.zip.md5  mahout-0.14.0-tests.jar.asc
>> > > >
>> > > > bob $ gpg mahout-0.14.0.pom.asc
>> > > > gpg: assuming signed data in `mahout-0.14.0.pom'
>> > > > gpg: Signature made Fri 01 Mar 2019 09:59:00 AM PST using RSA key ID 140A5BE9
>> > > > gpg: Good signature from "Andrew K Musselman (ASF Signing Key) <a...@apache.org>"
>> > > > bob $ gpg mahout-0.14.0-source-release.zip.asc
>> > > > gpg: assuming signed data in `mahout-0.14.0-source-release.zip'
>> > > > gpg: Signature made Fri 01 Mar 2019 09:59:00 AM PST using RSA key ID 140A5BE9
>> > > > gpg: Good signature from "Andrew K Musselman (ASF Signing Key) <a...@apache.org>"
>> > > > bob $ gpg mahout-0.14.0-tests.jar.asc
>> > > > gpg: assuming signed data in `mahout-0.14.0-tests.jar'
>> > > > gpg: Signature made Fri 01 Mar 2019 09:59:00 AM PST using RSA key ID 140A5BE9
>> > > > gpg: Good signature from "Andrew K Musselman (ASF Signing Key) <a...@apache.org>"
>> > > >
>> > > > bob $ md5sum mahout-0.14.0.pom
>> > > > 5a2c22802d443eb96afb1afb3f38e9c8  mahout-0.14.0.pom
>> > > > bob $ cat mahout-0.14.0.pom.md5
>> > > > 5a2c22802d443eb96afb1afb3f38e9c8
>> > > > bob $ md5sum mahout-0.14.0-source-release.zip
>> > > > b6eadad5cdd69f0eccae38f2eebefdd0  mahout-0.14.0-source-release.zip
>> > > > bob $ cat mahout-0.14.0-source-release.zip.md5
>> > > > b6eadad5cdd69f0eccae38f2eebefdd0
>> > > > bob $ md5sum mahout-0.14.0-tests.jar
>> > > > bef852667898c41fd3c95fef71d6325c  mahout-0.14.0-tests.jar
>> > > > bob $ cat mahout-0.14.0-tests.jar.md5
>> > > > bef852667898c41fd3c95fef71d6325c
>> > > >
>> > >
>> >
>>
>
