Re: [VOTE] Release Apache Spark 1.4.0 (RC3)

2015-05-29 Thread Taka Shinagawa
Mike,

The broken Configuration link can be fixed by adding the missing dash '-' to
the first line of docs/configuration.md and running 'jekyll build'.

https://github.com/apache/spark/pull/6513
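For anyone else hitting this: the first line of docs/configuration.md opens the Jekyll front matter, which must be exactly three dashes; with one dash missing, Jekyll treats the block as plain text and the generated link breaks. A minimal sketch of the check and fix (the sample file contents and the sed invocation are illustrative only; the actual patch is in the PR above):

```shell
# Illustrative only: reproduce a front-matter block whose opening line lost a
# dash, restore it, and confirm. The real fix is the one-character PR above.
printf -- '--\nlayout: global\ntitle: Spark Configuration\n---\n' > configuration.md
sed -i.bak '1s/^--$/---/' configuration.md   # opening line must be exactly "---"
head -n 1 configuration.md                   # prints "---"
# jekyll build   # then regenerate the site from the docs/ directory
```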

On Fri, May 29, 2015 at 6:38 PM, Mike Ringenburg mik...@cray.com wrote:

  The Configuration link on the docs appears to be broken.

  Mike



 On May 29, 2015, at 4:41 PM, Patrick Wendell pwend...@gmail.com wrote:

  Please vote on releasing the following candidate as Apache Spark version
 1.4.0!

 The tag to be voted on is v1.4.0-rc3 (commit dd109a8):

 https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=dd109a8746ec07c7c83995890fc2c0cd7a693730

 The release files, including signatures, digests, etc. can be found at:
 http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc3-bin/
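One hedged way to sanity-check a downloaded artifact against its digest, sketched with synthetic files (the filenames here are made up; substitute the actual tarball and digest file from the URL above, and check the signature analogously with gpg --verify against pwendell.asc):

```shell
# Illustrative digest check using a synthetic file standing in for the
# release tarball; replace both filenames with the real downloaded artifacts.
echo "release contents" > spark-1.4.0-bin.tgz
sha512sum spark-1.4.0-bin.tgz > spark-1.4.0-bin.tgz.sha512
sha512sum -c spark-1.4.0-bin.tgz.sha512   # prints "spark-1.4.0-bin.tgz: OK"
```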

 Release artifacts are signed with the following key:
 https://people.apache.org/keys/committer/pwendell.asc

 The staging repository for this release can be found at:
 [published as version: 1.4.0]
 https://repository.apache.org/content/repositories/orgapachespark-1109/
 [published as version: 1.4.0-rc3]
 https://repository.apache.org/content/repositories/orgapachespark-1110/

 The documentation corresponding to this release can be found at:
 http://people.apache.org/~pwendell/spark-releases/spark-1.4.0-rc3-docs/

 Please vote on releasing this package as Apache Spark 1.4.0!

 The vote is open until Tuesday, June 02, at 00:32 UTC and passes
 if a majority of at least 3 +1 PMC votes are cast.

 [ ] +1 Release this package as Apache Spark 1.4.0
 [ ] -1 Do not release this package because ...

 To learn more about Apache Spark, please see
 http://spark.apache.org/

 == What has changed since RC1 ==
 Below is a list of bug fixes that went into this RC:
 http://s.apache.org/vN

 == How can I help test this release? ==
 If you are a Spark user, you can help us test this release by
 taking a Spark 1.3 workload, running it on this release candidate,
 and reporting any regressions.

 == What justifies a -1 vote for this release? ==
 This vote is happening towards the end of the 1.4 QA period,
 so -1 votes should only occur for significant regressions from 1.3.1.
 Bugs already present in 1.3.X, minor regressions, or bugs related
 to new features will not block this release.

 -
 To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
 For additional commands, e-mail: dev-h...@spark.apache.org




libgfortran Dependency

2014-07-09 Thread Taka Shinagawa
Hi,

After testing Spark 1.0.1-RC2 on EC2 instances from the standard Ubuntu and
Amazon Linux AMIs,
I've noticed MLlib's dependency on the gfortran library (libgfortran.so.3).

sbt assembly succeeds without this library installed, but sbt test
fails as follows.

I'm wondering if documenting this dependency in the README and online docs
might be a good idea.

-
[info] ALSSuite:
-- org.jblas ERROR Couldn't load copied link file:
java.lang.UnsatisfiedLinkError:
/tmp/jblas8312335435391185287libjblas_arch_flavor.so: libgfortran.so.3:
cannot open shared object file: No such file or directory.

On Linux 64bit, you need additional support libraries.
You need to install libgfortran3.

For example for debian or Ubuntu, type sudo apt-get install libgfortran3

For more information, see
https://github.com/mikiobraun/jblas/wiki/Missing-Libraries
[info] Exception encountered when attempting to run a suite with class
name: org.apache.spark.mllib.recommendation.ALSSuite *** ABORTED ***
[info]   java.lang.UnsatisfiedLinkError:
org.jblas.NativeBlas.dgemm(CCIIID[DII[DIID[DII)V
[info]   at org.jblas.NativeBlas.dgemm(Native Method)
[info]   at org.jblas.SimpleBlas.gemm(SimpleBlas.java:251)
[info]   at org.jblas.DoubleMatrix.mmuli(DoubleMatrix.java:1697)
[info]   at org.jblas.DoubleMatrix.mmul(DoubleMatrix.java:3054)
[info]   at
org.apache.spark.mllib.recommendation.ALSSuite$.generateRatings(ALSSuite.scala:67)
[info]   at
org.apache.spark.mllib.recommendation.ALSSuite.testALS(ALSSuite.scala:167)
[info]   at
org.apache.spark.mllib.recommendation.ALSSuite$$anonfun$3.apply$mcV$sp(ALSSuite.scala:83)
[info]   at
org.apache.spark.mllib.recommendation.ALSSuite$$anonfun$3.apply(ALSSuite.scala:83)
[info]   at
org.apache.spark.mllib.recommendation.ALSSuite$$anonfun$3.apply(ALSSuite.scala:83)
[info]   at
org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info]   ...

-
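A hedged way to check up front whether the runtime library is resolvable, rather than discovering it from the jblas failure mid-suite (the directory list covers common Linux layouts and may need adjusting for your distro):

```shell
# Look for libgfortran.so.3 in common Linux library directories.
found=no
for d in /usr/lib /usr/lib64 /usr/lib/x86_64-linux-gnu /lib /lib64; do
  if [ -e "$d/libgfortran.so.3" ]; then found=yes; fi
done
echo "libgfortran.so.3 present: $found"
# if "no": on Debian/Ubuntu, sudo apt-get install libgfortran3
```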


Re: libgfortran Dependency

2014-07-09 Thread Taka Shinagawa
Thanks for pointing me to the MLlib guide. I was only looking at the README and
the Spark docs.

I also found it's already filed in JIRA:
https://spark-project.atlassian.net/browse/SPARK-797


On Wed, Jul 9, 2014 at 7:45 PM, Xiangrui Meng men...@gmail.com wrote:

 It is documented in the official doc:
 http://spark.apache.org/docs/latest/mllib-guide.html

 On Wed, Jul 9, 2014 at 7:35 PM, Taka Shinagawa taka.epsi...@gmail.com
 wrote:
  Hi,
 
  After testing Spark 1.0.1-RC2 on EC2 instances from the standard Ubuntu
 and Amazon Linux AMIs,
  I've noticed MLlib's dependency on the gfortran library (libgfortran.so.3).
 
  sbt assembly succeeds without this library installed, but sbt test
  fails as follows.
 
  I'm wondering if documenting this dependency in the README and online docs
 might be a good idea.
 
  -
  [info] ALSSuite:
  -- org.jblas ERROR Couldn't load copied link file:
  java.lang.UnsatisfiedLinkError:
  /tmp/jblas8312335435391185287libjblas_arch_flavor.so: libgfortran.so.3:
  cannot open shared object file: No such file or directory.
 
  On Linux 64bit, you need additional support libraries.
  You need to install libgfortran3.
 
  For example for debian or Ubuntu, type sudo apt-get install
 libgfortran3
 
  For more information, see
  https://github.com/mikiobraun/jblas/wiki/Missing-Libraries
  [info] Exception encountered when attempting to run a suite with class
  name: org.apache.spark.mllib.recommendation.ALSSuite *** ABORTED ***
  [info]   java.lang.UnsatisfiedLinkError:
  org.jblas.NativeBlas.dgemm(CCIIID[DII[DIID[DII)V
  [info]   at org.jblas.NativeBlas.dgemm(Native Method)
  [info]   at org.jblas.SimpleBlas.gemm(SimpleBlas.java:251)
  [info]   at org.jblas.DoubleMatrix.mmuli(DoubleMatrix.java:1697)
  [info]   at org.jblas.DoubleMatrix.mmul(DoubleMatrix.java:3054)
  [info]   at
 
 org.apache.spark.mllib.recommendation.ALSSuite$.generateRatings(ALSSuite.scala:67)
  [info]   at
 
 org.apache.spark.mllib.recommendation.ALSSuite.testALS(ALSSuite.scala:167)
  [info]   at
 
 org.apache.spark.mllib.recommendation.ALSSuite$$anonfun$3.apply$mcV$sp(ALSSuite.scala:83)
  [info]   at
 
 org.apache.spark.mllib.recommendation.ALSSuite$$anonfun$3.apply(ALSSuite.scala:83)
  [info]   at
 
 org.apache.spark.mllib.recommendation.ALSSuite$$anonfun$3.apply(ALSSuite.scala:83)
  [info]   at
  org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
  [info]   ...
 
  -



Re: Errors from Sbt Test

2014-07-02 Thread Taka Shinagawa
(ActorCell.scala:338)

at akka.actor.LocalActorRef.stop(ActorRef.scala:340)

at akka.actor.dungeon.Children$class.stop(Children.scala:66)

at akka.actor.ActorCell.stop(ActorCell.scala:338)

at
akka.actor.dungeon.FaultHandling$$anonfun$terminate$1.apply(FaultHandling.scala:149)

at
akka.actor.dungeon.FaultHandling$$anonfun$terminate$1.apply(FaultHandling.scala:149)

at scala.collection.Iterator$class.foreach(Iterator.scala:727)

at
akka.util.Collections$PartialImmutableValuesIterable$$anon$1.foreach(Collections.scala:27)

at
akka.util.Collections$PartialImmutableValuesIterable.foreach(Collections.scala:52)

at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:149)

at akka.actor.ActorCell.terminate(ActorCell.scala:338)

at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:431)

at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)

at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)

at akka.dispatch.Mailbox.run(Mailbox.scala:218)

at
akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)

at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)

at
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)

at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)

at
scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)


On Tue, Jul 1, 2014 at 1:04 AM, Patrick Wendell pwend...@gmail.com wrote:

 Do those also happen if you run other hadoop versions (e.g. try 1.0.4)?

 On Tue, Jul 1, 2014 at 1:00 AM, Taka Shinagawa taka.epsi...@gmail.com
 wrote:
  Since Spark 1.0.0, I've been seeing multiple errors when running sbt
 test.
 
  I ran the following commands from Spark 1.0.1 RC1 on Mac OS X 10.9.2.
 
  $ sbt/sbt clean
  $ SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
  $ sbt/sbt test
 
 
  I'm attaching the log file generated by the sbt test.
 
  Here's the summary part of the test.
 
  [info] Run completed in 30 minutes, 57 seconds.
  [info] Total number of tests run: 605
  [info] Suites: completed 83, aborted 0
  [info] Tests: succeeded 600, failed 5, canceled 0, ignored 5, pending 0
  [info] *** 5 TESTS FAILED ***
  [error] Failed: Total 653, Failed 5, Errors 0, Passed 648, Ignored 5
  [error] Failed tests:
  [error] org.apache.spark.ShuffleNettySuite
  [error] org.apache.spark.ShuffleSuite
  [error] org.apache.spark.FileServerSuite
  [error] org.apache.spark.DistributedSuite
  [error] (core/test:test) sbt.TestsFailedException: Tests unsuccessful
  [error] Total time: 2033 s, completed Jul 1, 2014 12:08:03 AM
 
  Is anyone else seeing errors like this?
 
 
  Thanks,
  Taka



Errors from Sbt Test

2014-07-01 Thread Taka Shinagawa
Since Spark 1.0.0, I've been seeing multiple errors when running sbt test.

I ran the following commands from Spark 1.0.1 RC1 on Mac OS X 10.9.2.

$ sbt/sbt clean
$ SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
$ sbt/sbt test


I'm attaching the log file generated by the sbt test.

Here's the summary part of the test.

[info] Run completed in 30 minutes, 57 seconds.
[info] Total number of tests run: 605
[info] Suites: completed 83, aborted 0
[info] Tests: succeeded 600, failed 5, canceled 0, ignored 5, pending 0
[info] *** 5 TESTS FAILED ***
[error] Failed: Total 653, Failed 5, Errors 0, Passed 648, Ignored 5
[error] Failed tests:
[error] org.apache.spark.ShuffleNettySuite
[error] org.apache.spark.ShuffleSuite
[error] org.apache.spark.FileServerSuite
[error] org.apache.spark.DistributedSuite
[error] (core/test:test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 2033 s, completed Jul 1, 2014 12:08:03 AM
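When a run ends like this, it can help to pull the failed suite names out of the log and rerun them one at a time; a rough sketch (the log filename here is synthetic, and the rerun hint assumes sbt's test-only task):

```shell
# Synthetic excerpt standing in for the real sbt log; point the sed command
# at the actual log file from the failed run.
cat > sbt-test.log <<'EOF'
[error] Failed tests:
[error] org.apache.spark.ShuffleSuite
[error] org.apache.spark.FileServerSuite
EOF
sed -n 's/^\[error\] \(org\.apache\.spark\..*\)$/\1/p' sbt-test.log
# each printed suite can then be rerun in isolation, e.g.
#   sbt/sbt "test-only org.apache.spark.ShuffleSuite"
```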

Is anyone else seeing errors like this?


Thanks,
Taka