Mike,
The broken Configuration link can be fixed by adding the missing dash '-' to
the first line of docs/configuration.md and running 'jekyll build'.
https://github.com/apache/spark/pull/6513
On Fri, May 29, 2015 at 6:38 PM, Mike Ringenburg mik...@cray.com wrote:
The Configuration link on the
Hi,
After testing Spark 1.0.1-RC2 on EC2 instances from the standard Ubuntu and
Amazon Linux AMIs,
I've noticed MLlib's dependency on the gfortran library (libgfortran.so.3).
sbt assembly succeeds without this library installed, but sbt test
fails as follows.
I'm wondering if documenting this dependency at
http://spark.apache.org/docs/latest/mllib-guide.html would be helpful.
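[Editor's note: a quick way to check for the missing runtime is sketched below in Python; the apt/yum package names in the comments are assumptions for those AMIs, not taken from this thread.]

```python
from ctypes.util import find_library

# Ask the system dynamic linker whether the gfortran runtime is
# visible; MLlib's native linear-algebra code needs it at run/test
# time even though 'sbt assembly' succeeds without it.
def gfortran_present():
    return find_library("gfortran") is not None

if gfortran_present():
    print("libgfortran found")
else:
    # Assumed package names: 'libgfortran3' (Ubuntu),
    # 'gcc-gfortran' (Amazon Linux); check your distro's repos.
    print("libgfortran missing; MLlib tests that hit native code may fail")
```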
On Wed, Jul 9, 2014 at 7:35 PM, Taka Shinagawa taka.epsi...@gmail.com
wrote:
Hi,
After testing Spark 1.0.1-RC2 on EC2 instances from the standard Ubuntu
and
Amazon Linux AMIs,
I've noticed MLlib's dependency on the gfortran library (libgfortran.so.3).
[...]
(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
On Tue, Jul 1, 2014 at 1:04 AM, Patrick Wendell pwend...@gmail.com wrote:
Do those also happen if you run other hadoop versions (e.g. try 1.0.4)?
On Tue, Jul 1, 2014 at 1:00 AM, Taka Shinagawa wrote:
Since Spark 1.0.0, I've been seeing multiple errors when running sbt test.
I ran the following commands from Spark 1.0.1 RC1 on Mac OSX 10.9.2.
$ sbt/sbt clean
$ SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
$ sbt/sbt test
I'm attaching the log file generated by sbt test.
Here's the summary: