By the way: This is great work. I am new to the Spark world, and have been like a kid in a candy store learning all it can do.
Is there a good list of build variables? What I mean is something like the SPARK_HIVE variable described on the Spark SQL page. I'd like to include that, but once I found it I wondered whether there were other options I should consider before building. Thanks!

On Fri, May 30, 2014 at 6:52 AM, John Omernik <j...@omernik.com> wrote:
> All:
>
> In the pom.xml file I see the MapR repository, but it's not included in
> the ./project/SparkBuild.scala file. Is this expected? I know that to build I
> have to add it there, otherwise sbt hates me with evil red messages and
> such.
>
> John
>
> On Fri, May 30, 2014 at 6:24 AM, Kousuke Saruta <saru...@oss.nttdata.co.jp> wrote:
>> Hi all,
>>
>> In <https://spark.apache.org/downloads.html>, the URL for the release notes
>> of 1.0.0 seems to be wrong.
>>
>> The URL should be
>> https://spark.apache.org/releases/spark-release-1-0-0.html but it links to
>> https://spark.apache.org/releases/spark-release-1.0.0.html
>>
>> Best Regards,
>> Kousuke
>>
>> *From:* prabeesh k [mailto:prabsma...@gmail.com]
>> *Sent:* Friday, May 30, 2014 8:18 PM
>> *To:* user@spark.apache.org
>> *Subject:* Re: Announcing Spark 1.0.0
>>
>> I forgot to hard refresh.
>>
>> Thanks.
>>
>> On Fri, May 30, 2014 at 4:18 PM, Patrick Wendell <pwend...@gmail.com> wrote:
>>
>> It is updated - try holding "Shift + refresh" in your browser; you are
>> probably caching the page.
>>
>> On Fri, May 30, 2014 at 3:46 AM, prabeesh k <prabsma...@gmail.com> wrote:
>> > Please update the http://spark.apache.org/docs/latest/ link
>> >
>> > On Fri, May 30, 2014 at 4:03 PM, Margusja <mar...@roo.ee> wrote:
>> >>
>> >> Is it possible to download a pre-built package?
>> >> http://mirror.symnds.com/software/Apache/incubator/spark/spark-1.0.0/spark-1.0.0-bin-hadoop2.tgz
>> >> - gives me 404
>> >>
>> >> Best regards,
>> >> Margus (Margusja) Roo
>> >> +372 51 48 780
>> >> http://margus.roo.ee
>> >> http://ee.linkedin.com/in/margusroo
>> >> skype: margusja
>> >> ldapsearch -x -h ldap.sk.ee -b c=EE "(serialNumber=37303140314)"
>> >>
>> >> On 30/05/14 13:18, Christopher Nguyen wrote:
>> >>>
>> >>> Awesome work, Pat et al.!
>> >>>
>> >>> --
>> >>> Christopher T. Nguyen
>> >>> Co-founder & CEO, Adatao <http://adatao.com>
>> >>> linkedin.com/in/ctnguyen <http://linkedin.com/in/ctnguyen>
>> >>>
>> >>> On Fri, May 30, 2014 at 3:12 AM, Patrick Wendell <pwend...@gmail.com> wrote:
>> >>>
>> >>> I'm thrilled to announce the availability of Spark 1.0.0! Spark 1.0.0
>> >>> is a milestone release as the first in the 1.0 line of releases,
>> >>> providing API stability for Spark's core interfaces.
>> >>>
>> >>> Spark 1.0.0 is Spark's largest release ever, with contributions from
>> >>> 117 developers. I'd like to thank everyone involved in this release -
>> >>> it was truly a community effort, with fixes, features, and
>> >>> optimizations contributed from dozens of organizations.
>> >>>
>> >>> This release expands Spark's standard libraries, introducing a new SQL
>> >>> package (SparkSQL) which lets users integrate SQL queries into
>> >>> existing Spark workflows. MLlib, Spark's machine learning library, is
>> >>> expanded with sparse vector support and several new algorithms. The
>> >>> GraphX and Streaming libraries also introduce new features and
>> >>> optimizations. Spark's core engine adds support for secured YARN
>> >>> clusters, a unified tool for submitting Spark applications, and
>> >>> several performance and stability improvements.
>> >>> Finally, Spark adds
>> >>> support for Java 8 lambda syntax and improves coverage of the Java and
>> >>> Python APIs.
>> >>>
>> >>> Those features only scratch the surface - check out the release
>> >>> notes here:
>> >>> http://spark.apache.org/releases/spark-release-1-0-0.html
>> >>>
>> >>> Note that since release artifacts were posted recently, certain
>> >>> mirrors may not have working downloads for a few hours.
>> >>>
>> >>> - Patrick
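[Editor's note on the build-variables question at the top of the thread: the Spark 1.0-era "Building Spark" documentation described a small set of build-time environment variables alongside SPARK_HIVE. The sketch below shows how they were typically combined with the sbt build; treat the exact variable set and version numbers as examples to verify against the docs for your release, since they changed between versions.]

```shell
# Sketch of Spark 1.0-era sbt build variables (verify against the
# "Building Spark" docs for your release before relying on these).

# Choose the Hadoop version to build against:
SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly

# Additionally enable YARN support:
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly

# Additionally enable Hive support for Spark SQL
# (the SPARK_HIVE variable from the Spark SQL programming guide):
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true SPARK_HIVE=true sbt/sbt assembly
```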