Re: Announcing Spark 1.1.1!
Because this was a maintenance release, we should not have introduced any binary backward or forward incompatibilities. Applications written and compiled against 1.1.0 should therefore still work against a 1.1.1 cluster, and vice versa.

On Wed, Dec 3, 2014 at 1:30 PM, Andrew Or wrote:
> By the Spark server do you mean the standalone Master? It is best if they are upgraded together because there have been changes to the Master in 1.1.1. Although it might "just work", it's highly recommended to restart your cluster manager too.
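To illustrate the point above, upgrading only the application-side dependency is a one-line build change. A minimal sketch in sbt, assuming a typical standalone deployment (the project name and the `provided` scope are hypothetical; the artifact coordinates are the standard Spark ones):

```scala
// build.sbt -- hypothetical application build, bumping only the client library.
// The cluster it submits to may still be running 1.1.0; per the thread above,
// the maintenance release is expected to be binary-compatible in both directions.
name := "my-spark-app"

scalaVersion := "2.10.4"

// %% appends the Scala binary version to the artifact (spark-core_2.10).
// "provided" assumes the cluster supplies Spark at runtime via spark-submit.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.1" % "provided"
```

Even so, as noted above, it is still recommended to upgrade and restart the standalone Master alongside the application when practical.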
Re: Announcing Spark 1.1.1!
By the Spark server do you mean the standalone Master? It is best if they are upgraded together, because there have been changes to the Master in 1.1.1. Although it might "just work", it's highly recommended to restart your cluster manager too.

2014-12-03 13:19 GMT-08:00 Romi Kuntsman:
> About version compatibility and upgrade path - can the Java application dependencies and the Spark server be upgraded separately (i.e. will a 1.1.0 library work with a 1.1.1 server, and vice versa), or do they need to be upgraded together?
Re: Announcing Spark 1.1.1!
About version compatibility and the upgrade path - can the Java application dependencies and the Spark server be upgraded separately (i.e. will a 1.1.0 library work with a 1.1.1 server, and vice versa), or do they need to be upgraded together?

Thanks!

*Romi Kuntsman*, *Big Data Engineer*
http://www.totango.com

On Tue, Dec 2, 2014 at 11:36 PM, Andrew Or wrote:
> I am happy to announce the availability of Spark 1.1.1!
Re: Announcing Spark 1.1.1!
Andrew and developers, thank you for an excellent release! It fixed almost all of our issues. We are now migrating to Spark from a zoo of Python, Java, Hive, and Pig jobs. Our Scala/Spark jobs often failed on 1.1; Spark 1.1.1 works like a Swiss watch.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Announcing-Spark-1-1-1-tp20195p20251.html
Announcing Spark 1.1.1!
I am happy to announce the availability of Spark 1.1.1! This is a maintenance release with many bug fixes, most of which are concentrated in the core. These include fixes to sort-based shuffle, memory leaks, and spilling issues. The release contains contributions from 55 developers.

Visit the release notes [1] to read about the new features, or download [2] the release today.

[1] http://spark.apache.org/releases/spark-release-1-1-1.html
[2] http://spark.apache.org/downloads.html

Please e-mail me directly about any typos in the release notes or name listing.

Thanks to everyone who contributed, and congratulations!
-Andrew