Thanks! I realize that manipulating the published version in the pom is a
bit inconvenient, but it's really useful to have clear version identifiers
when we're juggling different versions and testing them out. For example,
this will come in handy when we compare 1.4.0-rc1 and 1.4.0-rc2 in a couple
of weeks :)
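(As a concrete sketch of what I mean -- the plugin invocation and cache path below are my assumptions based on the standard versions-maven-plugin and the default ~/.m2 layout, not anything prescribed in this thread -- stamping an rc version and clearing a polluted cache entry might look like:)

```shell
# Illustrative only: in a Spark checkout, stamp every module's pom with an
# rc-qualified version before staging:
#   mvn versions:set -DnewVersion=1.4.0-rc1

# Clear any cached artifacts that claim to be 1.4.0 but came from a staging
# repository (path assumes the default local-repository layout):
M2_REPO="${HOME}/.m2/repository"
STALE="${M2_REPO}/org/apache/spark/spark-core_2.10/1.4.0"
rm -rf "$STALE"   # safe if the directory doesn't exist
```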

Punya

On Tue, May 19, 2015 at 12:39 PM Patrick Wendell <pwend...@gmail.com> wrote:

> Punya,
>
> Let me see if I can publish these under rc1 as well. In the future
> this will all be automated, but currently it's a somewhat manual task.
>
> - Patrick
>
> On Tue, May 19, 2015 at 9:32 AM, Punyashloka Biswal
> <punya.bis...@gmail.com> wrote:
> > When publishing future RCs to the staging repository, would it be
> > possible to use a version number that includes the "rc1" designation?
> > In the current setup, when I run a build against the artifacts at
> > https://repository.apache.org/content/repositories/orgapachespark-1092/org/apache/spark/spark-core_2.10/1.4.0/ ,
> > my local Maven cache will get polluted with things that claim to be 1.4.0
> > but aren't. It would be preferable for the version number to be 1.4.0-rc1
> > instead.
> >
> > Thanks!
> > Punya
> >
> >
> > On Tue, May 19, 2015 at 12:20 PM Sean Owen <so...@cloudera.com> wrote:
> >>
> >> Before I vote, I wanted to point out there are still 9 Blockers for
> >> 1.4.0. I'd like to use this status to really mean "must happen before
> >> the release". Many of these may already be fixed, or aren't really
> >> blockers -- they can just be updated accordingly.
> >>
> >> I bet at least one will require further work if it's really meant for
> >> 1.4, so all this means is that there is likely to be another RC. We
> >> should still kick the tires on RC1.
> >>
> >> (I also assume we should be extra conservative about what is merged into
> >> 1.4 at this point.)
> >>
> >>
> >> SPARK-6784 SQL Clean up all the inbound/outbound conversions for DateType
> >> Adrian Wang
> >>
> >> SPARK-6811 SparkR Building binary R packages for SparkR
> >> Shivaram Venkataraman
> >>
> >> SPARK-6941 SQL Provide a better error message to explain that tables
> >> created from RDDs are immutable
> >>
> >> SPARK-7158 SQL collect and take return different results
> >>
> >> SPARK-7478 SQL Add a SQLContext.getOrCreate to maintain a singleton
> >> instance of SQLContext
> >> Tathagata Das
> >>
> >> SPARK-7616 SQL Overwriting a partitioned parquet table corrupt data
> >> Cheng Lian
> >>
> >> SPARK-7654 SQL DataFrameReader and DataFrameWriter for input/output API
> >> Reynold Xin
> >>
> >> SPARK-7662 SQL Exception of multi-attribute generator analysis in
> >> projection
> >>
> >> SPARK-7713 SQL Use shared broadcast hadoop conf for partitioned table scan.
> >> Yin Huai
> >>
> >>
> >> On Tue, May 19, 2015 at 5:10 PM, Patrick Wendell <pwend...@gmail.com>
> >> wrote:
> >>>
> >>> Please vote on releasing the following candidate as Apache Spark
> >>> version 1.4.0!
> >>>
> >>> The tag to be voted on is v1.4.0-rc1 (commit 777a081):
> >>> https://git-wip-us.apache.org/repos/asf?p=spark.git;a=commit;h=777a08166f1fb144146ba32581d4632c3466541e
> >>>
> >>> The release files, including signatures, digests, etc. can be found at:
> >>> http://people.apache.org/~pwendell/spark-1.4.0-rc1/
> >>>
> >>> Release artifacts are signed with the following key:
> >>> https://people.apache.org/keys/committer/pwendell.asc
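A sketch of checking a downloaded artifact against that key -- the key URL is from this email, but the artifact filename here is an assumed example, and the commands only run if gpg and the file are actually present:

```shell
# Hedged sketch: verify a release artifact's detached signature.
KEY_URL="https://people.apache.org/keys/committer/pwendell.asc"
ARTIFACT="spark-1.4.0.tgz"   # assumed example name; use the file you downloaded

if command -v gpg >/dev/null 2>&1 && [ -f "$ARTIFACT" ]; then
  # Import the signing key, then check the .asc signature against the file.
  curl -sSL "$KEY_URL" | gpg --import
  gpg --verify "${ARTIFACT}.asc" "$ARTIFACT"
else
  echo "skipping verification: gpg or ${ARTIFACT} not available"
fi
```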
> >>>
> >>> The staging repository for this release can be found at:
> >>> https://repository.apache.org/content/repositories/orgapachespark-1092/
> >>>
> >>> The documentation corresponding to this release can be found at:
> >>> http://people.apache.org/~pwendell/spark-1.4.0-rc1-docs/
> >>>
> >>> Please vote on releasing this package as Apache Spark 1.4.0!
> >>>
> >>> The vote is open until Friday, May 22, at 17:03 UTC and passes
> >>> if a majority of at least 3 +1 PMC votes are cast.
> >>>
> >>> [ ] +1 Release this package as Apache Spark 1.4.0
> >>> [ ] -1 Do not release this package because ...
> >>>
> >>> To learn more about Apache Spark, please see
> >>> http://spark.apache.org/
> >>>
> >>> == How can I help test this release? ==
> >>> If you are a Spark user, you can help us test this release by
> >>> taking a Spark 1.3 workload, running it on this release candidate,
> >>> and reporting any regressions.
> >>>
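One way to point an existing workload's build at this candidate -- a sketch using standard Maven pom configuration; the repository URL is the one from this email, and note the staged version string is plain 1.4.0, not 1.4.0-rc1:

```xml
<!-- Sketch: resolve the staged RC artifacts in a test project's pom. -->
<repositories>
  <repository>
    <id>spark-1.4.0-rc1-staging</id>
    <url>https://repository.apache.org/content/repositories/orgapachespark-1092/</url>
  </repository>
</repositories>

<!-- ...and depend on the staged build: -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
</dependency>
```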
> >>> == What justifies a -1 vote for this release? ==
> >>> This vote is happening towards the end of the 1.4 QA period,
> >>> so -1 votes should only occur for significant regressions from 1.3.1.
> >>> Bugs already present in 1.3.X, minor regressions, or bugs related
> >>> to new features will not block this release.
> >>>
> >>> ---------------------------------------------------------------------
> >>> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> >>> For additional commands, e-mail: dev-h...@spark.apache.org
> >>>
> >>
> >
>
