+1 (non-binding)

* verified the signatures and SHA-512 checksums for all files (there's a glitch, which I describe below; a verification sketch follows this list)
* built the source (DISCLAIMER: didn't run tests), made a custom distribution, and built a Docker image based on that distribution - used profiles: kubernetes, hadoop-3.2, hadoop-cloud
* ran some Structured Streaming (SS) PySpark queries (Rate to Kafka, Kafka to Kafka) with Spark on k8s, using MinIO (S3-compatible) as the checkpoint location - for the Kafka reader, tested both approaches: the newer one (offsets via admin client) and the older one (offsets via consumer); see the sketch after this list
* ran a simple batch query with the magic committer against MinIO storage, and exercised dynamic volume provisioning (with NFS)
* verified DataStreamReader.table & DataStreamWriter.toTable work in PySpark (which exercises the Scala API as well); see the sketch after this list
* ran test stateful SS queries and checked the new additions to the SS UI (state store & watermark information)
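For anyone who wants to reproduce the checksum step, here is a minimal Python sketch of how I cross-check a downloaded artifact against its .sha512 file. The artifact path is whatever you downloaded, and the parsing deliberately tolerates the two layouts I ran into (the glitch described below): the single-line "<digest>  <file>" form and the "<file>: <digest in wrapped groups>" form. Those two layouts are my reading of what my tool reported, so treat the parsing as an assumption.

import hashlib
import sys


def sha512_of(path, chunk_size=1 << 20):
    # Stream the file through hashlib so a large tarball doesn't need to fit in memory.
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def expected_digest(sha_file):
    # Extract the hex digest from the .sha512 file, tolerating both layouts
    # mentioned above (an assumption based on what my tool reported).
    text = open(sha_file, encoding="utf-8").read().strip()
    if ":" in text:
        digest_part = text.split(":", 1)[1]   # "<file>: <digest groups>"
    else:
        digest_part = text.split()[0]         # "<digest>  <file>"
    return "".join(digest_part.split()).lower()


if __name__ == "__main__":
    artifact = sys.argv[1]  # path to the downloaded tarball; <artifact>.sha512 is expected next to it
    ok = sha512_of(artifact) == expected_digest(artifact + ".sha512")
    print("sha512 OK" if ok else "sha512 MISMATCH")
    sys.exit(0 if ok else 1)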
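The Rate-to-Kafka run was along these lines; just a sketch of my test setup, where the Kafka bootstrap servers, the topic name, and the s3a bucket on MinIO are placeholders, and the s3a endpoint/credentials for MinIO are configured separately on the Spark conf side.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("rc1-rate-to-kafka").getOrCreate()

# Rate source emits (timestamp, value) rows at a fixed rate.
rate = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# The Kafka sink expects a string/binary "value" column (and optionally "key").
out = rate.select(col("value").cast("string").alias("value"))

query = (out.writeStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "kafka:9092")                   # placeholder
    .option("topic", "rc1-smoke-test")                                 # placeholder
    .option("checkpointLocation", "s3a://spark-ckpt/rate-to-kafka")    # MinIO bucket, placeholder
    .start())

query.awaitTermination()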
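The table API check was roughly the following; again only a sketch, assuming a session whose catalog/warehouse is set up and a table backed by a source that supports streaming reads (table names and checkpoint paths are placeholders).

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rc1-table-api").getOrCreate()

# Write a stream into a table via the new DataStreamWriter.toTable (added in 3.1).
rate = spark.readStream.format("rate").option("rowsPerSecond", 5).load()
write_query = (rate.writeStream
    .option("checkpointLocation", "/tmp/ckpt/to-table")      # placeholder
    .toTable("rc_sink_table"))                               # placeholder table name

# Read the same table back as a stream via the new DataStreamReader.table (added in 3.1).
read_query = (spark.readStream.table("rc_sink_table")
    .writeStream.format("console")
    .option("checkpointLocation", "/tmp/ckpt/from-table")    # placeholder
    .start())

spark.streams.awaitAnyTermination()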
One glitch from verifying the SHAs: the sha512 file format differs between the source tar.gz and the other artifacts. My tool succeeded with the others and failed with the source tar.gz, though I confirmed the SHA itself matches. Not a blocker, but it would be ideal if we could make them consistent.

Thanks for driving the release process!

On Tue, Jan 19, 2021 at 2:25 PM Yuming Wang <wgy...@gmail.com> wrote:

> +1.
>
> On Tue, Jan 19, 2021 at 7:54 AM Hyukjin Kwon <gurwls...@gmail.com> wrote:
>
>> I forgot to say :). I'll start with my +1.
>>
>> On Mon, 18 Jan 2021, 21:06 Hyukjin Kwon, <gurwls...@gmail.com> wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark
>>> version 3.1.1.
>>>
>>> The vote is open until January 22nd 4PM PST and passes if a majority
>>> of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>
>>> [ ] +1 Release this package as Apache Spark 3.1.1
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>
>>> The tag to be voted on is v3.1.1-rc1 (commit
>>> 53fe365edb948d0e05a5ccb62f349cd9fcb4bb5d):
>>> https://github.com/apache/spark/tree/v3.1.1-rc1
>>>
>>> The release files, including signatures, digests, etc. can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.1.1-rc1-bin/
>>>
>>> Signatures used for Spark RCs can be found in this file:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1364
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.1.1-rc1-docs/
>>>
>>> The list of bug fixes going into 3.1.1 can be found at the following URL:
>>> https://s.apache.org/41kf2
>>>
>>> This release is using the release script of the tag v3.1.1-rc1.
>>>
>>> FAQ
>>>
>>> ===================
>>> What happened to 3.1.0?
>>> ===================
>>>
>>> There was a technical issue during Apache Spark 3.1.0 preparation, and
>>> it was discussed and decided to skip 3.1.0.
>>> Please see
>>> https://spark.apache.org/news/next-official-release-spark-3.1.1.html
>>> for more details.
>>>
>>> =========================
>>> How can I help test this release?
>>> =========================
>>>
>>> If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload and running it on this release candidate,
>>> then reporting any regressions.
>>>
>>> If you're working in PySpark, you can set up a virtual env and install
>>> the current RC via "pip install
>>> https://dist.apache.org/repos/dist/dev/spark/v3.1.1-rc1-bin/pyspark-3.1.1.tar.gz"
>>> and see if anything important breaks.
>>> In Java/Scala, you can add the staging repository to your project's
>>> resolvers and test with the RC (make sure to clean up the artifact
>>> cache before/after so you don't end up building with an out-of-date RC
>>> going forward).
>>>
>>> ===========================================
>>> What should happen to JIRA tickets still targeting 3.1.1?
>>> ===========================================
>>>
>>> The current list of open tickets targeted at 3.1.1 can be found at:
>>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>>> Version/s" = 3.1.1
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should
>>> be worked on immediately.
>>> Everything else, please retarget to an appropriate release.
>>>
>>> ==================
>>> But my bug isn't fixed?
>>> ==================
>>>
>>> In order to make timely releases, we will typically not hold the
>>> release unless the bug in question is a regression from the previous
>>> release. That being said, if there is something which is a regression
>>> that has not been correctly targeted, please ping me or a committer to
>>> help target the issue.