+1 (non-binding). Verified signature, checksum, license, build, and test.
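
For anyone repeating the same checks, this is roughly the sequence I ran. It
assumes the RC0 directory contains artifacts named apache-iceberg-0.12.1.tar.gz
(plus .asc and .sha512), as in the steps quoted below; adjust URLs and paths as
needed:

curl -LO https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-0.12.1-rc0/apache-iceberg-0.12.1.tar.gz
curl -LO https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-0.12.1-rc0/apache-iceberg-0.12.1.tar.gz.asc
curl -LO https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-0.12.1-rc0/apache-iceberg-0.12.1.tar.gz.sha512
curl -L https://dist.apache.org/repos/dist/dev/iceberg/KEYS | gpg --import   # release manager keys
gpg --verify apache-iceberg-0.12.1.tar.gz.asc apache-iceberg-0.12.1.tar.gz   # signature
shasum -a 512 -c apache-iceberg-0.12.1.tar.gz.sha512                         # checksum
tar xzf apache-iceberg-0.12.1.tar.gz && cd apache-iceberg-0.12.1
dev/check-license                                                            # RAT / license headers
./gradlew build                                                              # build and test (Java 8)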

On Fri, Nov 5, 2021 at 12:36 AM OpenInx <open...@gmail.com> wrote:

> +1  (binding)
>
> 1. Download the source tarball, signature (.asc), and checksum (.sha512):
>  OK
> 2. Import gpg keys: download KEYS and run gpg --import
> /path/to/downloaded/KEYS (optional if this hasn’t changed) :  OK
> 3. Verify the signature by running: gpg --verify
> apache-iceberg-0.12.1.tar.gz.asc:  OK
> 4. Verify the checksum by running: shasum -a 512 -c
> apache-iceberg-0.12.1.tar.gz.sha512 :  OK
> 5. Untar the archive and go into the source directory: tar xzf
> apache-iceberg-0.12.1.tar.gz && cd apache-iceberg-0.12.1:  OK
> 6. Run RAT checks to validate license headers: dev/check-license: OK
> 7. Build and test the project: ./gradlew build (use Java 8) :   OK
> 8. Check that Flink works with the following command line:
>
> ./bin/sql-client.sh embedded \
>   -j /Users/openinx/Downloads/apache-iceberg-0.12.1/flink-runtime/build/libs/iceberg-flink-runtime-0.12.1.jar \
>   shell
>
> CREATE CATALOG hadoop_prod WITH (
> 'type'='iceberg',
> 'catalog-type'='hadoop',
> 'warehouse'='file:///Users/openinx/test/iceberg-warehouse'
> );
>
> CREATE TABLE `hadoop_prod`.`default`.`flink_table` (
> id BIGINT,
> data STRING
> );
>
> INSERT INTO `hadoop_prod`.`default`.`flink_table` VALUES (1, 'AAA');
> SELECT * FROM `hadoop_prod`.`default`.`flink_table`;
> +----+------+
> | id | data |
> +----+------+
> |  1 |  AAA |
> +----+------+
> 1 row in set
>
> Thanks all for the work.
>
> On Fri, Nov 5, 2021 at 2:20 PM Cheng Pan <cheng...@apache.org> wrote:
>
>> +1 (non-binding)
>>
>> The integration test based on the master branch of Apache Kyuubi
>> (Incubating) passed.
>>
>> https://github.com/apache/incubator-kyuubi/pull/1338
>>
>> Thanks,
>> Cheng Pan
>>
>> On Fri, Nov 5, 2021 at 1:19 PM Kyle Bendickson <k...@tabular.io> wrote:
>> >
>> >
>> > +1 (binding)
>> >
>> > - Validated checksums, signatures, and licenses
>> > - Ran all of the unit tests
>> > - Imported files from ORC tables via a Spark stored procedure, with
>> floating-point columns, and inspected the metrics afterwards
>> > - Registered and used bucketed UDFs for various types such as integer
>> and byte
>> > - Created and dropped tables
>> > - Ran MERGE INTO queries using Spark SQL (a rough sketch follows this list)
>> > - Verified the ability to read tables with Parquet files with nested map
>> schemas from various versions (both before and after the Parquet 1.11.0 ->
>> 1.11.1 upgrade)
>> > - Tried to set a tblproperty to null (received error as expected)
>> > - Full unit test suite
>> > - Ran several Flink queries, both batch and streaming.
>> > - Tested against a custom catalog
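>> >
>> > A MERGE INTO of the shape I ran looks roughly like the following; the
>> > catalog, table, and column names are placeholders for illustration only:
>> >
>> > -- upsert rows from a staging table into the target table
>> > MERGE INTO prod.db.target t
>> > USING prod.db.updates s
>> > ON t.id = s.id
>> > WHEN MATCHED THEN UPDATE SET t.data = s.data
>> > WHEN NOT MATCHED THEN INSERT *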
>> >
>> > My Spark configuration was very similar to Ryan’s. I used Flink 1.12.1
>> on a docker-compose setup via the Flink SQL client with two task managers.
>> >
>> > In addition to testing with a custom catalog, I also tested with an HMS /
>> Hive catalog backed by HDFS, as well as a Hadoop catalog with data on
>> (local) HDFS.
>> >
>> > I’ve not gotten the Hive3 errors despite running unit tests several
>> times.
>> >
>> > - Kyle (@kbendick)
>> >
>> >
>> > On Thu, Nov 4, 2021 at 9:57 PM Daniel Weeks <dwe...@apache.org> wrote:
>> >>
>> >> +1 (binding)
>> >>
>> >> Verified sigs, sums, license, build and test.
>> >>
>> >> -Dan
>> >>
>> >> On Thu, Nov 4, 2021 at 4:30 PM Ryan Blue <b...@tabular.io> wrote:
>> >>>
>> >>> +1 (binding)
>> >>>
>> >>> Validated checksums, checked signature, ran tests (still a couple
>> failing in Hive3)
>> >>> Staged binaries from the release tarball
>> >>> Tested Spark metadata tables
>> >>> Used rewrite_manifests stored procedure in Spark
>> >>> Updated to v2 using SET TBLPROPERTIES
>> >>> Dropped and added partition fields
>> >>> Replaced a table with itself using INSERT OVERWRITE
>> >>> Tested custom catalogs
>> >>>
>> >>> Here’s my Spark config script in case anyone else wants to validate:
>> >>>
>> >>> /home/blue/Apps/spark-3.1.1-bin-hadoop3.2/bin/spark-shell \
>> >>>     --conf spark.jars.repositories=https://repository.apache.org/content/repositories/orgapacheiceberg-1019/ \
>> >>>     --packages org.apache.iceberg:iceberg-spark3-runtime:0.12.1 \
>> >>>     --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
>> >>>     --conf spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog \
>> >>>     --conf spark.sql.catalog.local.type=hadoop \
>> >>>     --conf spark.sql.catalog.local.warehouse=/home/blue/tmp/hadoop-warehouse \
>> >>>     --conf spark.sql.catalog.local.default-namespace=default \
>> >>>     --conf spark.sql.catalog.prodhive=org.apache.iceberg.spark.SparkCatalog \
>> >>>     --conf spark.sql.catalog.prodhive.type=hive \
>> >>>     --conf spark.sql.catalog.prodhive.warehouse=/home/blue/tmp/prod-warehouse \
>> >>>     --conf spark.sql.catalog.prodhive.default-namespace=default \
>> >>>     --conf spark.sql.defaultCatalog=local
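>> >>>
>> >>> And a rough sketch of the SQL behind the steps above, run via
>> >>> spark.sql(...) in that shell; the table name is a placeholder:
>> >>>
>> >>> CALL local.system.rewrite_manifests(table => 'default.sample')             -- rewrite_manifests stored procedure
>> >>> ALTER TABLE local.default.sample SET TBLPROPERTIES ('format-version'='2')  -- update to v2
>> >>> ALTER TABLE local.default.sample ADD PARTITION FIELD bucket(16, id)        -- add a partition field
>> >>> ALTER TABLE local.default.sample DROP PARTITION FIELD bucket(16, id)       -- drop it again
>> >>> INSERT OVERWRITE local.default.sample SELECT * FROM local.default.sample   -- replace the table with itself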
>> >>>
>> >>>
>> >>> On Thu, Nov 4, 2021 at 1:02 PM Jack Ye <yezhao...@gmail.com> wrote:
>> >>>>
>> >>>> +1, non-binding
>> >>>>
>> >>>> Ran checksum, build, unit tests, and AWS integration tests, and
>> verified fixes in EMR 6.4.0.
>> >>>>
>> >>>> Best,
>> >>>> Jack Ye
>> >>>>
>> >>>> On Tue, Nov 2, 2021 at 7:16 PM Kyle Bendickson <k...@tabular.io>
>> wrote:
>> >>>>>
>> >>>>> Hi everyone,
>> >>>>>
>> >>>>>
>> >>>>> I propose the following RC to be released as the official Apache
>> Iceberg 0.12.1 release.
>> >>>>>
>> >>>>>
>> >>>>> The commit id is d4052a73f14b63e1f519aaa722971dc74f8c9796
>> >>>>>
>> >>>>> * This corresponds to the tag: apache-iceberg-0.12.1-rc0
>> >>>>>
>> >>>>> *
>> https://github.com/apache/iceberg/commits/apache-iceberg-0.12.1-rc0
>> >>>>>
>> >>>>> *
>> https://github.com/apache/iceberg/tree/d4052a73f14b63e1f519aaa722971dc74f8c9796
>> >>>>>
>> >>>>>
>> >>>>> The release tarball, signature, and checksums are here:
>> >>>>>
>> >>>>> *
>> https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-0.12.1-rc0/
>> >>>>>
>> >>>>>
>> >>>>> You can find the KEYS file here:
>> >>>>>
>> >>>>> * https://dist.apache.org/repos/dist/dev/iceberg/KEYS
>> >>>>>
>> >>>>>
>> >>>>> Convenience binary artifacts are staged in Nexus. The Maven
>> repository URL is:
>> >>>>>
>> >>>>> *
>> https://repository.apache.org/content/repositories/orgapacheiceberg-1019/
>> >>>>>
>> >>>>>
>> >>>>> This release includes the following changes:
>> >>>>>
>> >>>>>
>> https://github.com/apache/iceberg/compare/apache-iceberg-0.12.0...apache-iceberg-0.12.1-rc0
>> >>>>>
>> >>>>>
>> >>>>> Please download, verify, and test.
>> >>>>>
>> >>>>>
>> >>>>> Please vote in the next 72 hours.
>> >>>>>
>> >>>>>
>> >>>>> [ ] +1 Release this as Apache Iceberg 0.12.1
>> >>>>>
>> >>>>> [ ] +0
>> >>>>>
>> >>>>> [ ] -1 Do not release this because...
>> >>>>>
>> >>>>> --
>> >>>>> Best,
>> >>>>> Kyle Bendickson
>> >>>>> Github: @kbendick
>> >>>
>> >>>
>> >>>
>> >>> --
>> >>> Ryan Blue
>> >>> Tabular
>>
>
