dependabot[bot] opened a new pull request, #4: URL: https://github.com/apache/beam-starter-java-provider/pull/4
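For a Java transform provider like this repository, the most relevant 2.64.0 highlight quoted below is the Managed API covering the Iceberg, Kafka, and BigQuery connectors. The sketch that follows is only an orientation aid, not part of this PR: the table name, catalog name, and configuration keys are illustrative placeholders, the Managed and Iceberg IO modules are assumed to be on the classpath, and the single-output accessor may differ across SDK versions, so check the Managed javadoc linked in the notes before relying on it.

```java
import java.util.Map;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.managed.Managed;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.PCollectionRowTuple;
import org.apache.beam.sdk.values.Row;

public class ManagedIcebergReadSketch {
  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create();

    // Illustrative configuration only: the table, catalog name, and catalog
    // properties below are placeholders; the exact keys your catalog needs
    // are documented with Managed/IcebergIO, not in this PR.
    Map<String, Object> config =
        Map.of(
            "table", "db.example_table",
            "catalog_name", "example_catalog",
            "catalog_properties",
                Map.of("type", "hadoop", "warehouse", "gs://example-bucket/warehouse"));

    // Managed.read(...) yields a SchemaTransform, so it is applied to a
    // PCollectionRowTuple rather than directly to the Pipeline object.
    PCollection<Row> rows =
        PCollectionRowTuple.empty(pipeline)
            .apply(Managed.read(Managed.ICEBERG).withConfig(config))
            // Assumes the single-output convenience accessor; if your SDK
            // version lacks it, fetch the tagged output instead.
            .getSinglePCollection();

    // ... apply downstream transforms to `rows` here ...

    pipeline.run().waitUntilFinish();
  }
}
```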
Bumps `beam.version` from 2.63.0 to 2.64.0. Updates `org.apache.beam:beam-sdks-java-google-cloud-platform-bom` from 2.63.0 to 2.64.0 <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/apache/beam/releases">org.apache.beam:beam-sdks-java-google-cloud-platform-bom's releases</a>.</em></p> <blockquote> <h2>Beam 2.64.0 release</h2> <!-- raw HTML omitted --> <p>We are happy to present the new 2.64.0 release of Beam. This release includes both improvements and new functionality. See the <a href="https://beam.apache.org/get-started/downloads/">download page</a> for this release.</p> <!-- raw HTML omitted --> <p>For more information on changes in 2.64.0, check out the <a href="https://github.com/apache/beam/milestone/28">detailed release notes</a>.</p> <h2>Highlights</h2> <ul> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> </ul> <h2>I/Os</h2> <ul> <li>[Java] Use API compatible with both com.google.cloud.bigdataoss:util 2.x and 3.x in BatchLoads (<a href="https://redirect.github.com/apache/beam/pull/34105">#34105</a>)</li> <li>[IcebergIO] Added new CDC source for batch and streaming, available as <code>Managed.ICEBERG_CDC</code> (<a href="https://redirect.github.com/apache/beam/pull/33504">#33504</a>)</li> <li>[IcebergIO] Address edge case where bundle retry following a successful data commit results in data duplication (<a href="https://redirect.github.com/apache/beam/pull/34264">#34264</a>)</li> </ul> <h2>New Features / Improvements</h2> <ul> <li>[Python] Support custom coders in Reshuffle (<a href="https://redirect.github.com/apache/beam/issues/29908">#29908</a>, <a href="https://redirect.github.com/apache/beam/issues/33356">#33356</a>).</li> <li>[Java] Upgrade SLF4J to 2.0.16. Update default Spark version to 3.5.0. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Java] Support for <code>--add-modules</code> JVM option is added through a new pipeline option <code>JdkAddRootModules</code>. This allows extending the module graph with optional modules such as SDK incubator modules. Sample usage: <code><pipeline invocation> --jdkAddRootModules=jdk.incubator.vector</code> (<a href="https://redirect.github.com/apache/beam/issues/30281">#30281</a>).</li> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> <li>Prism now supports event time triggers for most common cases. (<a href="https://redirect.github.com/apache/beam/issues/31438">#31438</a>) <ul> <li>Prism does not yet support triggered side inputs, or triggers on merging windows (such as session windows).</li> </ul> </li> </ul> <h2>Breaking Changes</h2> <ul> <li>[Python] Reshuffle now correctly respects user-specified type hints, fixing a previous bug where it might use FastPrimitivesCoder wrongly. This change could break pipelines with incorrect type hints in Reshuffle. 
If you have issues after upgrading, temporarily set update_compatibility_version to a previous Beam version to use the old behavior. The recommended solution is to fix the type hints in your code. (<a href="https://redirect.github.com/apache/beam/pull/33932">#33932</a>)</li> <li>[Java] SparkReceiver 2 has been moved to SparkReceiver 3 that supports Spark 3.x. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Python] Correct parsing of <code>collections.abc.Sequence</code> type hints was added, which can lead to pipelines failing type hint checks that were previously passing erroneously. These issues will be most commonly seen trying to consume a PCollection with a <code>Sequence</code> type hint after a GroupByKey or a CoGroupByKey. (<a href="https://redirect.github.com/apache/beam/pull/33999">#33999</a>.</li> </ul> <h2>Bugfixes</h2> <ul> <li>(Python) Fixed occasional pipeline stuckness that was affecting Python 3.11 users (<a href="https://redirect.github.com/apache/beam/issues/33966">#33966</a>).</li> <li>(Java) Fixed TIME field encodings for BigQuery Storage API writes on GenericRecords (<a href="https://redirect.github.com/apache/beam/pull/34059">#34059</a>).</li> <li>(Java) Fixed a race condition in JdbcIO which could cause hangs trying to acquire a connection (<a href="https://redirect.github.com/apache/beam/pull/34058">#34058</a>).</li> </ul> <!-- raw HTML omitted --> </blockquote> <p>... (truncated)</p> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/apache/beam/blob/master/CHANGES.md">org.apache.beam:beam-sdks-java-google-cloud-platform-bom's changelog</a>.</em></p> <blockquote> <h1>[2.64.0] - 2025-03-31</h1> <h2>Highlights</h2> <ul> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> <li>New highly anticipated feature X added to Python SDK (<a href="https://github.com/apache/beam/issues/X">#X</a>).</li> <li>New highly anticipated feature Y added to Java SDK (<a href="https://github.com/apache/beam/issues/Y">#Y</a>).</li> </ul> <h2>I/Os</h2> <ul> <li>[Java] Use API compatible with both com.google.cloud.bigdataoss:util 2.x and 3.x in BatchLoads (<a href="https://redirect.github.com/apache/beam/pull/34105">#34105</a>)</li> <li>[IcebergIO] Added new CDC source for batch and streaming, available as <code>Managed.ICEBERG_CDC</code> (<a href="https://redirect.github.com/apache/beam/pull/33504">#33504</a>)</li> <li>[IcebergIO] Address edge case where bundle retry following a successful data commit results in data duplication (<a href="https://redirect.github.com/apache/beam/pull/34264">#34264</a>)</li> </ul> <h2>New Features / Improvements</h2> <ul> <li>[Python] Support custom coders in Reshuffle (<a href="https://redirect.github.com/apache/beam/issues/29908">#29908</a>, <a href="https://redirect.github.com/apache/beam/issues/33356">#33356</a>).</li> <li>[Java] Upgrade SLF4J to 2.0.16. Update default Spark version to 3.5.0. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Java] Support for <code>--add-modules</code> JVM option is added through a new pipeline option <code>JdkAddRootModules</code>. 
This allows extending the module graph with optional modules such as SDK incubator modules. Sample usage: <code><pipeline invocation> --jdkAddRootModules=jdk.incubator.vector</code> (<a href="https://redirect.github.com/apache/beam/issues/30281">#30281</a>).</li> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> <li>[YAML] Beam YAML UDFs (such as those used in MapToFields) can now have declared dependencies (e.g. pypi packages for Python, or extra jars for Java).</li> <li>Prism now supports event time triggers for most common cases. (<a href="https://redirect.github.com/apache/beam/issues/31438">#31438</a>) <ul> <li>Prism does not yet support triggered side inputs, or triggers on merging windows (such as session windows).</li> </ul> </li> </ul> <h2>Breaking Changes</h2> <ul> <li>[Python] Reshuffle now correctly respects user-specified type hints, fixing a previous bug where it might use FastPrimitivesCoder wrongly. This change could break pipelines with incorrect type hints in Reshuffle. If you have issues after upgrading, temporarily set update_compatibility_version to a previous Beam version to use the old behavior. The recommended solution is to fix the type hints in your code. (<a href="https://redirect.github.com/apache/beam/pull/33932">#33932</a>)</li> <li>[Java] SparkReceiver 2 has been moved to SparkReceiver 3 that supports Spark 3.x. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Python] Correct parsing of <code>collections.abc.Sequence</code> type hints was added, which can lead to pipelines failing type hint checks that were previously passing erroneously. These issues will be most commonly seen trying to consume a PCollection with a <code>Sequence</code> type hint after a GroupByKey or a CoGroupByKey. (<a href="https://redirect.github.com/apache/beam/pull/33999">#33999</a>).</li> </ul> <h2>Bugfixes</h2> <ul> <li>(Python) Fixed occasional pipeline stuckness that was affecting Python 3.11 users (<a href="https://redirect.github.com/apache/beam/issues/33966">#33966</a>).</li> <li>(Java) Fixed TIME field encodings for BigQuery Storage API writes on GenericRecords (<a href="https://redirect.github.com/apache/beam/pull/34059">#34059</a>).</li> <li>(Java) Fixed a race condition in JdbcIO which could cause hangs trying to acquire a connection (<a href="https://redirect.github.com/apache/beam/pull/34058">#34058</a>).</li> <li>(Java) Fix BigQuery Storage Write compatibility with Avro 1.8 (<a href="https://redirect.github.com/apache/beam/pull/34281">#34281</a>).</li> <li>Fixed checkpoint recovery and streaming behavior in Spark Classic and Portable runner's Flatten transform by replacing queueStream with SingleEmitInputDStream (<a href="https://redirect.github.com/apache/beam/pull/34080">#34080</a>, <a href="https://redirect.github.com/apache/beam/issues/18144">#18144</a>, <a href="https://redirect.github.com/apache/beam/issues/20426">#20426</a>)</li> <li>(Java) Fixed Read caching of UnboundedReader objects to effectively cache across multiple DoFns and avoid checkpointing unstarted reader. 
<a href="https://redirect.github.com/apache/beam/pull/34146">#34146</a> <a href="https://redirect.github.com/apache/beam/pull/33901">#33901</a></li> </ul> <h2>Known Issues</h2> <ul> <li>(Java) Current version of protobuf has a <a href="https://redirect.github.com/protocolbuffers/protobuf/issues/20599">bug</a> leading to incompatibilities with clients using older versions of Protobuf (<a href="https://redirect.github.com/GoogleCloudPlatform/DataflowTemplates/issues/2191">example issue</a>). This issue has been seen in SpannerIO in particular. Tracked in <a href="https://redirect.github.com/GoogleCloudPlatform/DataflowTemplates/issues/34452">#34452</a>.</li> <li>(Java) When constructing <code>SpannerConfig</code> for <code>SpannerIO</code>, calling <code>withHost</code> with a null or empty host will now result in a Null Pointer Exception (<code>java.lang.NullPointerException: Cannot invoke "java.lang.CharSequence.length()" because "this.text" is null</code>). See <a href="https://redirect.github.com/GoogleCloudPlatform/DataflowTemplates/issues/34489">GoogleCloudPlatform/DataflowTemplates#34489</a> for context.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/apache/beam/commit/e979de95577a23bd79eac9764c13564e2a6896a9"><code>e979de9</code></a> Set version for 2.64.0 RC2</li> <li><a href="https://github.com/apache/beam/commit/56e11aa1dabf5031e987d34fe804fd53bf11c429"><code>56e11aa</code></a> Update build.gradle (<a href="https://redirect.github.com/apache/beam/issues/34401">#34401</a>) (<a href="https://redirect.github.com/apache/beam/issues/34406">#34406</a>)</li> <li><a href="https://github.com/apache/beam/commit/014e77d43e483de6eb06d31701890d6708590d2e"><code>014e77d</code></a> Perform correct release version validation (<a href="https://redirect.github.com/apache/beam/issues/34378">#34378</a>) (<a href="https://redirect.github.com/apache/beam/issues/34400">#34400</a>)</li> <li><a href="https://github.com/apache/beam/commit/efe443d33b4c81408197cd4581182ee3a82efbb7"><code>efe443d</code></a> fix the check on the 2.64 release branch (<a href="https://redirect.github.com/apache/beam/issues/34370">#34370</a>)</li> <li><a href="https://github.com/apache/beam/commit/10f4fc4ed58d904f25b628c130fc89f56f5a4153"><code>10f4fc4</code></a> Set Dataflow container to release version.</li> <li><a href="https://github.com/apache/beam/commit/44500319cacf453d79cbfe2b22a02c17730ef051"><code>4450031</code></a> Add sdf kafka poll latencies (<a href="https://redirect.github.com/apache/beam/issues/34275">#34275</a>)</li> <li><a href="https://github.com/apache/beam/commit/4500499706608ad95458596834c915d96c0fae43"><code>4500499</code></a> update the changes.md by adding 2.65.0 (<a href="https://redirect.github.com/apache/beam/issues/34369">#34369</a>)</li> <li><a href="https://github.com/apache/beam/commit/6842f3ac1b1de53ba153f9058c13f2fcc9c41202"><code>6842f3a</code></a> Add Encryption When Writing to Iceberg Tables in RecordWriter.java (<a href="https://redirect.github.com/apache/beam/issues/34021">#34021</a>)</li> <li><a href="https://github.com/apache/beam/commit/332fe03b72800f37d24951ab78533a74e453ba64"><code>332fe03</code></a> [Managed Iceberg] unbounded source (<a href="https://redirect.github.com/apache/beam/issues/33504">#33504</a>)</li> <li><a href="https://github.com/apache/beam/commit/9e1cf5aaee4abc9c36c7425bcdc39cf6a9fdc75c"><code>9e1cf5a</code></a> Kafta Table consumer properties feature for beamSQL (<a 
href="https://redirect.github.com/apache/beam/issues/34366">#34366</a>)</li> <li>Additional commits viewable in <a href="https://github.com/apache/beam/compare/v2.63.0...v2.64.0">compare view</a></li> </ul> </details> <br /> Updates `org.apache.beam:beam-sdks-java-core` from 2.63.0 to 2.64.0 <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/apache/beam/releases">org.apache.beam:beam-sdks-java-core's releases</a>.</em></p> <blockquote> <h2>Beam 2.64.0 release</h2> <!-- raw HTML omitted --> <p>We are happy to present the new 2.64.0 release of Beam. This release includes both improvements and new functionality. See the <a href="https://beam.apache.org/get-started/downloads/">download page</a> for this release.</p> <!-- raw HTML omitted --> <p>For more information on changes in 2.64.0, check out the <a href="https://github.com/apache/beam/milestone/28">detailed release notes</a>.</p> <h2>Highlights</h2> <ul> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> </ul> <h2>I/Os</h2> <ul> <li>[Java] Use API compatible with both com.google.cloud.bigdataoss:util 2.x and 3.x in BatchLoads (<a href="https://redirect.github.com/apache/beam/pull/34105">#34105</a>)</li> <li>[IcebergIO] Added new CDC source for batch and streaming, available as <code>Managed.ICEBERG_CDC</code> (<a href="https://redirect.github.com/apache/beam/pull/33504">#33504</a>)</li> <li>[IcebergIO] Address edge case where bundle retry following a successful data commit results in data duplication (<a href="https://redirect.github.com/apache/beam/pull/34264">#34264</a>)</li> </ul> <h2>New Features / Improvements</h2> <ul> <li>[Python] Support custom coders in Reshuffle (<a href="https://redirect.github.com/apache/beam/issues/29908">#29908</a>, <a href="https://redirect.github.com/apache/beam/issues/33356">#33356</a>).</li> <li>[Java] Upgrade SLF4J to 2.0.16. Update default Spark version to 3.5.0. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Java] Support for <code>--add-modules</code> JVM option is added through a new pipeline option <code>JdkAddRootModules</code>. This allows extending the module graph with optional modules such as SDK incubator modules. Sample usage: <code><pipeline invocation> --jdkAddRootModules=jdk.incubator.vector</code> (<a href="https://redirect.github.com/apache/beam/issues/30281">#30281</a>).</li> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> <li>Prism now supports event time triggers for most common cases. 
(<a href="https://redirect.github.com/apache/beam/issues/31438">#31438</a>) <ul> <li>Prism does not yet support triggered side inputs, or triggers on merging windows (such as session windows).</li> </ul> </li> </ul> <h2>Breaking Changes</h2> <ul> <li>[Python] Reshuffle now correctly respects user-specified type hints, fixing a previous bug where it might use FastPrimitivesCoder wrongly. This change could break pipelines with incorrect type hints in Reshuffle. If you have issues after upgrading, temporarily set update_compatibility_version to a previous Beam version to use the old behavior. The recommended solution is to fix the type hints in your code. (<a href="https://redirect.github.com/apache/beam/pull/33932">#33932</a>)</li> <li>[Java] SparkReceiver 2 has been moved to SparkReceiver 3 that supports Spark 3.x. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Python] Correct parsing of <code>collections.abc.Sequence</code> type hints was added, which can lead to pipelines failing type hint checks that were previously passing erroneously. These issues will be most commonly seen trying to consume a PCollection with a <code>Sequence</code> type hint after a GroupByKey or a CoGroupByKey. (<a href="https://redirect.github.com/apache/beam/pull/33999">#33999</a>.</li> </ul> <h2>Bugfixes</h2> <ul> <li>(Python) Fixed occasional pipeline stuckness that was affecting Python 3.11 users (<a href="https://redirect.github.com/apache/beam/issues/33966">#33966</a>).</li> <li>(Java) Fixed TIME field encodings for BigQuery Storage API writes on GenericRecords (<a href="https://redirect.github.com/apache/beam/pull/34059">#34059</a>).</li> <li>(Java) Fixed a race condition in JdbcIO which could cause hangs trying to acquire a connection (<a href="https://redirect.github.com/apache/beam/pull/34058">#34058</a>).</li> </ul> <!-- raw HTML omitted --> </blockquote> <p>... 
(truncated)</p> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/apache/beam/blob/master/CHANGES.md">org.apache.beam:beam-sdks-java-core's changelog</a>.</em></p> <blockquote> <h1>[2.64.0] - 2025-03-31</h1> <h2>Highlights</h2> <ul> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> <li>New highly anticipated feature X added to Python SDK (<a href="https://github.com/apache/beam/issues/X">#X</a>).</li> <li>New highly anticipated feature Y added to Java SDK (<a href="https://github.com/apache/beam/issues/Y">#Y</a>).</li> </ul> <h2>I/Os</h2> <ul> <li>[Java] Use API compatible with both com.google.cloud.bigdataoss:util 2.x and 3.x in BatchLoads (<a href="https://redirect.github.com/apache/beam/pull/34105">#34105</a>)</li> <li>[IcebergIO] Added new CDC source for batch and streaming, available as <code>Managed.ICEBERG_CDC</code> (<a href="https://redirect.github.com/apache/beam/pull/33504">#33504</a>)</li> <li>[IcebergIO] Address edge case where bundle retry following a successful data commit results in data duplication (<a href="https://redirect.github.com/apache/beam/pull/34264">#34264</a>)</li> </ul> <h2>New Features / Improvements</h2> <ul> <li>[Python] Support custom coders in Reshuffle (<a href="https://redirect.github.com/apache/beam/issues/29908">#29908</a>, <a href="https://redirect.github.com/apache/beam/issues/33356">#33356</a>).</li> <li>[Java] Upgrade SLF4J to 2.0.16. Update default Spark version to 3.5.0. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Java] Support for <code>--add-modules</code> JVM option is added through a new pipeline option <code>JdkAddRootModules</code>. This allows extending the module graph with optional modules such as SDK incubator modules. Sample usage: <code><pipeline invocation> --jdkAddRootModules=jdk.incubator.vector</code> (<a href="https://redirect.github.com/apache/beam/issues/30281">#30281</a>).</li> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> <li>[YAML] Beam YAML UDFs (such as those used in MapToFields) can now have declared dependencies (e.g. pypi packages for Python, or extra jars for Java).</li> <li>Prism now supports event time triggers for most common cases. (<a href="https://redirect.github.com/apache/beam/issues/31438">#31438</a>) <ul> <li>Prism does not yet support triggered side inputs, or triggers on merging windows (such as session windows).</li> </ul> </li> </ul> <h2>Breaking Changes</h2> <ul> <li>[Python] Reshuffle now correctly respects user-specified type hints, fixing a previous bug where it might use FastPrimitivesCoder wrongly. This change could break pipelines with incorrect type hints in Reshuffle. If you have issues after upgrading, temporarily set update_compatibility_version to a previous Beam version to use the old behavior. 
The recommended solution is to fix the type hints in your code. (<a href="https://redirect.github.com/apache/beam/pull/33932">#33932</a>)</li> <li>[Java] SparkReceiver 2 has been moved to SparkReceiver 3 that supports Spark 3.x. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Python] Correct parsing of <code>collections.abc.Sequence</code> type hints was added, which can lead to pipelines failing type hint checks that were previously passing erroneously. These issues will be most commonly seen trying to consume a PCollection with a <code>Sequence</code> type hint after a GroupByKey or a CoGroupByKey. (<a href="https://redirect.github.com/apache/beam/pull/33999">#33999</a>).</li> </ul> <h2>Bugfixes</h2> <ul> <li>(Python) Fixed occasional pipeline stuckness that was affecting Python 3.11 users (<a href="https://redirect.github.com/apache/beam/issues/33966">#33966</a>).</li> <li>(Java) Fixed TIME field encodings for BigQuery Storage API writes on GenericRecords (<a href="https://redirect.github.com/apache/beam/pull/34059">#34059</a>).</li> <li>(Java) Fixed a race condition in JdbcIO which could cause hangs trying to acquire a connection (<a href="https://redirect.github.com/apache/beam/pull/34058">#34058</a>).</li> <li>(Java) Fix BigQuery Storage Write compatibility with Avro 1.8 (<a href="https://redirect.github.com/apache/beam/pull/34281">#34281</a>).</li> <li>Fixed checkpoint recovery and streaming behavior in Spark Classic and Portable runner's Flatten transform by replacing queueStream with SingleEmitInputDStream (<a href="https://redirect.github.com/apache/beam/pull/34080">#34080</a>, <a href="https://redirect.github.com/apache/beam/issues/18144">#18144</a>, <a href="https://redirect.github.com/apache/beam/issues/20426">#20426</a>)</li> <li>(Java) Fixed Read caching of UnboundedReader objects to effectively cache across multiple DoFns and avoid checkpointing unstarted reader. <a href="https://redirect.github.com/apache/beam/pull/34146">#34146</a> <a href="https://redirect.github.com/apache/beam/pull/33901">#33901</a></li> </ul> <h2>Known Issues</h2> <ul> <li>(Java) Current version of protobuf has a <a href="https://redirect.github.com/protocolbuffers/protobuf/issues/20599">bug</a> leading to incompatibilities with clients using older versions of Protobuf (<a href="https://redirect.github.com/GoogleCloudPlatform/DataflowTemplates/issues/2191">example issue</a>). This issue has been seen in SpannerIO in particular. Tracked in <a href="https://redirect.github.com/GoogleCloudPlatform/DataflowTemplates/issues/34452">#34452</a>.</li> <li>(Java) When constructing <code>SpannerConfig</code> for <code>SpannerIO</code>, calling <code>withHost</code> with a null or empty host will now result in a Null Pointer Exception (<code>java.lang.NullPointerException: Cannot invoke "java.lang.CharSequence.length()" because "this.text" is null</code>). 
See <a href="https://redirect.github.com/GoogleCloudPlatform/DataflowTemplates/issues/34489">GoogleCloudPlatform/DataflowTemplates#34489</a> for context.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/apache/beam/commit/e979de95577a23bd79eac9764c13564e2a6896a9"><code>e979de9</code></a> Set version for 2.64.0 RC2</li> <li><a href="https://github.com/apache/beam/commit/56e11aa1dabf5031e987d34fe804fd53bf11c429"><code>56e11aa</code></a> Update build.gradle (<a href="https://redirect.github.com/apache/beam/issues/34401">#34401</a>) (<a href="https://redirect.github.com/apache/beam/issues/34406">#34406</a>)</li> <li><a href="https://github.com/apache/beam/commit/014e77d43e483de6eb06d31701890d6708590d2e"><code>014e77d</code></a> Perform correct release version validation (<a href="https://redirect.github.com/apache/beam/issues/34378">#34378</a>) (<a href="https://redirect.github.com/apache/beam/issues/34400">#34400</a>)</li> <li><a href="https://github.com/apache/beam/commit/efe443d33b4c81408197cd4581182ee3a82efbb7"><code>efe443d</code></a> fix the check on the 2.64 release branch (<a href="https://redirect.github.com/apache/beam/issues/34370">#34370</a>)</li> <li><a href="https://github.com/apache/beam/commit/10f4fc4ed58d904f25b628c130fc89f56f5a4153"><code>10f4fc4</code></a> Set Dataflow container to release version.</li> <li><a href="https://github.com/apache/beam/commit/44500319cacf453d79cbfe2b22a02c17730ef051"><code>4450031</code></a> Add sdf kafka poll latencies (<a href="https://redirect.github.com/apache/beam/issues/34275">#34275</a>)</li> <li><a href="https://github.com/apache/beam/commit/4500499706608ad95458596834c915d96c0fae43"><code>4500499</code></a> update the changes.md by adding 2.65.0 (<a href="https://redirect.github.com/apache/beam/issues/34369">#34369</a>)</li> <li><a href="https://github.com/apache/beam/commit/6842f3ac1b1de53ba153f9058c13f2fcc9c41202"><code>6842f3a</code></a> Add Encryption When Writing to Iceberg Tables in RecordWriter.java (<a href="https://redirect.github.com/apache/beam/issues/34021">#34021</a>)</li> <li><a href="https://github.com/apache/beam/commit/332fe03b72800f37d24951ab78533a74e453ba64"><code>332fe03</code></a> [Managed Iceberg] unbounded source (<a href="https://redirect.github.com/apache/beam/issues/33504">#33504</a>)</li> <li><a href="https://github.com/apache/beam/commit/9e1cf5aaee4abc9c36c7425bcdc39cf6a9fdc75c"><code>9e1cf5a</code></a> Kafta Table consumer properties feature for beamSQL (<a href="https://redirect.github.com/apache/beam/issues/34366">#34366</a>)</li> <li>Additional commits viewable in <a href="https://github.com/apache/beam/compare/v2.63.0...v2.64.0">compare view</a></li> </ul> </details> <br /> Updates `org.apache.beam:beam-sdks-java-expansion-service` from 2.63.0 to 2.64.0 <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/apache/beam/releases">org.apache.beam:beam-sdks-java-expansion-service's releases</a>.</em></p> <blockquote> <h2>Beam 2.64.0 release</h2> <!-- raw HTML omitted --> <p>We are happy to present the new 2.64.0 release of Beam. This release includes both improvements and new functionality. 
See the <a href="https://beam.apache.org/get-started/downloads/">download page</a> for this release.</p> <!-- raw HTML omitted --> <p>For more information on changes in 2.64.0, check out the <a href="https://github.com/apache/beam/milestone/28">detailed release notes</a>.</p> <h2>Highlights</h2> <ul> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> </ul> <h2>I/Os</h2> <ul> <li>[Java] Use API compatible with both com.google.cloud.bigdataoss:util 2.x and 3.x in BatchLoads (<a href="https://redirect.github.com/apache/beam/pull/34105">#34105</a>)</li> <li>[IcebergIO] Added new CDC source for batch and streaming, available as <code>Managed.ICEBERG_CDC</code> (<a href="https://redirect.github.com/apache/beam/pull/33504">#33504</a>)</li> <li>[IcebergIO] Address edge case where bundle retry following a successful data commit results in data duplication (<a href="https://redirect.github.com/apache/beam/pull/34264">#34264</a>)</li> </ul> <h2>New Features / Improvements</h2> <ul> <li>[Python] Support custom coders in Reshuffle (<a href="https://redirect.github.com/apache/beam/issues/29908">#29908</a>, <a href="https://redirect.github.com/apache/beam/issues/33356">#33356</a>).</li> <li>[Java] Upgrade SLF4J to 2.0.16. Update default Spark version to 3.5.0. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Java] Support for <code>--add-modules</code> JVM option is added through a new pipeline option <code>JdkAddRootModules</code>. This allows extending the module graph with optional modules such as SDK incubator modules. Sample usage: <code><pipeline invocation> --jdkAddRootModules=jdk.incubator.vector</code> (<a href="https://redirect.github.com/apache/beam/issues/30281">#30281</a>).</li> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> <li>Prism now supports event time triggers for most common cases. (<a href="https://redirect.github.com/apache/beam/issues/31438">#31438</a>) <ul> <li>Prism does not yet support triggered side inputs, or triggers on merging windows (such as session windows).</li> </ul> </li> </ul> <h2>Breaking Changes</h2> <ul> <li>[Python] Reshuffle now correctly respects user-specified type hints, fixing a previous bug where it might use FastPrimitivesCoder wrongly. This change could break pipelines with incorrect type hints in Reshuffle. If you have issues after upgrading, temporarily set update_compatibility_version to a previous Beam version to use the old behavior. The recommended solution is to fix the type hints in your code. (<a href="https://redirect.github.com/apache/beam/pull/33932">#33932</a>)</li> <li>[Java] SparkReceiver 2 has been moved to SparkReceiver 3 that supports Spark 3.x. 
(<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Python] Correct parsing of <code>collections.abc.Sequence</code> type hints was added, which can lead to pipelines failing type hint checks that were previously passing erroneously. These issues will be most commonly seen trying to consume a PCollection with a <code>Sequence</code> type hint after a GroupByKey or a CoGroupByKey. (<a href="https://redirect.github.com/apache/beam/pull/33999">#33999</a>.</li> </ul> <h2>Bugfixes</h2> <ul> <li>(Python) Fixed occasional pipeline stuckness that was affecting Python 3.11 users (<a href="https://redirect.github.com/apache/beam/issues/33966">#33966</a>).</li> <li>(Java) Fixed TIME field encodings for BigQuery Storage API writes on GenericRecords (<a href="https://redirect.github.com/apache/beam/pull/34059">#34059</a>).</li> <li>(Java) Fixed a race condition in JdbcIO which could cause hangs trying to acquire a connection (<a href="https://redirect.github.com/apache/beam/pull/34058">#34058</a>).</li> </ul> <!-- raw HTML omitted --> </blockquote> <p>... (truncated)</p> </details> <details> <summary>Changelog</summary> <p><em>Sourced from <a href="https://github.com/apache/beam/blob/master/CHANGES.md">org.apache.beam:beam-sdks-java-expansion-service's changelog</a>.</em></p> <blockquote> <h1>[2.64.0] - 2025-03-31</h1> <h2>Highlights</h2> <ul> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> <li>New highly anticipated feature X added to Python SDK (<a href="https://github.com/apache/beam/issues/X">#X</a>).</li> <li>New highly anticipated feature Y added to Java SDK (<a href="https://github.com/apache/beam/issues/Y">#Y</a>).</li> </ul> <h2>I/Os</h2> <ul> <li>[Java] Use API compatible with both com.google.cloud.bigdataoss:util 2.x and 3.x in BatchLoads (<a href="https://redirect.github.com/apache/beam/pull/34105">#34105</a>)</li> <li>[IcebergIO] Added new CDC source for batch and streaming, available as <code>Managed.ICEBERG_CDC</code> (<a href="https://redirect.github.com/apache/beam/pull/33504">#33504</a>)</li> <li>[IcebergIO] Address edge case where bundle retry following a successful data commit results in data duplication (<a href="https://redirect.github.com/apache/beam/pull/34264">#34264</a>)</li> </ul> <h2>New Features / Improvements</h2> <ul> <li>[Python] Support custom coders in Reshuffle (<a href="https://redirect.github.com/apache/beam/issues/29908">#29908</a>, <a href="https://redirect.github.com/apache/beam/issues/33356">#33356</a>).</li> <li>[Java] Upgrade SLF4J to 2.0.16. Update default Spark version to 3.5.0. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Java] Support for <code>--add-modules</code> JVM option is added through a new pipeline option <code>JdkAddRootModules</code>. This allows extending the module graph with optional modules such as SDK incubator modules. 
Sample usage: <code><pipeline invocation> --jdkAddRootModules=jdk.incubator.vector</code> (<a href="https://redirect.github.com/apache/beam/issues/30281">#30281</a>).</li> <li>Managed API for <a href="https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/managed/Managed.html">Java</a> and <a href="https://beam.apache.org/releases/pydoc/current/apache_beam.transforms.managed.html#module-apache_beam.transforms.managed">Python</a> supports <a href="https://beam.apache.org/documentation/io/connectors/">key I/O connectors</a> Iceberg, Kafka, and BigQuery.</li> <li>[YAML] Beam YAML UDFs (such as those used in MapToFields) can now have declared dependencies (e.g. pypi packages for Python, or extra jars for Java).</li> <li>Prism now supports event time triggers for most common cases. (<a href="https://redirect.github.com/apache/beam/issues/31438">#31438</a>) <ul> <li>Prism does not yet support triggered side inputs, or triggers on merging windows (such as session windows).</li> </ul> </li> </ul> <h2>Breaking Changes</h2> <ul> <li>[Python] Reshuffle now correctly respects user-specified type hints, fixing a previous bug where it might use FastPrimitivesCoder wrongly. This change could break pipelines with incorrect type hints in Reshuffle. If you have issues after upgrading, temporarily set update_compatibility_version to a previous Beam version to use the old behavior. The recommended solution is to fix the type hints in your code. (<a href="https://redirect.github.com/apache/beam/pull/33932">#33932</a>)</li> <li>[Java] SparkReceiver 2 has been moved to SparkReceiver 3 that supports Spark 3.x. (<a href="https://redirect.github.com/apache/beam/pull/33574">#33574</a>)</li> <li>[Python] Correct parsing of <code>collections.abc.Sequence</code> type hints was added, which can lead to pipelines failing type hint checks that were previously passing erroneously. These issues will be most commonly seen trying to consume a PCollection with a <code>Sequence</code> type hint after a GroupByKey or a CoGroupByKey. (<a href="https://redirect.github.com/apache/beam/pull/33999">#33999</a>).</li> </ul> <h2>Bugfixes</h2> <ul> <li>(Python) Fixed occasional pipeline stuckness that was affecting Python 3.11 users (<a href="https://redirect.github.com/apache/beam/issues/33966">#33966</a>).</li> <li>(Java) Fixed TIME field encodings for BigQuery Storage API writes on GenericRecords (<a href="https://redirect.github.com/apache/beam/pull/34059">#34059</a>).</li> <li>(Java) Fixed a race condition in JdbcIO which could cause hangs trying to acquire a connection (<a href="https://redirect.github.com/apache/beam/pull/34058">#34058</a>).</li> <li>(Java) Fix BigQuery Storage Write compatibility with Avro 1.8 (<a href="https://redirect.github.com/apache/beam/pull/34281">#34281</a>).</li> <li>Fixed checkpoint recovery and streaming behavior in Spark Classic and Portable runner's Flatten transform by replacing queueStream with SingleEmitInputDStream (<a href="https://redirect.github.com/apache/beam/pull/34080">#34080</a>, <a href="https://redirect.github.com/apache/beam/issues/18144">#18144</a>, <a href="https://redirect.github.com/apache/beam/issues/20426">#20426</a>)</li> <li>(Java) Fixed Read caching of UnboundedReader objects to effectively cache across multiple DoFns and avoid checkpointing unstarted reader. 
<a href="https://redirect.github.com/apache/beam/pull/34146">#34146</a> <a href="https://redirect.github.com/apache/beam/pull/33901">#33901</a></li> </ul> <h2>Known Issues</h2> <ul> <li>(Java) Current version of protobuf has a <a href="https://redirect.github.com/protocolbuffers/protobuf/issues/20599">bug</a> leading to incompatibilities with clients using older versions of Protobuf (<a href="https://redirect.github.com/GoogleCloudPlatform/DataflowTemplates/issues/2191">example issue</a>). This issue has been seen in SpannerIO in particular. Tracked in <a href="https://redirect.github.com/GoogleCloudPlatform/DataflowTemplates/issues/34452">#34452</a>.</li> <li>(Java) When constructing <code>SpannerConfig</code> for <code>SpannerIO</code>, calling <code>withHost</code> with a null or empty host will now result in a Null Pointer Exception (<code>java.lang.NullPointerException: Cannot invoke "java.lang.CharSequence.length()" because "this.text" is null</code>). See <a href="https://redirect.github.com/GoogleCloudPlatform/DataflowTemplates/issues/34489">GoogleCloudPlatform/DataflowTemplates#34489</a> for context.</li> </ul> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/apache/beam/commit/e979de95577a23bd79eac9764c13564e2a6896a9"><code>e979de9</code></a> Set version for 2.64.0 RC2</li> <li><a href="https://github.com/apache/beam/commit/56e11aa1dabf5031e987d34fe804fd53bf11c429"><code>56e11aa</code></a> Update build.gradle (<a href="https://redirect.github.com/apache/beam/issues/34401">#34401</a>) (<a href="https://redirect.github.com/apache/beam/issues/34406">#34406</a>)</li> <li><a href="https://github.com/apache/beam/commit/014e77d43e483de6eb06d31701890d6708590d2e"><code>014e77d</code></a> Perform correct release version validation (<a href="https://redirect.github.com/apache/beam/issues/34378">#34378</a>) (<a href="https://redirect.github.com/apache/beam/issues/34400">#34400</a>)</li> <li><a href="https://github.com/apache/beam/commit/efe443d33b4c81408197cd4581182ee3a82efbb7"><code>efe443d</code></a> fix the check on the 2.64 release branch (<a href="https://redirect.github.com/apache/beam/issues/34370">#34370</a>)</li> <li><a href="https://github.com/apache/beam/commit/10f4fc4ed58d904f25b628c130fc89f56f5a4153"><code>10f4fc4</code></a> Set Dataflow container to release version.</li> <li><a href="https://github.com/apache/beam/commit/44500319cacf453d79cbfe2b22a02c17730ef051"><code>4450031</code></a> Add sdf kafka poll latencies (<a href="https://redirect.github.com/apache/beam/issues/34275">#34275</a>)</li> <li><a href="https://github.com/apache/beam/commit/4500499706608ad95458596834c915d96c0fae43"><code>4500499</code></a> update the changes.md by adding 2.65.0 (<a href="https://redirect.github.com/apache/beam/issues/34369">#34369</a>)</li> <li><a href="https://github.com/apache/beam/commit/6842f3ac1b1de53ba153f9058c13f2fcc9c41202"><code>6842f3a</code></a> Add Encryption When Writing to Iceberg Tables in RecordWriter.java (<a href="https://redirect.github.com/apache/beam/issues/34021">#34021</a>)</li> <li><a href="https://github.com/apache/beam/commit/332fe03b72800f37d24951ab78533a74e453ba64"><code>332fe03</code></a> [Managed Iceberg] unbounded source (<a href="https://redirect.github.com/apache/beam/issues/33504">#33504</a>)</li> <li><a href="https://github.com/apache/beam/commit/9e1cf5aaee4abc9c36c7425bcdc39cf6a9fdc75c"><code>9e1cf5a</code></a> Kafta Table consumer properties feature for beamSQL (<a 
href="https://redirect.github.com/apache/beam/issues/34366">#34366</a>)</li> <li>Additional commits viewable in <a href="https://github.com/apache/beam/compare/v2.63.0...v2.64.0">compare view</a></li> </ul> </details> <br /> Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) </details> -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: dev-unsubscr...@beam.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org