This is an automated email from the ASF dual-hosted git repository.

jark pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/fluss.git


The following commit(s) were added to refs/heads/main by this push:
     new bd50f5e9b [docs] Update docs for release manager to correct wrong part (#2672)
bd50f5e9b is described below

commit bd50f5e9b90e3252c9cb7b8ce0e2de3fba3b7480
Author: yuxia Luo <[email protected]>
AuthorDate: Fri Feb 13 17:06:11 2026 +0800

    [docs] Update docs for release manager to correct wrong part (#2672)
---
 docker/quickstart-flink/Dockerfile                 |  49 ----
 docker/quickstart-flink/README.md                  |  59 -----
 docker/quickstart-flink/bin/sql-client             |  21 --
 docker/quickstart-flink/prepare_build.sh           | 261 ---------------------
 docker/quickstart-flink/sql/sql-client.sql         |  68 ------
 fluss-spark/fluss-spark-common/pom.xml             |  13 +
 tools/releasing/create_binary_release.sh           |   2 +
 .../how-to-release/creating-a-fluss-release.mdx    |  61 ++---
 .../how-to-release/release-manager-preparation.md  |   6 +-
 9 files changed, 49 insertions(+), 491 deletions(-)

diff --git a/docker/quickstart-flink/Dockerfile b/docker/quickstart-flink/Dockerfile
deleted file mode 100644
index 303728229..000000000
--- a/docker/quickstart-flink/Dockerfile
+++ /dev/null
@@ -1,49 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#      http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-# Use Flink 1.20.0 as base image
-FROM flink:1.20.0-scala_2.12-java17
-
-# Switch to root user for installation and setup
-USER root
-
-# Install necessary packages
-RUN apt-get update && \
-    apt-get install -y tree && \
-    rm -rf /var/lib/apt/lists/*
-
-# Copy sql-client script to the container
-COPY bin/* /opt/sql-client/
-
-# Set working directory and environment
-WORKDIR /opt/sql-client
-ENV SQL_CLIENT_HOME=/opt/sql-client
-
-# Copy Fluss connector JARs and SQL files
-# Copy JARs to both sql-client lib and Flink lib directories
-COPY lib/* /opt/sql-client/lib/
-COPY sql/* /opt/sql-client/sql/
-COPY lib/* /opt/flink/lib/
-COPY opt/* /opt/flink/opt/
-
-# Modify docker-entrypoint.sh to allow Flink to run as root user
-# This is needed for the quickstart environment
-RUN sed -i 's/exec $(drop_privs_cmd)/exec/g' /docker-entrypoint.sh
-
-# Make sql-client script executable
-RUN ["chmod", "+x", "/opt/sql-client/sql-client"]
diff --git a/docker/quickstart-flink/README.md b/docker/quickstart-flink/README.md
deleted file mode 100644
index 4c74a3682..000000000
--- a/docker/quickstart-flink/README.md
+++ /dev/null
@@ -1,59 +0,0 @@
-<!--
- Licensed to the Apache Software Foundation (ASF) under one
- or more contributor license agreements.  See the NOTICE file
- distributed with this work for additional information
- regarding copyright ownership.  The ASF licenses this file
- to you under the Apache License, Version 2.0 (the
- "License"); you may not use this file except in compliance
- with the License.  You may obtain a copy of the License at
-
-      http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
--->
-
-# Fluss Quickstart Flink Docker
-
-This directory contains the Docker setup for Fluss Quickstart with Flink integration.
-
-## Overview
-
-The Fluss Quickstart Flink Docker image provides a complete environment for running Flink with Fluss, powered by Paimon lake storage.
-
-## Prerequisites
-
-Before building the Docker image, ensure you have:
-
-1. Check out the code version that you want to use for the Docker image. Go to the project root directory and build Fluss using `./mvnw clean package -DskipTests`.
-The local build will be used for the Docker image.
-2. Docker installed and running
-3. Internet access for retrieving dependencies
-
-## Build Process
-
-The build process consists of two main steps:
-
-### Step 1: Prepare Build Files
-
-First, you need to prepare the required JAR files and dependencies:
-
-```bash
-# Make the script executable
-chmod +x prepare_build.sh
-
-# Run the preparation script
-./prepare_build.sh
-```
-
-### Step 2: Build Docker Image
-
-After the preparation is complete, build the Docker image:
-
-```bash
-# Build the Docker image
-docker build -t fluss/quickstart-flink:1.20-latest .
-```
diff --git a/docker/quickstart-flink/bin/sql-client b/docker/quickstart-flink/bin/sql-client
deleted file mode 100644
index 1288bfb72..000000000
--- a/docker/quickstart-flink/bin/sql-client
+++ /dev/null
@@ -1,21 +0,0 @@
-#!/bin/bash
-
-#
-# Licensed to the Apache Software Foundation (ASF) under one
-# or more contributor license agreements.  See the NOTICE file
-# distributed with this work for additional information
-# regarding copyright ownership.  The ASF licenses this file
-# to you under the Apache License, Version 2.0 (the
-# "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#     http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-${FLINK_HOME}/bin/sql-client.sh  -i ${SQL_CLIENT_HOME}/sql/sql-client.sql
\ No newline at end of file
diff --git a/docker/quickstart-flink/prepare_build.sh b/docker/quickstart-flink/prepare_build.sh
deleted file mode 100755
index a2904b05d..000000000
--- a/docker/quickstart-flink/prepare_build.sh
+++ /dev/null
@@ -1,261 +0,0 @@
-#!/bin/bash
-
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements.  See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance
-# with the License.  You may obtain a copy of the License at
-#
-#    http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-set -e
-
-# Configuration
-SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
-PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
-
-# Logging functions
-log_info() {
-    echo "ℹ️  $1"
-}
-
-log_success() {
-    echo "✅ $1"
-}
-
-log_error() {
-    echo "❌ $1" >&2
-}
-
-# Utility function to copy JAR files with version numbers
-copy_jar() {
-    local src_pattern="$1"
-    local dest_dir="$2"
-    local description="$3"
-
-    log_info "Copying $description..."
-
-    # Find matching files
-    local matches=($src_pattern)
-    local count=${#matches[@]}
-
-    # No files matched
-    if (( count == 0 )); then
-        log_error "No matching JAR files found: $src_pattern"
-        log_error "Please build the Fluss project first: mvn clean package"
-        return 1
-    fi
-
-    # Multiple files matched
-    if (( count > 1 )); then
-        log_error "Multiple matching JAR files found:"
-        printf "    %s\n" "${matches[@]}"
-        return 1
-    fi
-
-    # Exactly one file matched → copy it with original file name
-    mkdir -p "$dest_dir"
-    cp "${matches[0]}" "$dest_dir/"
-    log_success "Copied: $(basename "${matches[0]}")"
-}
-
-# Utility function to download and verify JAR
-download_jar() {
-    local url="$1"
-    local dest_file="$2"
-    local expected_hash="$3"
-    local description="$4"
-
-    log_info "Downloading $description..."
-
-    # Download the file
-    if ! curl -fL -o "$dest_file" "$url"; then
-        log_error "Failed to download $description from $url"
-        return 1
-    fi
-
-    # Verify file size
-    if [ ! -s "$dest_file" ]; then
-        log_error "Downloaded file is empty: $dest_file"
-        return 1
-    fi
-
-    # Verify checksum if provided
-    if [ -n "$expected_hash" ]; then
-        local actual_hash=$(shasum "$dest_file" | awk '{print $1}')
-        if [ "$expected_hash" != "$actual_hash" ]; then
-            log_error "Checksum mismatch for $description"
-            log_error "Expected: $expected_hash"
-            log_error "Actual:   $actual_hash"
-            return 1
-        fi
-        log_success "Checksum verified for $description"
-    else
-        log_success "Downloaded $description"
-    fi
-}
-
-# Check if required directories exist
-check_prerequisites() {
-    log_info "Checking prerequisites..."
-
-    local required_dirs=(
-        "$PROJECT_ROOT/fluss-flink/fluss-flink-1.20/target"
-        "$PROJECT_ROOT/fluss-lake/fluss-lake-paimon/target"
-        "$PROJECT_ROOT/fluss-lake/fluss-lake-iceberg/target"
-        "$PROJECT_ROOT/fluss-flink/fluss-flink-tiering/target"
-    )
-
-    for dir in "${required_dirs[@]}"; do
-        if [ ! -d "$dir" ]; then
-            log_error "Required directory not found: $dir"
-            log_error "Please build the Fluss project first: mvn clean package"
-            exit 1
-        fi
-    done
-
-    log_success "All prerequisites met"
-}
-
-# Main execution
-main() {
-    log_info "Preparing JAR files for Fluss Quickstart Flink Docker..."
-    log_info "Project root: $PROJECT_ROOT"
-
-    # Check prerequisites
-    check_prerequisites
-
-    # Clean and create directories
-    log_info "Setting up directories..."
-    rm -rf lib opt
-    mkdir -p lib opt
-
-    # Copy Fluss connector JARs
-    log_info "Copying Fluss connector JARs..."
-    copy_jar "$PROJECT_ROOT/fluss-flink/fluss-flink-1.20/target/fluss-flink-1.20-*.jar" "./lib" "fluss-flink-1.20 connector"
-    copy_jar "$PROJECT_ROOT/fluss-lake/fluss-lake-paimon/target/fluss-lake-paimon-*.jar" "./lib" "fluss-lake-paimon connector"
-    copy_jar "$PROJECT_ROOT/fluss-lake/fluss-lake-iceberg/target/fluss-lake-iceberg-*.jar" "./lib" "fluss-lake-iceberg connector"
-
-    # Download external dependencies
-    log_info "Downloading external dependencies..."
-
-    # Download flink-faker for data generation
-    download_jar \
-        "https://github.com/knaufk/flink-faker/releases/download/v0.5.3/flink-faker-0.5.3.jar" \
-        "./lib/flink-faker-0.5.3.jar" \
-        "" \
-        "flink-faker-0.5.3"
-
-    # Download Hadoop for HDFS/local filesystem support
-    download_jar \
-        "https://repo1.maven.org/maven2/io/trino/hadoop/hadoop-apache/3.3.5-2/hadoop-apache-3.3.5-2.jar" \
-        "./lib/hadoop-apache-3.3.5-2.jar" \
-        "508255883b984483a45ca48d5af6365d4f013bb8" \
-        "hadoop-apache-3.3.5-2"
-
-    # Download paimon-flink connector
-    download_jar \
-        "https://repo1.maven.org/maven2/org/apache/paimon/paimon-flink-1.20/1.2.0/paimon-flink-1.20-1.2.0.jar" \
-        "./lib/paimon-flink-1.20-1.2.0.jar" \
-        "b9f8762c6e575f6786f1d156a18d51682ffc975c" \
-        "paimon-flink-1.20-1.2.0"
-
-    # Iceberg Support
-    log_info "Downloading Iceberg connector JARs..."
-
-    # Download iceberg-flink-runtime for Flink 1.20 (version 1.10.1)
-    download_jar \
-        "https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-flink-runtime-1.20/1.10.1/iceberg-flink-runtime-1.20-1.10.1.jar" \
-        "./lib/iceberg-flink-runtime-1.20-1.10.1.jar" \
-        "" \
-        "iceberg-flink-runtime-1.20-1.10.1"
-
-
-    # Prepare lake tiering JAR
-    log_info "Preparing lake tiering JAR..."
-    copy_jar "$PROJECT_ROOT/fluss-flink/fluss-flink-tiering/target/fluss-flink-tiering-*.jar" "./opt" "fluss-flink-tiering"
-
-    # Final verification
-    verify_jars
-
-    # Show summary
-    show_summary
-}
-
-# Verify that all required JAR files are present
-verify_jars() {
-    log_info "Verifying all required JAR files are present..."
-
-    local missing_jars=()
-    local lib_jars=(
-        "fluss-flink-1.20-*.jar"
-        "fluss-lake-paimon-*.jar"
-        "fluss-lake-iceberg-*.jar"
-        "flink-faker-0.5.3.jar"
-        "hadoop-apache-3.3.5-2.jar"
-        "paimon-flink-1.20-1.2.0.jar"
-        "iceberg-flink-runtime-1.20-1.10.1.jar"
-    )
-
-    local opt_jars=(
-        "fluss-flink-tiering-*.jar"
-    )
-
-    # Check lib directory
-    for jar_pattern in "${lib_jars[@]}"; do
-        if ! ls ./lib/$jar_pattern >/dev/null 2>&1; then
-            missing_jars+=("lib/$jar_pattern")
-        fi
-    done
-
-    # Check opt directory
-    for jar_pattern in "${opt_jars[@]}"; do
-        if ! ls ./opt/$jar_pattern >/dev/null 2>&1; then
-            missing_jars+=("opt/$jar_pattern")
-        fi
-    done
-
-    # Report results
-    if [ ${#missing_jars[@]} -eq 0 ]; then
-        log_success "All required JAR files are present!"
-    else
-        log_error "Missing required JAR files:"
-        for jar in "${missing_jars[@]}"; do
-            log_error "  - $jar"
-        done
-        exit 1
-    fi
-}
-
-# Summary function
-show_summary() {
-    log_success "JAR files preparation completed!"
-    echo ""
-    log_info "📦 Generated JAR files:"
-    echo ""
-    echo "Lib directory (Flink connectors):"
-    ls -lh ./lib/ | tail -n +2 | awk '{printf "  %-50s %8s\n", $9, $5}'
-    echo ""
-    echo "Opt directory (Tiering service):"
-    ls -lh ./opt/ | tail -n +2 | awk '{printf "  %-50s %8s\n", $9, $5}'
-    echo ""
-    log_info "📋 Included Components:"
-    echo "  ✓ Fluss Flink 1.20 connector"
-    echo "  ✓ Fluss Lake Paimon connector"
-    echo "  ✓ Fluss Lake Iceberg connector"
-    echo "  ✓ Iceberg Flink runtime 1.20 (v1.10.1)"
-    echo "  ✓ Paimon Flink 1.20 (v1.2.0)"
-    echo "  ✓ Hadoop Apache (v3.3.5-2)"
-    echo "  ✓ Flink Faker (v0.5.3)"
-    echo "  ✓ Fluss Tiering service"
-}
-
-# Run main function
-main "$@"
diff --git a/docker/quickstart-flink/sql/sql-client.sql b/docker/quickstart-flink/sql/sql-client.sql
deleted file mode 100644
index 1d3c17556..000000000
--- a/docker/quickstart-flink/sql/sql-client.sql
+++ /dev/null
@@ -1,68 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one
- * or more contributor license agreements.  See the NOTICE file
- * distributed with this work for additional information
- * regarding copyright ownership.  The ASF licenses this file
- * to you under the Apache License, Version 2.0 (the
- * "License"); you may not use this file except in compliance
- * with the License.  You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-
-CREATE TEMPORARY TABLE source_order (
-    `order_key` BIGINT,
-    `cust_key` INT,
-    `total_price` DECIMAL(15, 2),
-    `order_date` DATE,
-    `order_priority` STRING,
-    `clerk` STRING
-) WITH (
-  'connector' = 'faker',
-  'rows-per-second' = '10',
-  'number-of-rows' = '10000',
-  'fields.order_key.expression' = '#{number.numberBetween ''0'',''100000000''}',
-  'fields.cust_key.expression' = '#{number.numberBetween ''0'',''20''}',
-  'fields.total_price.expression' = '#{number.randomDouble ''3'',''1'',''1000''}',
-  'fields.order_date.expression' = '#{date.past ''100'' ''DAYS''}',
-  'fields.order_priority.expression' = '#{regexify ''(low|medium|high){1}''}',
-  'fields.clerk.expression' = '#{regexify ''(Clerk1|Clerk2|Clerk3|Clerk4){1}''}'
-);
-
-CREATE TEMPORARY TABLE source_customer (
-    `cust_key` INT,
-    `name` STRING,
-    `phone` STRING,
-    `nation_key` INT NOT NULL,
-    `acctbal` DECIMAL(15, 2),
-    `mktsegment` STRING,
-    PRIMARY KEY (`cust_key`) NOT ENFORCED
-) WITH (
-  'connector' = 'faker',
-  'number-of-rows' = '200',
-  'fields.cust_key.expression' = '#{number.numberBetween ''0'',''20''}',
-  'fields.name.expression' = '#{funnyName.name}',
-  'fields.nation_key.expression' = '#{number.numberBetween ''1'',''5''}',
-  'fields.phone.expression' = '#{phoneNumber.cellPhone}',
-  'fields.acctbal.expression' = '#{number.randomDouble ''3'',''1'',''1000''}',
-  'fields.mktsegment.expression' = '#{regexify ''(AUTOMOBILE|BUILDING|FURNITURE|MACHINERY|HOUSEHOLD){1}''}'
-);
-
-CREATE TEMPORARY TABLE `source_nation` (
-  `nation_key` INT NOT NULL,
-  `name`       STRING,
-   PRIMARY KEY (`nation_key`) NOT ENFORCED
-) WITH (
-  'connector' = 'faker',
-  'number-of-rows' = '100',
-  'fields.nation_key.expression' = '#{number.numberBetween ''1'',''5''}',
-  'fields.name.expression' = '#{regexify ''(CANADA|JORDAN|CHINA|UNITED|INDIA){1}''}'
-);
-
-SET 'table.exec.sink.not-null-enforcer'='DROP';
\ No newline at end of file
diff --git a/fluss-spark/fluss-spark-common/pom.xml b/fluss-spark/fluss-spark-common/pom.xml
index 32d2ad25e..84987d07d 100644
--- a/fluss-spark/fluss-spark-common/pom.xml
+++ b/fluss-spark/fluss-spark-common/pom.xml
@@ -90,6 +90,19 @@
                     </execution>
                 </executions>
             </plugin>
+
+            <!-- Configure Javadoc to limit source scanning to only the main Scala source directory.
+                 This excludes ANTLR4 generated code from target/generated-sources/antlr4 which causes issues with release=8 config.
+                 Without this exclusion, the javadoc plugin scans generated parser files and fails with
+                 "No source files for package org.apache.spark.sql.catalyst.parser" error because the generated code is incompatible with release=8.
+                 This causes build failure and blocks release builds. -->
+            <plugin>
+                <groupId>org.apache.maven.plugins</groupId>
+                <artifactId>maven-javadoc-plugin</artifactId>
+                <configuration>
+                    <sourcepath>${basedir}/src/main/scala</sourcepath>
+                </configuration>
+            </plugin>
         </plugins>
     </build>
 </project>
diff --git a/tools/releasing/create_binary_release.sh b/tools/releasing/create_binary_release.sh
index c0ed8adbc..b96bbf8c9 100755
--- a/tools/releasing/create_binary_release.sh
+++ b/tools/releasing/create_binary_release.sh
@@ -49,6 +49,8 @@ if [ "$(uname)" == "Darwin" ]; then
     export COPYFILE_DISABLE=1
 else
     SHASUM="sha512sum"
+    # Initialize TAR_OPTIONS as empty for Linux to avoid "unbound variable" errors
+    TAR_OPTIONS=""
 fi
 
 cd ..
diff --git a/website/community/how-to-release/creating-a-fluss-release.mdx b/website/community/how-to-release/creating-a-fluss-release.mdx
index bedcd8288..f4423756b 100644
--- a/website/community/how-to-release/creating-a-fluss-release.mdx
+++ b/website/community/how-to-release/creating-a-fluss-release.mdx
@@ -102,7 +102,7 @@ RELEASE_VERSION="0.8.0-incubating"
 SHORT_RELEASE_VERSION="0.8"
 CURRENT_SNAPSHOT_VERSION="$SHORT_RELEASE_VERSION-SNAPSHOT"
 NEXT_SNAPSHOT_VERSION="0.9-SNAPSHOT"
-SHORT_NEXT_SNAPSHOT_VERSION="0.9"
+SHORT_NEXT_VERSION="0.9"
 ```
 
 ### 7. Create a release branch
@@ -117,25 +117,21 @@ $ git checkout -b release-${SHORT_RELEASE_VERSION}
 $ git push origin release-${SHORT_RELEASE_VERSION}
 ```
 
-Update the version in the Helm Chart in:
+Check that the version in the Helm Chart matches `RELEASE_VERSION` in:
 - `helm/Chart.yaml`
 - `helm/values.yaml`
 - `helm/README.md`
 
-For example, if you are releasing a major version `0.8.0-incubating`, you should replace all the previous version `0.7.0-incubating` to the release version `0.8.0-incubating`.
-
-
-And commit/push the version bump:
+If not, update the version in those files, then commit and push:
 
 ```bash
 $ git commit -m "[helm] Bump helm version to x.y.z" helm/Chart.yaml helm/values.yaml helm/README.md
 $ git push origin release-${SHORT_RELEASE_VERSION}
 ```
-
   </TabItem>
 
   <TabItem value="bugfix" label="Bugfix release">
-If you're creating a new bugfix release (e.g., 0.9.1 instead of 0.9.0), you do not need to create a release branch. You can skip this step can check out the the already existing branch for that version:
+If you're creating a new bugfix release (e.g., 0.9.1 instead of 0.9.0), you do not need to create a release branch. You can skip this step and check out the already existing branch for that version:
 
 ```bash
 $ git checkout release-$SHORT_RELEASE_VERSION
@@ -146,8 +142,7 @@ Update the version in the Helm Chart in:
 - `helm/values.yaml`
 - `helm/README.md`
 
-For example, if you are releasing a bugfix version `0.8.1-incubating`, you should replace all the previous version `0.8.0-incubating` to the new version `0.8.1-incubating`.
-
+For a bugfix release you need to update the version in these files. For example, if you are releasing `0.8.1-incubating`, replace the previous version `0.8.0-incubating` with `0.8.1-incubating` in those files.
 
 And commit/push the version bump:
 
@@ -184,7 +179,7 @@ Besides, in the `main` branch, create a new "Upgrade Notes" markdown file for th
 ```
 ---
 title: Upgrade Notes
-sidebar_position: 3
+sidebar_position: 4
 ---
 
 # Upgrade Notes from v0.x to v0.y
@@ -195,30 +190,45 @@ Additionally, for the upgrade notes page of the version currently being released
 Commit the documentation changes, and push to the official repository.
 
 ```bash
-$ git commit -m "[docs] Create upgrade notes for $NEXT_SHORT_VERSION" .
+$ git add .
+$ git commit -m "[docs] Create upgrade notes for $SHORT_NEXT_VERSION"
 $ git push origin main
 ```
 
 **(3) Add version item in fluss-versions.json**
 
-Next, add a new version item for the current release version in the `website/fluss-versions.json` file on the `main` branch. The new version item should have the following fields and values (releasing `0.8.0` as an example):
+Next, add a new version item for the next release version in the `website/fluss-versions.json` file on the `main` branch. The new version item should have the following fields and values (current releasing `0.9.0`, next release `0.10.0` as an example):
 
 ```json
 {
-  "versionName": "version-0.8",
-  "fullVersion": "0.8.0-incubating",
-  "shortVersion": "0.8",
-  "dockerVersion": "0.8.0-incubating-rc1",
+  "versionName": "next",
+  "fullVersion": "0.10-SNAPSHOT",
+  "shortVersion": "0.10",
+  "dockerVersion": "0.10-SNAPSHOT",
+  "paimonVersion": "1.3.1",
+  "paimonVersionShort": "1.3",
   "released": false
 }
 ```
 
-Additionally, update the `fullVersion`, `shortVersion`, and `dockerVersion` fields for the `next` version entry to reflect the next release version.
+Additionally, update the `fullVersion`, `shortVersion`, and `dockerVersion` fields for the `current` version entry to reflect the current release version.
+
+```json
+{
+    "versionName": "version-0.9",
+    "fullVersion": "0.9.0-incubating",
+    "shortVersion": "0.9",
+    "dockerVersion": "0.9.0-incubating",
+    "paimonVersion": "1.3.1",
+    "paimonVersionShort": "1.3",
+    "released": false
+ }
+```
 
 Commit the documentation changes, and push to the official repository.
 
 ```bash
-$ git commit -m "[docs] Add version item for $RELEASE_VERSION in fluss-versions.json
+$ git commit -m "[docs] Add version item for $SHORT_NEXT_VERSION in fluss-versions.json" .
 $ git push origin main
 ```
 
@@ -265,9 +275,9 @@ export FLUSS_CURRENT_RELEASE_BRANCH={currentReleaseBranch}
 # export FLUSS_CURRENT_RELEASE_BRANCH=release-0.8
 
 # get contributor list
-git log $FLUSS_PREVIOUS_RELEASE_BRANCH..$FLUSS_CURRENT_RELEASE_BRANCH --pretty=format:"%an" | sort -u | paste -sd "," - | sed 's/,/, /g'
+git log origin/$FLUSS_PREVIOUS_RELEASE_BRANCH..origin/$FLUSS_CURRENT_RELEASE_BRANCH --pretty=format:"%an" | sort -u | paste -sd "," - | sed 's/,/, /g'
 # get total number of commits
-git log $FLUSS_PREVIOUS_RELEASE_BRANCH...$FLUSS_CURRENT_RELEASE_BRANCH --pretty=oneline | wc -l
+git log origin/$FLUSS_PREVIOUS_RELEASE_BRANCH..origin/$FLUSS_CURRENT_RELEASE_BRANCH --pretty=oneline | wc -l
 ```
 
 Besides, Create a pull request to add download links for the release on the Download page `website/src/pages/downloads.md`.
@@ -471,17 +481,8 @@ $ cd docker/fluss
 docker/fluss $ docker buildx build --push --platform linux/arm64/v8,linux/amd64 --tag apache/fluss:${RELEASE_VERSION}-rc${RC_NUM} .
 
-Then, run the following commands to build and push the flink — a customized Apache Flink image that includes all necessary libraries and JARs for the Fluss Quickstart guide:
-
-```bash
-docker/fluss $ cd ../quickstart-flink
-docker/quickstart-flink $ ./prepare_build.sh
-docker/quickstart-flink $ docker buildx build --push --platform linux/arm64/v8,linux/amd64 --tag apache/fluss-quickstart-flink:1.20-${RELEASE_VERSION}-rc${RC_NUM} .
-```
-
 Verify the RC images are present:
 - https://hub.docker.com/r/apache/fluss/tags
-- https://hub.docker.com/r/apache/fluss-quickstart-flink/tags
 
 Then, update the `dockerVersion` field for the current release RC version in the `website/fluss-versions.json` file on the `main` branch to the value of `${RELEASE_VERSION}_RC${RC_NUM}` (which includes the RC suffix, e.g., `0.8.0-incubating-rc1`). This update should be committed and pushed to the `main` branch.
 
diff --git a/website/community/how-to-release/release-manager-preparation.md b/website/community/how-to-release/release-manager-preparation.md
index c00eadc32..700b3bf45 100644
--- a/website/community/how-to-release/release-manager-preparation.md
+++ b/website/community/how-to-release/release-manager-preparation.md
@@ -15,7 +15,7 @@ Note: The following setup is a one-time configuration required for release prepa
 
 This release process is suggested to operate on MacOS or Linux systems, and the following tools are required:
 
-- Java 8
+- Java 11
 - Apache Maven 3.8.6
 - GnuPG 2.x
 - Git
@@ -121,7 +121,7 @@ sub   rsa4096 2025-08-07 [E]
 Determine your Apache GPG Key and Key ID, as follows:
 
 ```bash
-gpg --list-keys
+gpg --list-keys --keyid-format short
 ```
 
 This will list your GPG keys. One of these should reflect your Apache account, for example:
@@ -135,7 +135,7 @@ sub   2048R/BA4D50BE 2016-02-23
 
 Here, the key ID is the 8-digit hex string in the pub line: 845E6689.
 
-Now, add your Apache GPG key to the Flink’s KEYS file in the [release](https://dist.apache.org/repos/dist/release/incubator/fluss/KEYS) repository at [dist.apache.org](https://dist.apache.org). Follow the instructions listed at the top of these files. (Note: Only PPMC members have write access to the release repository. If you end up getting 403 errors ask on the mailing list for assistance.) PPMC member can refer following scripts to add your Apache GPG key to the KEYS in the release re [...]
+Now, add your Apache GPG key to the Fluss’s KEYS file in the [release](https://dist.apache.org/repos/dist/release/incubator/fluss/KEYS) repository at [dist.apache.org](https://dist.apache.org). Follow the instructions listed at the top of these files. (Note: Only PPMC members have write access to the release repository. If you end up getting 403 errors ask on the mailing list for assistance.) PPMC member can refer following scripts to add your Apache GPG key to the KEYS in the release re [...]
 
 ```
 svn co https://dist.apache.org/repos/dist/release/incubator/fluss fluss-dist
