github-actions[bot] commented on PR #40182:
URL: https://github.com/apache/doris/pull/40182#issuecomment-2541723072

   #### `sh-checker report`
   
   To get the full details, please check in the [job](https://github.com/apache/doris/actions/runs/12318604508) output.
   
   <details>
   <summary>shellcheck errors</summary>
   
   ```
   
   'shellcheck ' returned error 1 finding the following syntactical issues:
   
   ----------
   
    In samples/datalake/deltalake_and_kudu/start-trinoconnector-compose.sh line 162:
    export KUDU_QUICKSTART_IP=$(ifconfig | grep "inet " | grep -Fv 127.0.0.1 |  awk '{print $2}' | tail -1)
           ^----------------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
   
   
    In samples/datalake/deltalake_and_kudu/stop-trinoconnector-compose.sh line 20:
    export KUDU_QUICKSTART_IP=$(ifconfig | grep "inet " | grep -Fv 127.0.0.1 |  awk '{print $2}' | tail -1)
           ^----------------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
   
   
   In samples/datalake/lakesoul/scripts/load_tpch_data.sh line 21:
   cd tpch-dbgen
    ^-----------^ SC2164 (warning): Use 'cd ... || exit' or 'cd ... || return' in case cd fails.
   
   Did you mean: 
   cd tpch-dbgen || exit
   
   
   In samples/datalake/lakesoul/start_all.sh line 65:
            rm -rf ${OUT_DIR}
                   ^--------^ SC2086 (info): Double quote to prevent globbing and word splitting.
   
   Did you mean: 
               rm -rf "${OUT_DIR}"
   
    For more information:
      https://www.shellcheck.net/wiki/SC2155 -- Declare and assign separately to avoid masking return values.
      https://www.shellcheck.net/wiki/SC2164 -- Use 'cd ... || exit' or 'cd ... || return' in case cd fails.
      https://www.shellcheck.net/wiki/SC2086 -- Double quote to prevent globbing and word splitting.
   ----------
   
   You can address the above issues in one of three ways:
   1. Manually correct the issue in the offending shell script;
   2. Disable specific issues by adding the comment:
     # shellcheck disable=NNNN
   above the line that contains the issue, where NNNN is the error code;
   3. Add '-e NNNN' to the SHELLCHECK_OPTS setting in your .yml action file.
   
   
   
   ```
   </details>
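
   Taking the two SC2155 hits as an example, here is a minimal sketch of the suggested fix: declaring and assigning separately keeps the exit status of the command substitution visible, which `export VAR=$(...)` would otherwise mask.

   ```
   #!/usr/bin/env bash
   # SC2155 fix: assign first, then export. `export VAR=$(cmd)` returns the
   # status of `export` itself (always 0), hiding any failure of $(cmd).
   KUDU_QUICKSTART_IP=$(ifconfig | grep "inet " | grep -Fv 127.0.0.1 | awk '{print $2}' | tail -1)
   export KUDU_QUICKSTART_IP

   # If the one-liner is preferred, the warning can instead be suppressed
   # per line (option 2 above) with a directive just before it:
   #   # shellcheck disable=SC2155
   #   export KUDU_QUICKSTART_IP=$(ifconfig ...)
   ```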
   
   <details>
   <summary>shfmt errors</summary>
   
   ```
   
   'shfmt ' returned error 1 finding the following formatting issues:
   
   ----------
   --- samples/datalake/deltalake_and_kudu/scripts/start_doris.sh.orig
   +++ samples/datalake/deltalake_and_kudu/scripts/start_doris.sh
   @@ -20,8 +20,8 @@
    export JAVA_HOME=/opt/jdk-17.0.2
    
    cp -r /opt/doris-bin /opt/doris
   -echo "trino_connector_plugin_dir=/opt/connectors/" >> 
/opt/doris/fe/conf/fe.conf
   -echo "trino_connector_plugin_dir=/opt/connectors/" >> 
/opt/doris/be/conf/be.conf
   +echo "trino_connector_plugin_dir=/opt/connectors/" 
>>/opt/doris/fe/conf/fe.conf
   +echo "trino_connector_plugin_dir=/opt/connectors/" 
>>/opt/doris/be/conf/be.conf
    
    /opt/doris/fe/bin/start_fe.sh --daemon
    /opt/doris/be/bin/start_be.sh --daemon
   --- samples/datalake/deltalake_and_kudu/start-trinoconnector-compose.sh.orig
   +++ samples/datalake/deltalake_and_kudu/start-trinoconnector-compose.sh
   @@ -33,7 +33,6 @@
    hdfs_plugin="ff4a3e3b32dcce27f4df58f17938abde"
    kudu_java_example="1afe0a890785e8d0011ea7342ae5e43d"
    
   -
    download_source_file() {
        local FILE_PATH="$1"
        local EXPECTED_MD5="$2"
   @@ -79,9 +78,6 @@
    download_source_file "trino-hdfs-435-20240724.tar.gz" "${hdfs_plugin}" 
"https://github.com/apache/doris-thirdparty/releases/download/trino-435-20240724";
    download_source_file "kudu-java-example-1.0-SNAPSHOT.jar" 
"${kudu_java_example}" 
"https://github.com/apache/doris-thirdparty/releases/download/trino-435-20240724";
    
   -
   -
   -
    if [[ ! -f "jdk-17.0.2/SUCCESS" ]]; then
        echo "Prepare jdk17 environment"
        if [[ -d "jdk-17.0.2" ]]; then
   @@ -156,10 +152,9 @@
        touch connectors/trino-delta-lake-435/hdfs/SUCCESS
    fi
    
   -
    cd ../
    
    -export KUDU_QUICKSTART_IP=$(ifconfig | grep "inet " | grep -Fv 127.0.0.1 |  awk '{print $2}' | tail -1)
    +export KUDU_QUICKSTART_IP=$(ifconfig | grep "inet " | grep -Fv 127.0.0.1 | awk '{print $2}' | tail -1)
    
     docker compose -f trinoconnector-compose.yml --env-file trinoconnector-compose.env up -d
    echo "Create hive table ..."
   --- samples/datalake/deltalake_and_kudu/stop-trinoconnector-compose.sh.orig
   +++ samples/datalake/deltalake_and_kudu/stop-trinoconnector-compose.sh
   @@ -17,6 +17,6 @@
    # specific language governing permissions and limitations
    # under the License.
    
    -export KUDU_QUICKSTART_IP=$(ifconfig | grep "inet " | grep -Fv 127.0.0.1 |  awk '{print $2}' | tail -1)
    +export KUDU_QUICKSTART_IP=$(ifconfig | grep "inet " | grep -Fv 127.0.0.1 | awk '{print $2}' | tail -1)
    
     docker compose -f trinoconnector-compose.yml --env-file trinoconnector-compose.env down
   --- samples/datalake/lakesoul/scripts/load_tpch_data.sh.orig
   +++ samples/datalake/lakesoul/scripts/load_tpch_data.sh
   --- samples/datalake/lakesoul/start_all.sh.orig
   +++ samples/datalake/lakesoul/start_all.sh
   @@ -70,7 +70,6 @@
        fi
    }
    
   -
    curdir="$(cd "$(dirname "${BASH_SOURCE[0]}")" &>/dev/null && pwd)"
    cd "${curdir}" || exit
    
   @@ -92,12 +91,10 @@
    download_source_file "hadoop-3.3.5.tar.gz" 
"1b6175712d813e8baec48ed68098ca85" 
"https://lakesoul-bucket.obs.cn-southwest-2.myhuaweicloud.com/doris";
    unpack_tar "hadoop-3.3.5.tar.gz" "hadoop-3.3.5"
    
   -
    download_source_file "flink-s3-fs-hadoop-1.17.1.jar" 
"0a631b07ba3e3b6c54e7c7c920ac6487" 
"https://repo1.maven.org/maven2/org/apache/flink/flink-s3-fs-hadoop/1.17.1";
    download_source_file "parquet-hadoop-bundle-1.12.3.jar" 
"3a78d684a1938e68c6a57f59863e9106" 
"https://repo1.maven.org/maven2/org/apache/parquet/parquet-hadoop-bundle/1.12.3";
    download_source_file "flink-parquet-1.17.1.jar" 
"559fda5535d4018fb923c4ec198340f0" 
"https://repo1.maven.org/maven2/org/apache/flink/flink-parquet/1.17.1";
    
   -
    if [[ ! -f "doris-bin/SUCCESS" ]]; then
        echo "Prepare ${DORIS_PACKAGE} environment"
        if [[ -d "doris-bin" ]]; then
   @@ -147,7 +144,6 @@
    echo "Start prepare data for lakesoul tables..."
     sudo docker exec -it doris-lakesoul-spark spark-sql --conf spark.sql.extensions=com.dmetasoul.lakesoul.sql.LakeSoulSparkSessionExtension --conf spark.hadoop.fs.s3.impl=org.apache.hadoop.fs.s3a.S3AFileSystem --conf spark.hadoop.fs.s3a.buffer.dir=/opt/spark/work-dir/s3a --conf spark.hadoop.fs.s3a.path.style.access=true --conf spark.hadoop.fs.s3a.endpoint=http://minio:9000 --conf spark.hadoop.fs.s3a.aws.credentials.provider=org.apache.hadoop.fs.s3a.AnonymousAWSCredentialsProvider --conf spark.hadoop.fs.s3a.access.key=admin --conf spark.hadoop.fs.s3a.secret.key=password -f /opt/sql/prepare_data.sql | tee -a init.log >/dev/null
    
   -
     echo "============================================================================="
     echo "Success to launch doris+iceberg+paimon+flink+spark+minio environments!"
    echo "You can:"
   --- samples/datalake/lakesoul/start_flink_client.sh.orig
   +++ samples/datalake/lakesoul/start_flink_client.sh
   @@ -16,4 +16,4 @@
    # specific language governing permissions and limitations
    # under the License.
    
   -sudo docker exec -it doris-lakesoul-jobmanager sql-client.sh 
   +sudo docker exec -it doris-lakesoul-jobmanager sql-client.sh
   --- samples/datalake/lakesoul/start_spark_sql.sh.orig
   +++ samples/datalake/lakesoul/start_spark_sql.sh
   ----------
   
   You can reformat the above files to meet shfmt's requirements by typing:
   
     shfmt  -w filename
   
   
   ```
   </details>
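
   On the shfmt side, the diffs above can be previewed and then applied locally (a sketch, assuming `shfmt` is on PATH and that the CI passes no extra formatting options):

   ```
   # Preview the formatting changes shfmt would make, as a unified diff.
   shfmt -d samples/datalake/lakesoul/start_all.sh

   # Rewrite the file in place once the diff looks right.
   shfmt -w samples/datalake/lakesoul/start_all.sh
   ```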
   
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

