This is an automated email from the ASF dual-hosted git repository.
kou pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/arrow.git
The following commit(s) were added to refs/heads/main by this push:
new eda392c301 GH-47125: [CI][Dev] Fix shellcheck errors in the ci/scripts/integration_hdfs.sh (#47126)
eda392c301 is described below
commit eda392c301cc79e3b92c9af4dcd925182c59e31d
Author: Hiroyuki Sato <[email protected]>
AuthorDate: Tue Jul 22 11:16:17 2025 +0900
GH-47125: [CI][Dev] Fix shellcheck errors in the ci/scripts/integration_hdfs.sh (#47126)
### Rationale for this change
This is a sub-issue of #44748. `shellcheck` reports the following warnings for this script:
* SC2034: source_dir appears unused. Verify use (or export if used externally).
* SC2086: Double quote to prevent globbing and word splitting.
* SC2155: Declare and assign separately to avoid masking return values.
```
shellcheck ci/scripts/integration_hdfs.sh

In ci/scripts/integration_hdfs.sh line 22:
source_dir=${1}/cpp
^--------^ SC2034 (warning): source_dir appears unused. Verify use (or export if used externally).


In ci/scripts/integration_hdfs.sh line 25:
export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath --glob)
       ^-------^ SC2155 (warning): Declare and assign separately to avoid masking return values.
                   ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean:
export CLASSPATH=$("$HADOOP_HOME"/bin/hadoop classpath --glob)


In ci/scripts/integration_hdfs.sh line 45:
pushd ${build_dir}
      ^----------^ SC2086 (info): Double quote to prevent globbing and word splitting.

Did you mean:
pushd "${build_dir}"

For more information:
  https://www.shellcheck.net/wiki/SC2034 -- source_dir appears unused. Verify...
  https://www.shellcheck.net/wiki/SC2155 -- Declare and assign separately to ...
  https://www.shellcheck.net/wiki/SC2086 -- Double quote to prevent globbing ...
```
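The SC2155 warning matters under `set -e`: when an assignment with a command substitution is combined with `export`, the substitution's exit status is replaced by `export`'s own status (almost always 0), so a failing command goes unnoticed. A minimal sketch of the difference (variable names here are illustrative, not from the script):

```shell
set +e  # allow inspecting exit statuses directly

# Combined form: the failure of `false` is masked by `export` (status 0).
export MASKED=$(false)
masked_status=$?

# Separated form: the plain assignment preserves the status of `false` (status 1).
UNMASKED=$(false)
unmasked_status=$?
export UNMASKED

echo "masked=${masked_status} unmasked=${unmasked_status}"
# → masked=0 unmasked=1
```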
### What changes are included in this PR?
* SC2034: Suppress the warning with a `# shellcheck disable=SC2034` directive, since the variable is intentionally kept.
* SC2086: Quote variable expansions.
* SC2155: Separate the variable assignment from the `export`.
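Taken together, the three patterns look like the sketch below (paths and values are placeholders, not the real script's):

```shell
#!/usr/bin/env bash
set -e

# SC2034: the assignment is intentionally kept, so the warning is
# suppressed rather than the variable deleted.
# shellcheck disable=SC2034
source_dir="/tmp/arrow-src/cpp"

# SC2155: assign first so a failing command substitution is not masked
# by `export`, then export the result.
hadoop_classpath=$(printf '%s' "/opt/hadoop/lib/a.jar:/opt/hadoop/lib/b.jar")
export CLASSPATH="${hadoop_classpath}"

# SC2086: quote expansions so paths containing spaces or glob
# characters survive word splitting.
build_dir="/tmp/arrow build/cpp"
mkdir -p "${build_dir}"
pushd "${build_dir}" >/dev/null
popd >/dev/null
```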
### Are these changes tested?
Yes.
### Are there any user-facing changes?
No.
* GitHub Issue: #47125
Authored-by: Hiroyuki Sato <[email protected]>
Signed-off-by: Sutou Kouhei <[email protected]>
---
.pre-commit-config.yaml | 1 +
ci/scripts/integration_hdfs.sh | 6 ++++--
2 files changed, 5 insertions(+), 2 deletions(-)
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
index f1258d0ee0..cf1d8e9835 100644
--- a/.pre-commit-config.yaml
+++ b/.pre-commit-config.yaml
@@ -331,6 +331,7 @@ repos:
?^ci/scripts/integration_arrow_build\.sh$|
?^ci/scripts/integration_arrow\.sh$|
?^ci/scripts/integration_dask\.sh$|
+ ?^ci/scripts/integration_hdfs\.sh$|
?^ci/scripts/integration_spark\.sh$|
?^ci/scripts/matlab_build\.sh$|
?^ci/scripts/msys2_system_clean\.sh$|
diff --git a/ci/scripts/integration_hdfs.sh b/ci/scripts/integration_hdfs.sh
index d0444ccb74..bb3a9b5140 100755
--- a/ci/scripts/integration_hdfs.sh
+++ b/ci/scripts/integration_hdfs.sh
@@ -19,10 +19,12 @@
set -e
+# shellcheck disable=SC2034
source_dir=${1}/cpp
build_dir=${2}/cpp
-export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath --glob)
+HADOOP_CLASSPATH=$("$HADOOP_HOME/bin/hadoop" classpath --glob)
+export CLASSPATH="${HADOOP_CLASSPATH}"
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export LIBHDFS3_CONF=$HADOOP_CONF_DIR/hdfs-site.xml
export ARROW_LIBHDFS3_DIR=$CONDA_PREFIX/lib
@@ -42,7 +44,7 @@ function use_libhdfs_dir() {
# execute cpp tests
export ARROW_HDFS_TEST_LIBHDFS_REQUIRE=ON
-pushd ${build_dir}
+pushd "${build_dir}"
debug/arrow-io-hdfs-test
debug/arrow-hdfs-test