This is an automated email from the ASF dual-hosted git repository.
srowen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 7b58fffdeeb [SPARK-46100][CORE][PYTHON] Reduce stack depth by replacing
(string|array).size with (string|array).length
7b58fffdeeb is described below
commit 7b58fffdeeb70524e18ad80ea0aa53e2ac910e2a
Author: Jiaan Geng
AuthorDate: Sat Nov 25 14:38:34 2023 -0600
[SPARK-46100][CORE][PYTHON] Reduce stack depth by replacing
(string|array).size with (string|array).length
### What changes were proposed in this pull request?
There are many calls to `(string|array).size` in the codebase.
On arrays and strings, `size` just delegates to the underlying `length`, so
each call adds an extra frame to the stack.
We should call `(string|array).length` directly.
We also get the compile warning `Replace .size with .length on arrays and
strings`.
This PR improves only the core module.
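The extra stack frame comes from Scala's implicit enrichment: `size` is not defined on a raw `Array` or `String`, so the call goes through a wrapper (e.g. `ArrayOps`/`StringOps`) whose `size` simply forwards to `length`. A minimal standalone sketch (not part of this patch) showing that both calls return the same value, with only the call path differing:

```scala
object SizeVsLength {
  def main(args: Array[String]): Unit = {
    val arr = Array(1, 2, 3)
    val str = "spark"
    // `arr.size` / `str.size` route through an implicit wrapper whose
    // `size` just returns `length`; calling `length` directly hits the
    // JVM array field / String method one stack frame shallower.
    assert(arr.size == arr.length) // both 3
    assert(str.size == str.length) // both 5
    println(s"${arr.length} ${str.length}")
  }
}
```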
### Why are the changes needed?
Reduce stack depth by replacing `(string|array).size` with `(string|array).length`.
### Does this PR introduce _any_ user-facing change?
'No'.
### How was this patch tested?
Existing test cases.
### Was this patch authored or co-authored using generative AI tooling?
'No'.
Closes #44011 from beliefer/SPARK-46100.
Authored-by: Jiaan Geng
Signed-off-by: Sean Owen
---
.../org/apache/spark/api/python/PythonRunner.scala | 2 +-
.../apache/spark/deploy/master/ui/MasterPage.scala | 4 +-
.../apache/spark/executor/ExecutorMetrics.scala | 2 +-
.../org/apache/spark/resource/ResourceUtils.scala | 2 +-
.../apache/spark/scheduler/TaskDescription.scala | 2 +-
.../apache/spark/scheduler/TaskSchedulerImpl.scala | 4 +-
.../org/apache/spark/ui/ConsoleProgressBar.scala | 2 +-
.../org/apache/spark/util/HadoopFSUtils.scala | 2 +-
.../util/io/ChunkedByteBufferFileRegion.scala | 2 +-
.../scala/org/apache/spark/CheckpointSuite.scala | 16 ++---
.../scala/org/apache/spark/DistributedSuite.scala | 16 ++---
.../test/scala/org/apache/spark/FileSuite.scala | 2 +-
.../org/apache/spark/MapOutputTrackerSuite.scala | 4 +-
.../scala/org/apache/spark/PartitioningSuite.scala | 4 +-
.../test/scala/org/apache/spark/ShuffleSuite.scala | 2 +-
.../spark/deploy/DecommissionWorkerSuite.scala | 2 +-
.../org/apache/spark/deploy/SparkSubmitSuite.scala | 4 +-
.../deploy/StandaloneDynamicAllocationSuite.scala | 22 +++---
.../spark/deploy/client/AppClientSuite.scala | 6 +-
.../deploy/history/FsHistoryProviderSuite.scala | 20 +++---
.../deploy/rest/StandaloneRestSubmitSuite.scala | 2 +-
.../input/WholeTextFileRecordReaderSuite.scala | 4 +-
.../internal/plugin/PluginContainerSuite.scala | 4 +-
.../apache/spark/rdd/AsyncRDDActionsSuite.scala | 2 +-
.../apache/spark/rdd/LocalCheckpointSuite.scala | 2 +-
.../apache/spark/rdd/PairRDDFunctionsSuite.scala | 44 ++--
.../scala/org/apache/spark/rdd/PipedRDDSuite.scala | 10 +--
.../test/scala/org/apache/spark/rdd/RDDSuite.scala | 80 +++---
.../scala/org/apache/spark/rdd/SortingSuite.scala | 6 +-
.../apache/spark/rdd/ZippedPartitionsSuite.scala | 4 +-
.../spark/resource/ResourceProfileSuite.scala | 2 +-
.../apache/spark/resource/ResourceUtilsSuite.scala | 6 +-
.../apache/spark/scheduler/AQEShuffledRDD.scala | 2 +-
.../CoarseGrainedSchedulerBackendSuite.scala | 2 +-
.../apache/spark/scheduler/DAGSchedulerSuite.scala | 32 -
.../apache/spark/scheduler/MapStatusSuite.scala | 2 +-
.../scheduler/OutputCommitCoordinatorSuite.scala | 8 +--
.../spark/scheduler/TaskSchedulerImplSuite.scala | 12 ++--
.../spark/scheduler/TaskSetManagerSuite.scala | 4 +-
.../KryoSerializerDistributedSuite.scala | 2 +-
.../sort/IndexShuffleBlockResolverSuite.scala | 2 +-
.../org/apache/spark/storage/DiskStoreSuite.scala | 2 +-
.../org/apache/spark/util/FileAppenderSuite.scala | 4 +-
.../spark/util/collection/SizeTrackerSuite.scala | 2 +-
44 files changed, 180 insertions(+), 180 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala
b/core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala
index d6363182606..e6d5a750ea3 100644
--- a/core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala
+++ b/core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala
@@ -378,7 +378,7 @@ private[spark] abstract class BasePythonRunner[IN, OUT](
resources.foreach { case (k, v) =>
PythonRDD.writeUTF(k, dataOut)
PythonRDD.writeUTF(v.name, dataOut)
- dataOut.writeInt(v.addresses.size)
+ dataOut.writeInt(v.addresses.length)
v.addresses.foreach { case addr =>
PythonRDD.writeUTF(addr, dataOut)
}
diff --git