reviews
Messages by Thread
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
Re: [PR] [SPARK-53189][INFRA] Use `Temurin` JDK distribution in `build_and_test.yml` [spark]
via GitHub
[PR] add handling for exceptions from reading corrupt zstd files [spark]
via GitHub
Re: [PR] [SPARK-52482][SQL][CORE] Improve exception handling for reading certain corrupt zstd files [spark]
via GitHub
Re: [PR] [SPARK-52482][SQL][CORE] Improve exception handling for reading certain corrupt zstd files [spark]
via GitHub
Re: [PR] [SPARK-52482][SQL][CORE] Improve exception handling for reading certain corrupt zstd files [spark]
via GitHub
Re: [PR] [SPARK-52482][SQL][CORE] Improve exception handling for reading certain corrupt zstd files [spark]
via GitHub
Re: [PR] [SPARK-52482][SQL][CORE] Improve exception handling for reading certain corrupt zstd files [spark]
via GitHub
Re: [PR] [SPARK-52482][SQL][CORE] Improve exception handling for reading certain corrupt zstd files [spark]
via GitHub
Re: [PR] [SPARK-52482][SQL][CORE] Improve exception handling for reading certain corrupt zstd files [spark]
via GitHub
Re: [PR] [SPARK-52482][SQL][CORE] Improve exception handling for reading certain corrupt zstd files [spark]
via GitHub
Re: [PR] [SPARK-52482][SQL][CORE] Improve exception handling for reading certain corrupt zstd files [spark]
via GitHub
[PR] [SPARK-53188][CORE][SQL] Support `readFully` in `SparkStreamUtils` and `JavaUtils` [spark]
via GitHub
Re: [PR] [SPARK-53188][CORE][SQL] Support `readFully` in `SparkStreamUtils` and `JavaUtils` [spark]
via GitHub
Re: [PR] [SPARK-53188][CORE][SQL] Support `readFully` in `SparkStreamUtils` and `JavaUtils` [spark]
via GitHub
Re: [PR] [SPARK-53188][CORE][SQL] Support `readFully` in `SparkStreamUtils` and `JavaUtils` [spark]
via GitHub
Re: [PR] [SPARK-53188][CORE][SQL] Support `readFully` in `SparkStreamUtils` and `JavaUtils` [spark]
via GitHub
Re: [PR] [SPARK-53188][CORE][SQL] Support `readFully` in `SparkStreamUtils` and `JavaUtils` [spark]
via GitHub
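The SPARK-53188 thread above is about adding `readFully` support to Spark's own stream utilities. The PR body isn't shown here, but the JDK primitive such a helper typically wraps is `java.io.DataInputStream.readFully`, which keeps reading until the buffer is completely filled or throws `EOFException`, unlike a plain `InputStream.read` that may return fewer bytes. A minimal sketch:

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class ReadFullyDemo {
    public static void main(String[] args) throws IOException {
        byte[] source = "hello world".getBytes(StandardCharsets.UTF_8);
        byte[] buf = new byte[5];
        // readFully blocks until buf is completely filled, or throws
        // EOFException if the stream ends first.
        try (DataInputStream in = new DataInputStream(new ByteArrayInputStream(source))) {
            in.readFully(buf);
        }
        System.out.println(new String(buf, StandardCharsets.UTF_8)); // prints "hello"
    }
}
```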
Re: [PR] [SPARK-52638][SQL] Allow preserving Hive-style column order to be configurable [spark]
via GitHub
Re: [PR] [SPARK-52638][SQL] Allow preserving Hive-style column order to be configurable [spark]
via GitHub
Re: [PR] [SPARK-52638][SQL] Allow preserving Hive-style column order to be configurable [spark]
via GitHub
Re: [PR] [DO NOT MERGE]Testing nested correlations handling [spark]
via GitHub
Re: [PR] [DO NOT MERGE]Testing nested correlations handling [spark]
via GitHub
Re: [PR] [SPARK-51886][SQL]Part2.a Adds support for decorrelating nested correlated subqueries in Optimizer [spark]
via GitHub
Re: [PR] [SPARK-51886][SQL]Part2.a Adds support for decorrelating nested correlated subqueries in Optimizer [spark]
via GitHub
Re: [PR] [SPARK-51949][CONNECT] Bump up the default value of `CONNECT_GRPC_MARSHALLER_RECURSION_LIMIT` [spark]
via GitHub
Re: [PR] [SPARK-51949][CONNECT] Bump up the default value of `CONNECT_GRPC_MARSHALLER_RECURSION_LIMIT` [spark]
via GitHub
[PR] [SPARK-53184][PS] `melt` when "value" has MultiIndex column labels [spark]
via GitHub
Re: [PR] [SPARK-53184][PS] `melt` when "value" has MultiIndex column labels [spark]
via GitHub
Re: [PR] [SPARK-53184][PS] `melt` when "value" has MultiIndex column labels [spark]
via GitHub
Re: [PR] [SPARK-53184][PS] `melt` when "value" has MultiIndex column labels [spark]
via GitHub
[PR] [SPARK-53186] Fix probe port override from helm chart [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-53186] Fix probe port override from helm chart [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-53186] Fix probe port override from helm chart [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-53186] Fix probe port override from helm chart [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-53186] Fix probe port override from helm chart [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-53187]Support SparkCluster event related metrics set [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-53187]Support SparkCluster event related metrics set [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-53187]Support SparkCluster event related metrics set [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-53187]Support SparkCluster event related metrics set [spark-kubernetes-operator]
via GitHub
[PR] initial impl need to verify [spark]
via GitHub
[PR] [SPARK-53185][CORE][YARN][TESTS] Use `SparkStreamUtils.toString` instead of `ByteStreams.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53185][CORE][YARN][TESTS] Use `SparkStreamUtils.toString` instead of `ByteStreams.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53185][CORE][YARN][TESTS] Use `SparkStreamUtils.toString` instead of `ByteStreams.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53185][CORE][YARN][TESTS] Use `SparkStreamUtils.toString` instead of `ByteStreams.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53185][CORE][YARN][TESTS] Use `SparkStreamUtils.toString` instead of `ByteStreams.toByteArray` [spark]
via GitHub
[PR] [SPARK-53183][SQL] Use Java `Files.readString` instead of `o.a.s.sql.catalyst.util.fileToString` [spark]
via GitHub
Re: [PR] [SPARK-53183][SQL] Use Java `Files.readString` instead of `o.a.s.sql.catalyst.util.fileToString` [spark]
via GitHub
Re: [PR] [SPARK-53183][SQL] Use Java `Files.readString` instead of `o.a.s.sql.catalyst.util.fileToString` [spark]
via GitHub
Re: [PR] [SPARK-53183][SQL] Use Java `Files.readString` instead of `o.a.s.sql.catalyst.util.fileToString` [spark]
via GitHub
Re: [PR] [SPARK-53183][SQL] Use Java `Files.readString` instead of `o.a.s.sql.catalyst.util.fileToString` [spark]
via GitHub
Re: [PR] [SPARK-53183][SQL] Use Java `Files.readString` instead of `o.a.s.sql.catalyst.util.fileToString` [spark]
via GitHub
Re: [PR] [SPARK-53183][SQL] Use Java `Files.readString` instead of `o.a.s.sql.catalyst.util.fileToString` [spark]
via GitHub
Re: [PR] [SPARK-53183][SQL] Use Java `Files.readString` instead of `o.a.s.sql.catalyst.util.fileToString` [spark]
via GitHub
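The SPARK-53183 thread replaces a homegrown `fileToString` helper with the JDK's `java.nio.file.Files.readString` (Java 11+), which reads a whole file as one UTF-8-decoded string. A minimal sketch, using a temporary file as a stand-in input:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadStringDemo {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".txt");
        Files.writeString(tmp, "line one\nline two\n", StandardCharsets.UTF_8);
        // readString reads the entire file into one String, decoding as UTF-8
        // by default; an overload accepts an explicit Charset.
        String content = Files.readString(tmp);
        System.out.println(content.startsWith("line one")); // prints "true"
        Files.delete(tmp);
    }
}
```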
[PR] [WIP][SPARK-53181][PS] Enable doc tests under ANSI [spark]
via GitHub
Re: [PR] [SPARK-53181][PS] Enable doc tests under ANSI [spark]
via GitHub
Re: [PR] [SPARK-53181][PS] Enable doc tests under ANSI [spark]
via GitHub
Re: [PR] [SPARK-53181][PS] Enable doc tests under ANSI [spark]
via GitHub
Re: [PR] [SPARK-53181][PS] Enable doc tests under ANSI [spark]
via GitHub
Re: [PR] [SPARK-53181][PS] Enable doc tests under ANSI [spark]
via GitHub
[PR] [SPARK-53180][CORE] Use Java `InputStream.skipNBytes` instead of `ByteStreams.skipFully` [spark]
via GitHub
Re: [PR] [SPARK-53180][CORE] Use Java `InputStream.skipNBytes` instead of `ByteStreams.skipFully` [spark]
via GitHub
Re: [PR] [SPARK-53180][CORE] Use Java `InputStream.skipNBytes` instead of `ByteStreams.skipFully` [spark]
via GitHub
Re: [PR] [SPARK-53180][CORE] Use Java `InputStream.skipNBytes` instead of `ByteStreams.skipFully` [spark]
via GitHub
Re: [PR] [SPARK-53180][CORE] Use Java `InputStream.skipNBytes` instead of `ByteStreams.skipFully` [spark]
via GitHub
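The SPARK-53180 thread swaps Guava's `ByteStreams.skipFully` for `InputStream.skipNBytes` (Java 12+). Both have the same contract: skip exactly n bytes or fail, whereas plain `InputStream.skip` may silently skip fewer. A minimal sketch of that contract:

```java
import java.io.ByteArrayInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class SkipDemo {
    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("abcdef".getBytes(StandardCharsets.UTF_8));
        // Skips exactly 4 bytes ("abcd"); the next read returns 'e'.
        in.skipNBytes(4);
        System.out.println((char) in.read()); // prints "e"
        try {
            in.skipNBytes(10); // only "f" remains, so this must fail
        } catch (EOFException expected) {
            System.out.println("EOF as expected");
        }
    }
}
```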
[PR] [SPARK-53179][CORE][TESTS] Use `SparkStreamUtils.toString` instead of `CharStreams.toString` [spark]
via GitHub
Re: [PR] [SPARK-53179][CORE][TESTS] Use `SparkStreamUtils.toString` instead of `CharStreams.toString` [spark]
via GitHub
Re: [PR] [SPARK-53179][CORE][TESTS] Use `SparkStreamUtils.toString` instead of `CharStreams.toString` [spark]
via GitHub
Re: [PR] [SPARK-53179][CORE][TESTS] Use `SparkStreamUtils.toString` instead of `CharStreams.toString` [spark]
via GitHub
Re: [PR] [SPARK-53179][CORE][TESTS] Use `SparkStreamUtils.toString` instead of `CharStreams.toString` [spark]
via GitHub
[PR] Fix cases where package declarations do not match the directory structure related to SDP [spark]
via GitHub
Re: [PR] Fix cases where package declarations do not match the directory structure related to SDP [spark]
via GitHub
[PR] [SPARK-53178][BUILD] Upgrade `curator` to 5.9.0 [spark]
via GitHub
Re: [PR] [SPARK-53178][BUILD] Upgrade `curator` to 5.9.0 [spark]
via GitHub
Re: [PR] [SPARK-53178][BUILD] Upgrade `curator` to 5.9.0 [spark]
via GitHub
Re: [PR] [SPARK-53178][BUILD] Upgrade `curator` to 5.9.0 [spark]
via GitHub
Re: [PR] [SPARK-53178][BUILD] Upgrade `curator` to 5.9.0 [spark]
via GitHub
[PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
Re: [PR] [SPARK-53176][DEPLOY] Spark launcher should respect `--load-spark-defaults` [spark]
via GitHub
[PR] [SPARK-53177][K8S] Use Java `Base64` instead of `com.google.common.io.BaseEncoding` [spark]
via GitHub
Re: [PR] [SPARK-53177][K8S] Use Java `Base64` instead of `com.google.common.io.BaseEncoding` [spark]
via GitHub
Re: [PR] [SPARK-53177][K8S] Use Java `Base64` instead of `com.google.common.io.BaseEncoding` [spark]
via GitHub
Re: [PR] [SPARK-53177][K8S] Use Java `Base64` instead of `com.google.common.io.BaseEncoding` [spark]
via GitHub
Re: [PR] [SPARK-53177][K8S] Use Java `Base64` instead of `com.google.common.io.BaseEncoding` [spark]
via GitHub
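The SPARK-53177 thread drops Guava's `com.google.common.io.BaseEncoding` in favor of `java.util.Base64`, which has been in the JDK since Java 8 and covers the common encode/decode round trip without the extra dependency. A minimal sketch:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Base64Demo {
    public static void main(String[] args) {
        byte[] raw = "spark".getBytes(StandardCharsets.UTF_8);
        // Encoder/decoder instances are stateless and thread-safe.
        String encoded = Base64.getEncoder().encodeToString(raw);
        byte[] decoded = Base64.getDecoder().decode(encoded);
        System.out.println(encoded);                                      // prints "c3Bhcms="
        System.out.println(new String(decoded, StandardCharsets.UTF_8));  // prints "spark"
    }
}
```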
[PR] [SPARK-53148][CONNECT][SQL] Make SqlCommand in SparkConnectPlanner side effect free [spark]
via GitHub
[PR] [SPARK-53174][CORE] Add TMPDIR environment variable with the value of java.io.tmpdir [spark]
via GitHub
Re: [PR] [SPARK-53174][CORE] Add TMPDIR environment variable with the value of java.io.tmpdir [spark]
via GitHub
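The SPARK-53174 thread proposes exporting a `TMPDIR` environment variable carrying the value of the `java.io.tmpdir` system property, so that child processes and POSIX tools agree with the JVM about the temp directory. The propagation itself is Spark-internal; this sketch only shows the two lookups involved:

```java
public class TmpDirDemo {
    public static void main(String[] args) {
        // The JVM's notion of the temp directory (always set).
        String jvmTmp = System.getProperty("java.io.tmpdir");
        // What POSIX tools such as mktemp consult; may be unset.
        String envTmp = System.getenv("TMPDIR");
        System.out.println("java.io.tmpdir = " + jvmTmp);
        System.out.println("TMPDIR         = " + envTmp);
    }
}
```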
Re: [PR] [MINOR] Fix versioning tag [spark-connect-go]
via GitHub
Re: [PR] [MINOR] Fix versioning tag [spark-connect-go]
via GitHub
[PR] [SPARK-53173][SQL][TESTS] Improve the regex pattern in the replaceNotIncludedMsg method [spark]
via GitHub
Re: [PR] [SPARK-53173][SQL][TESTS] Improve the regex pattern in the replaceNotIncludedMsg method [spark]
via GitHub
Re: [PR] [SPARK-53173][SQL][TESTS] Improve `Owner` regex pattern in the replaceNotIncludedMsg method [spark]
via GitHub
Re: [PR] [SPARK-53173][SQL][TESTS] Improve `Owner` regex pattern in the replaceNotIncludedMsg method [spark]
via GitHub
Re: [PR] [SPARK-53173][SQL][TESTS] Improve `Owner` regex pattern in the replaceNotIncludedMsg method [spark]
via GitHub
Re: [PR] [SPARK-53173][SQL][TESTS] Improve `Owner` regex pattern in the replaceNotIncludedMsg method [spark]
via GitHub
[PR] [SPARK-53171][CORE] Improvement UTF8String repeat [spark]
via GitHub
Re: [PR] [SPARK-53171][CORE] Improvement UTF8String repeat [spark]
via GitHub
Re: [PR] [SPARK-53171][CORE] Improvement UTF8String repeat [spark]
via GitHub
Re: [PR] [SPARK-53171][CORE] Improvement UTF8String repeat [spark]
via GitHub
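The SPARK-53171 thread concerns improving `UTF8String.repeat`; the PR body isn't shown here, but a standard way to speed up repetition of a byte sequence (and plausibly the spirit of such a change) is to fill the output buffer by doubling with `System.arraycopy`, needing O(log n) copies instead of n appends. An illustrative sketch of that technique, not Spark's actual code:

```java
import java.nio.charset.StandardCharsets;

public class RepeatDemo {
    // Repeat src `times` times: write one copy, then keep doubling
    // the already-written prefix until the buffer is full.
    static byte[] repeat(byte[] src, int times) {
        if (times <= 0 || src.length == 0) return new byte[0];
        byte[] out = new byte[src.length * times];
        System.arraycopy(src, 0, out, 0, src.length);
        int copied = src.length;
        while (copied < out.length) {
            int chunk = Math.min(copied, out.length - copied);
            System.arraycopy(out, 0, out, copied, chunk);
            copied += chunk;
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] ab = "ab".getBytes(StandardCharsets.UTF_8);
        System.out.println(new String(repeat(ab, 3), StandardCharsets.UTF_8)); // prints "ababab"
    }
}
```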
[PR] [DRAFT] Custom HiveThriftServer2 [spark]
via GitHub
[PR] [SPARK-52215][PYTHON][TESTS][FOLLOW-UP] Fix `test_arrow_udf_output_nested_arrays` [spark]
via GitHub
Re: [PR] [SPARK-52215][PYTHON][TESTS][FOLLOW-UP] Fix `test_arrow_udf_output_nested_arrays` [spark]
via GitHub
Re: [PR] [SPARK-52215][PYTHON][TESTS][FOLLOW-UP] Fix `test_arrow_udf_output_nested_arrays` [spark]
via GitHub
Re: [PR] [SPARK-52215][PYTHON][TESTS][FOLLOW-UP] Fix `test_arrow_udf_output_nested_arrays` [spark]
via GitHub
Re: [PR] [SPARK-52215][PYTHON][TESTS][FOLLOW-UP] Fix `test_arrow_udf_output_nested_arrays` [spark]
via GitHub
Re: [PR] [SPARK-52215][PYTHON][TESTS][FOLLOW-UP] Fix `test_arrow_udf_output_nested_arrays` [spark]
via GitHub
[PR] [SPARK-53170][CORE] Improve `SparkUserAppException` to have `cause` parameter [spark]
via GitHub
Re: [PR] [SPARK-53170][CORE] Improve `SparkUserAppException` to have `cause` parameter [spark]
via GitHub
Re: [PR] [SPARK-53170][CORE] Improve `SparkUserAppException` to have `cause` parameter [spark]
via GitHub
Re: [PR] [SPARK-53170][CORE] Improve `SparkUserAppException` to have `cause` parameter [spark]
via GitHub
Re: [PR] [SPARK-53170][CORE] Improve `SparkUserAppException` to have `cause` parameter [spark]
via GitHub
[PR] [SPARK-53167][DEPLOY] Spark launcher isRemote also respects properties files [spark]
via GitHub
Re: [PR] [SPARK-53167][DEPLOY] Spark launcher isRemote also respects properties file [spark]
via GitHub
Re: [PR] [SPARK-53167][DEPLOY] Spark launcher isRemote also respects properties file [spark]
via GitHub
Re: [PR] [SPARK-53167][DEPLOY] Spark launcher isRemote also respects properties file [spark]
via GitHub
Re: [PR] [SPARK-53167][DEPLOY] Spark launcher isRemote also respects properties file [spark]
via GitHub
Re: [PR] [SPARK-53167][DEPLOY] Spark launcher isRemote also respects properties file [spark]
via GitHub
Re: [PR] [SPARK-53167][DEPLOY] Spark launcher isRemote also respects properties file [spark]
via GitHub
[PR] [SPARK-53168][SQL][TESTS] Decouple the test cases in the sql module from the configuration values in `log4j2.properties` [spark]
via GitHub
Re: [PR] [SPARK-53168][SQL][TESTS] Decouple the test cases in the sql module from the configuration values in `log4j2.properties` [spark]
via GitHub
Re: [PR] [SPARK-53168][SQL][TESTS] Decouple the test cases in the sql module from the configuration values in `log4j2.properties` [spark]
via GitHub
Re: [PR] [SPARK-53168][CORE][TESTS] Change default value of the input parameter `level` for `SparkFunSuite#withLogAppender` from `None` to `Some(Level.INFO)` [spark]
via GitHub
Re: [PR] [SPARK-53168][CORE][TESTS] Change default value of the input parameter `level` for `SparkFunSuite#withLogAppender` from `None` to `Some(Level.INFO)` [spark]
via GitHub
Re: [PR] [SPARK-53168][CORE][TESTS] Change default value of the input parameter `level` for `SparkFunSuite#withLogAppender` from `None` to `Some(Level.INFO)` [spark]
via GitHub
[PR] [MINOR][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [MINOR][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [MINOR][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL] Fix comments of `appender.file.filter.threshold.level` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL] Fix comments of `appender.file.filter.threshold.level` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL] Fix comments of `appender.file.filter.threshold.level` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL][TESTS] Fix `appender.file.filter.threshold.level` to `warn` in `log4j2.properties` used for testing sql module [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL] Remove comments related to "`Set the logger level of File Appender to`" from `log4j2.properties` [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL] Remove comments related to "`Set the logger level of File Appender to`" from `log4j2.properties` [spark]
via GitHub
Re: [PR] [SPARK-53169][SQL] Remove comments related to "`Set the logger level of File Appender to`" from `log4j2.properties` [spark]
via GitHub
[PR] [SPARK-53165][CORE] Add `SparkExitCode.CLASS_NOT_FOUND` [spark]
via GitHub
Re: [PR] [SPARK-53165][CORE] Add `SparkExitCode.CLASS_NOT_FOUND` [spark]
via GitHub
Re: [PR] [SPARK-53165][CORE] Add `SparkExitCode.CLASS_NOT_FOUND` [spark]
via GitHub
Re: [PR] [SPARK-53165][CORE] Add `SparkExitCode.CLASS_NOT_FOUND` [spark]
via GitHub
Re: [PR] [SPARK-53165][CORE] Add `SparkExitCode.CLASS_NOT_FOUND` [spark]
via GitHub
[PR] [SPARK-53166][CORE] Use `SparkExitCode.EXIT_FAILURE` in `SparkPipelines` object [spark]
via GitHub
Re: [PR] [SPARK-53166][CORE] Use `SparkExitCode.EXIT_FAILURE` in `SparkPipelines` object [spark]
via GitHub
Re: [PR] [SPARK-53166][CORE] Use `SparkExitCode.EXIT_FAILURE` in `SparkPipelines` object [spark]
via GitHub
Re: [PR] [SPARK-53166][CORE] Use `SparkExitCode.EXIT_FAILURE` in `SparkPipelines` object [spark]
via GitHub
Re: [PR] [SPARK-53166][CORE] Use `SparkExitCode.EXIT_FAILURE` in `SparkPipelines` object [spark]
via GitHub
Re: [PR] [SPARK-53166][CORE] Use `SparkExitCode.EXIT_FAILURE` in `SparkPipelines` object [spark]
via GitHub
[PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
Re: [PR] [SPARK-53164][CORE][K8S][DSTREAM] Use Java `Files.readAllBytes` instead of `Files.toByteArray` [spark]
via GitHub
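The SPARK-53164 thread replaces Guava's `Files.toByteArray(File)` with the JDK's `java.nio.file.Files.readAllBytes` (Java 7+), which likewise reads an entire file into memory. A minimal sketch of the round trip:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class ReadAllBytesDemo {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("demo", ".bin");
        byte[] payload = "payload".getBytes(StandardCharsets.UTF_8);
        Files.write(tmp, payload);
        // Reads the whole file into a byte array, matching Guava's
        // Files.toByteArray(File) without the extra dependency.
        byte[] back = Files.readAllBytes(tmp);
        System.out.println(Arrays.equals(payload, back)); // prints "true"
        Files.delete(tmp);
    }
}
```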
[PR] [SPARK-53163][PYTHON][INFRA] Upgrade PyArrow to 21.0.0 [spark]
via GitHub
Re: [PR] [SPARK-53163][PYTHON][INFRA] Upgrade PyArrow to 21.0.0 [spark]
via GitHub
[PR] [MINOR] Remove a wrong comment of SparkTestUtils [spark]
via GitHub