reviews
Messages by Thread
[PR] [SPARK-56430][PYTHON] Remove unnecessary .keys() in dict iterations [spark]
via GitHub
Re: [PR] [SPARK-56430][PYTHON] Remove unnecessary .keys() in dict iterations [spark]
via GitHub
Re: [PR] [SPARK-52709][SQL] Fix parsing of STRUCT<> [spark]
via GitHub
[PR] [SPARK-52709][SQL] Fix parsing of STRUCT<> [spark]
via GitHub
Re: [PR] [SPARK-52709][SQL] Fix parsing of STRUCT<> [spark]
via GitHub
Re: [PR] [SPARK-52709][SQL] Fix parsing of STRUCT<> [spark]
via GitHub
Re: [PR] [SPARK-52709][SQL] Fix parsing of STRUCT<> [spark]
via GitHub
Re: [PR] [SPARK-52709][SQL] Fix parsing of STRUCT<> [spark]
via GitHub
Re: [PR] [SPARK-52709][SQL] Fix parsing of STRUCT<> [spark]
via GitHub
Re: [PR] [SPARK-50186][CORE] Clarify executor OutOfMemoryError handling docs [spark]
via GitHub
Re: [PR] [SPARK-50186][CORE] Clarify executor OutOfMemoryError handling docs [spark]
via GitHub
[PR] [SPARK-56428][PYTHON] Move eval type specific data to eval_conf for UDTF [spark]
via GitHub
Re: [PR] [SPARK-56428][PYTHON] Move eval type specific data to eval_conf for UDTF [spark]
via GitHub
Re: [PR] [SPARK-56428][PYTHON] Move eval type specific data to eval_conf for UDTF [spark]
via GitHub
[PR] [SPARK-56427][PYTHON][INFRA] Remove six and py from requirements.txt [spark]
via GitHub
Re: [PR] [SPARK-56427][PYTHON][INFRA] Remove six and py from requirements.txt [spark]
via GitHub
Re: [PR] [SPARK-56427][PYTHON][INFRA] Remove six and py from requirements.txt [spark]
via GitHub
[PR] [SPARK-56426][SQL] Fix LATERAL VIEW column alias with dot in name [spark]
via GitHub
Re: [PR] [SPARK-56426][SQL] Fix LATERAL VIEW column alias with dot in name [spark]
via GitHub
[PR] [SPARK-56425] Split `DataFrame.swift` into extension-based files [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56425] Split `DataFrame.swift` into extension-based files [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56425] Split `DataFrame.swift` into extension-based files [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56425] Split `DataFrame.swift` into extension-based files [spark-connect-swift]
via GitHub
[PR] [SPARK-56423] Remove manual `Equatable` implementation from `StorageLevel` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56423] Remove manual `Equatable` implementation from `StorageLevel` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56423] Remove manual `Equatable` implementation from `StorageLevel` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56423] Remove manual `Equatable` implementation from `StorageLevel` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56423] Remove manual `Equatable` implementation from `StorageLevel` [spark-connect-swift]
via GitHub
[PR] [SPARK-56422] Introduce `createPlan` method to deduplicate `Plan` creation pattern [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56422] Introduce `createPlan` method to deduplicate `Plan` creation pattern [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56422] Introduce `createPlan` method to deduplicate `Plan` creation pattern [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56422] Introduce `createPlan` method to deduplicate `Plan` creation pattern [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56422] Introduce `createPlan` method to deduplicate `Plan` creation pattern [spark-connect-swift]
via GitHub
[PR] [SPARK-56421] Introduce `IntervalType` protocol for `(DayTime|YearMonth)Interval` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56421] Introduce `IntervalType` protocol for `(DayTime|YearMonth)Interval` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56421] Introduce `IntervalType` protocol for `(DayTime|YearMonth)Interval` [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56421] Introduce `IntervalType` protocol for `(DayTime|YearMonth)Interval` [spark-connect-swift]
via GitHub
[PR] [SPARK-56420] Introduce `toExpression` method to deduplicate literal conversions [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56420] Introduce `toExpression` method to deduplicate literal conversions [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56420] Introduce `toExpression` method to deduplicate literal conversions [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56420] Introduce `toExpression` method to deduplicate literal conversions [spark-connect-swift]
via GitHub
[PR] [SPARK-56419] Use consistent `CaseInsensitiveDictionary` initialization style [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56419] Use consistent `CaseInsensitiveDictionary` initialization style [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56419] Use consistent `CaseInsensitiveDictionary` initialization style [spark-connect-swift]
via GitHub
Re: [PR] [SPARK-56419] Use consistent `CaseInsensitiveDictionary` initialization style [spark-connect-swift]
via GitHub
[PR] [SPARK-56365] Remove hard-coded `APP_VERSION` from Dockerfile [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56365] Remove hard-coded `APP_VERSION` from Dockerfile [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56365] Remove hard-coded `APP_VERSION` from Dockerfile [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56365] Remove hard-coded `APP_VERSION` from Dockerfile [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56365] Remove hard-coded `APP_VERSION` from Dockerfile [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-56418] Unify `getOrCreateLocalFileFor(Driver|Executor)Spec` to `getOrCreateLocalFileForSpec` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56418] Unify `getOrCreateLocalFileFor(Driver|Executor)Spec` to `getOrCreateLocalFileForSpec` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56418] Unify `getOrCreateLocalFileFor(Driver|Executor)Spec` to `getOrCreateLocalFileForSpec` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56418] Unify `getOrCreateLocalFileFor(Driver|Executor)Spec` to `getOrCreateLocalFileForSpec` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56418] Unify `getOrCreateLocalFileFor(Driver|Executor)Spec` to `getOrCreateLocalFileForSpec` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-55939][SQL] Add built-in DataSketches ItemsSketch (Frequent Items) functions [spark]
via GitHub
[PR] [SPARK-56417] Consolidate Lombok and JUnitPlatform configurations into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56417] Consolidate `Lombok` and `JUnitPlatform` configurations into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56417] Consolidate `Lombok` and `JUnitPlatform` configurations into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56417] Consolidate `Lombok` and `JUnitPlatform` configurations into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56417] Consolidate `Lombok` and `JUnitPlatform` configurations into root `build.gradle` [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-56415][INFRA] Simplify create_spark_jira.py for LLM-driven JIRA ticket creation [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Simplify create_spark_jira.py for LLM-driven JIRA ticket creation [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Simplify create_spark_jira.py for LLM-driven JIRA ticket creation [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Simplify create_spark_jira.py for LLM-driven JIRA ticket creation [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Simplify create_spark_jira.py for LLM-driven JIRA ticket creation [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Simplify create_spark_jira.py for LLM-driven JIRA ticket creation [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Simplify create_spark_jira.py for LLM-driven JIRA ticket creation [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Simplify create_spark_jira.py for LLM-driven JIRA ticket creation [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Refactor create_spark_jira.py for LLM use and extract shared utilities [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Refactor create_spark_jira.py for LLM use and extract shared utilities [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Refactor create_spark_jira.py for LLM use and extract shared utilities [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Refactor create_spark_jira.py for LLM use and extract shared utilities [spark]
via GitHub
Re: [PR] [SPARK-56415][INFRA] Refactor create_spark_jira.py for LLM use and extract shared utilities [spark]
via GitHub
[PR] [SPARK-xxxx][SQL] Per-write options should take precedence over session config in Parquet and Avro [spark]
via GitHub
Re: [PR] [SPARK-56414][SQL] Per-write options should take precedence over session config in file source writes [spark]
via GitHub
Re: [PR] [SPARK-56414][SQL] Per-write options should take precedence over session config in file source writes [spark]
via GitHub
[PR] [PROTOTYPE][DO_NOT_MERGE] Streaming write schema evolution [spark]
via GitHub
Re: [PR] [SPARK-56470] Schema evolution for DSv2 sinks during streaming write [spark]
via GitHub
Re: [PR] [SPARK-56470][SQL] Schema evolution for DSv2 sinks during streaming write [spark]
via GitHub
Re: [PR] [SPARK-56470][SQL] Schema evolution for DSv2 sinks during streaming write [spark]
via GitHub
Re: [PR] [SPARK-56470][SQL] Schema evolution for DSv2 sinks during streaming write [spark]
via GitHub
[PR] [WIP][SQL][DML] DSv2 Transaction API [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
Re: [PR] [WIP][SQL][DML] DSv2 Transaction Management [spark]
via GitHub
[PR] [SPARK-56409] Consolidate `Netty` exclusions into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56409] Consolidate `Netty` exclusions into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56409] Consolidate `Netty` exclusions into root `build.gradle` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56409] Consolidate `Netty` exclusions into root `build.gradle` [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-55276][DOCS] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS][SDP] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [PR] [SPARK-56451][DOCS][SDP] Document how SDP datasets are stored and refreshed [spark]
via GitHub
[I] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [I] Document how SDP datasets are stored and refreshed [spark]
via GitHub
Re: [I] Document how SDP datasets are stored and refreshed [spark]
via GitHub
[PR] [WIP][SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
Re: [PR] [SPARK-56410][SQL][CORE] Add bounded k-way merge support in UnsafeExternalSorter to reduce OOM risk [spark]
via GitHub
[PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
Re: [PR] [SPARK-56255][PYTHON][CONNECT] Make spark.read.csv accept DataFrame input [spark]
via GitHub
[PR] [SQL] Add ParquetFormatVersion enum and writer version option [spark]
via GitHub
Re: [PR] [SQL] Add ParquetFormatVersion enum and writer version option [spark]
via GitHub
[PR] [SPARK-56408] Introduce `OwnerResourceDecorator` and use it instead `(Driver|Cluster)Decorator` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56408] Introduce `OwnerResourceDecorator` and use it instead `(Driver|Cluster)Decorator` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56408] Introduce `OwnerResourceDecorator` and use it instead `(Driver|Cluster)Decorator` [spark-kubernetes-operator]
via GitHub
Re: [PR] [SPARK-56408] Introduce `OwnerResourceDecorator` and use it instead `(Driver|Cluster)Decorator` [spark-kubernetes-operator]
via GitHub
[PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
Re: [PR] [SPARK-56407][BUILD][TESTS] Remove pre-built class files and JARs used in artifact transfer tests [spark]
via GitHub
[PR] [SPARK-56406][SS] Stream-stream join v4: skip writing secondary index if the operator will not evict from that side [spark]
via GitHub
Re: [PR] [SPARK-56406][SS] Stream-stream join v4: skip writing secondary index if the operator will not evict from that side [spark]
via GitHub
[PR] [SPARK-XXXXX][PYTHON][TESTS] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
Re: [PR] [SPARK-55306][PYTHON][TESTS][FOLLOW-UP] Skip Kafka streaming RTM tests when dependencies are not installed [spark]
via GitHub
[PR] [SPARK-55910][SQL][TESTS] Merge `SQLTestUtils` into `QueryTest` [spark]
via GitHub
Re: [PR] [SPARK-55748][SQL] Use `DSv2` for `avro|csv|json|kafka|orc|parquet|text` by default [spark]
via GitHub
Re: [PR] [SPARK-55324][PYTHON] Make convert_numpy support ArrayType [spark]
via GitHub
Re: [PR] [SPARK-55324][PYTHON] Make convert_numpy support ArrayType [spark]
via GitHub
[PR] update [spark]
via GitHub
Re: [PR] [SPARK-56019][SQL] Close JDBC connection on task kill to unblock native socket reads [spark]
via GitHub
Re: [PR] [SPARK-56019][SQL] Close JDBC connection on task kill to unblock native socket reads [spark]
via GitHub
Re: [PR] [SPARK-56019][SQL] Close JDBC connection on task kill to unblock native socket reads [spark]
via GitHub
Re: [PR] [SPARK-56019][SQL] Close JDBC connection on task kill to unblock native socket reads [spark]
via GitHub
[PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub
Re: [PR] [SPARK-56402][SS] Apply rangeScan API in stream-stream join format version 4 [spark]
via GitHub