[jira] [Created] (SPARK-46595) Refine docstring of `map_from_arrays/map_from_entries/map_concat`

2024-01-04 Thread Yang Jie (Jira)
Yang Jie created SPARK-46595:


 Summary: Refine docstring of 
`map_from_arrays/map_from_entries/map_concat`
 Key: SPARK-46595
 URL: https://issues.apache.org/jira/browse/SPARK-46595
 Project: Spark
  Issue Type: Sub-task
  Components: Documentation, PySpark
Affects Versions: 4.0.0
Reporter: Yang Jie






--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Resolved] (SPARK-46585) `metricPeaks` can be directly constructed as an immutable.ArraySeq instead of using `mutable.ArraySeq.toSeq` in `Executor`

2024-01-04 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46585?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46585.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44586
[https://github.com/apache/spark/pull/44586]

> `metricPeaks` can be directly constructed as an immutable.ArraySeq instead 
> of using `mutable.ArraySeq.toSeq` in `Executor`
> --
>
> Key: SPARK-46585
> URL: https://issues.apache.org/jira/browse/SPARK-46585
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-46550) Upgrade `datasketches-java` to 5.0.1

2024-01-03 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46550?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46550:
-
Description: 
* [https://github.com/apache/datasketches-java/releases/tag/4.0.0]
 * [https://github.com/apache/datasketches-java/releases/tag/4.1.0]
 * [https://github.com/apache/datasketches-java/releases/tag/4.2.0]
 * [https://github.com/apache/datasketches-java/releases/tag/5.0.0]
 * [https://github.com/apache/datasketches-java/releases/tag/5.0.1]

  was:
* [https://github.com/apache/datasketches-java/releases/tag/4.0.0]
 * [https://github.com/apache/datasketches-java/releases/tag/4.1.0]
 * [https://github.com/apache/datasketches-java/releases/tag/4.2.0]
 * https://github.com/apache/datasketches-java/releases/tag/5.0.0


> Upgrade `datasketches-java` to 5.0.1
> 
>
> Key: SPARK-46550
> URL: https://issues.apache.org/jira/browse/SPARK-46550
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> * [https://github.com/apache/datasketches-java/releases/tag/4.0.0]
>  * [https://github.com/apache/datasketches-java/releases/tag/4.1.0]
>  * [https://github.com/apache/datasketches-java/releases/tag/4.2.0]
>  * [https://github.com/apache/datasketches-java/releases/tag/5.0.0]
>  * [https://github.com/apache/datasketches-java/releases/tag/5.0.1]






[jira] [Updated] (SPARK-46586) Support `s.c.immutable.ArraySeq` as `customCollectionCls` in `MapObjects`

2024-01-03 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46586?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46586:
-
Description: 
{code:java}
val myUdf = udf((a: immutable.ArraySeq[Int]) =>
    immutable.ArraySeq.unsafeWrapArray[Int](Array(a.head + 99)))
  checkAnswer(Seq(Array(1))
    .toDF("col")
    .select(myUdf(Column("col"))),
    Row(ArrayBuffer(100))) {code}
This test fails in Spark 4.0 when using Scala 2.13.
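As a standalone sketch of the collection conversion involved (no Spark required), `unsafeWrapArray` wraps an `Array` as an `immutable.ArraySeq` without copying, which is the collection type `MapObjects` would need to support as `customCollectionCls`:

```scala
import scala.collection.immutable

// Minimal illustration of the conversion the UDF above performs:
// unsafeWrapArray wraps an Array[Int] as an immutable.ArraySeq
// without copying the underlying array.
object ArraySeqDemo {
  def main(args: Array[String]): Unit = {
    val wrapped: immutable.ArraySeq[Int] =
      immutable.ArraySeq.unsafeWrapArray(Array(1, 100))
    assert(wrapped.head == 1)
    assert(wrapped == Seq(1, 100))
  }
}
```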

> Support `s.c.immutable.ArraySeq` as `customCollectionCls` in `MapObjects`
> -
>
> Key: SPARK-46586
> URL: https://issues.apache.org/jira/browse/SPARK-46586
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>
> {code:java}
> val myUdf = udf((a: immutable.ArraySeq[Int]) =>
>     immutable.ArraySeq.unsafeWrapArray[Int](Array(a.head + 99)))
>   checkAnswer(Seq(Array(1))
>     .toDF("col")
>     .select(myUdf(Column("col"))),
>     Row(ArrayBuffer(100))) {code}
> This test fails in Spark 4.0 when using Scala 2.13.






[jira] [Updated] (SPARK-46586) Support `s.c.immutable.ArraySeq` as `customCollectionCls` in `MapObjects`

2024-01-03 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46586?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46586:
-
Summary: Support `s.c.immutable.ArraySeq` as `customCollectionCls` in 
`MapObjects`  (was: Make `MapObjects` support `s.c.immutable.ArraySeq`)

> Support `s.c.immutable.ArraySeq` as `customCollectionCls` in `MapObjects`
> -
>
> Key: SPARK-46586
> URL: https://issues.apache.org/jira/browse/SPARK-46586
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>







[jira] [Created] (SPARK-46586) Make `MapObjects` support `s.c.immutable.ArraySeq`

2024-01-03 Thread Yang Jie (Jira)
Yang Jie created SPARK-46586:


 Summary: Make `MapObjects` support `s.c.immutable.ArraySeq`
 Key: SPARK-46586
 URL: https://issues.apache.org/jira/browse/SPARK-46586
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46585) `metricPeaks` can be directly constructed as an immutable.ArraySeq instead of using `mutable.ArraySeq.toSeq` in `Executor`

2024-01-03 Thread Yang Jie (Jira)
Yang Jie created SPARK-46585:


 Summary: `metricPeaks` can be directly constructed as an 
immutable.ArraySeq instead of using `mutable.ArraySeq.toSeq` in `Executor`
 Key: SPARK-46585
 URL: https://issues.apache.org/jira/browse/SPARK-46585
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46578) Remove ThreadLocal for fallbackConf

2024-01-03 Thread Yang Jie (Jira)
Yang Jie created SPARK-46578:


 Summary: Remove ThreadLocal for fallbackConf
 Key: SPARK-46578
 URL: https://issues.apache.org/jira/browse/SPARK-46578
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46573) Should always use `appId` instead of `conf.appId` in `LoggingPodStatusWatcherImpl`

2024-01-02 Thread Yang Jie (Jira)
Yang Jie created SPARK-46573:


 Summary: Should always use `appId` instead of `conf.appId` in 
`LoggingPodStatusWatcherImpl`
 Key: SPARK-46573
 URL: https://issues.apache.org/jira/browse/SPARK-46573
 Project: Spark
  Issue Type: Improvement
  Components: Kubernetes
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Resolved] (SPARK-46559) Wrap the `export` in the package name with backticks

2024-01-02 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46559?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46559.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44555
[https://github.com/apache/spark/pull/44555]

> Wrap the `export` in the package name with backticks
> 
>
> Key: SPARK-46559
> URL: https://issues.apache.org/jira/browse/SPARK-46559
> Project: Spark
>  Issue Type: Improvement
>  Components: MLlib
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> `export` will be a keyword in Scala 3; using it directly in a package name 
> will cause a compilation error.






[jira] [Assigned] (SPARK-46559) Wrap the `export` in the package name with backticks

2024-01-02 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46559?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46559:


Assignee: Yang Jie

> Wrap the `export` in the package name with backticks
> 
>
> Key: SPARK-46559
> URL: https://issues.apache.org/jira/browse/SPARK-46559
> Project: Spark
>  Issue Type: Improvement
>  Components: MLlib
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Minor
>  Labels: pull-request-available
>
> `export` will be a keyword in Scala 3; using it directly in a package name 
> will cause a compilation error.






[jira] [Commented] (SPARK-46257) Upgrade Derby to 10.16.1.1

2024-01-02 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-46257?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17801763#comment-17801763
 ] 

Yang Jie commented on SPARK-46257:
--

[~julienlau] Has 10.16.1.2 been released? I still can't find it in the Maven 
Central Repository.

> Upgrade Derby to 10.16.1.1
> --
>
> Key: SPARK-46257
> URL: https://issues.apache.org/jira/browse/SPARK-46257
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> https://db.apache.org/derby/releases/release-10_16_1_1.cgi






[jira] [Resolved] (SPARK-46561) Use `exists` instead of `filter + nonEmpty` to get `showResourceColumn` in `MasterPage.scala`

2024-01-01 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46561?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46561.
--
Resolution: Won't Fix

> Use `exists` instead of `filter + nonEmpty` to get `showResourceColumn` in 
> `MasterPage.scala`
> -
>
> Key: SPARK-46561
> URL: https://issues.apache.org/jira/browse/SPARK-46561
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core, Web UI
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>  Labels: pull-request-available
>
> {code:java}
> def render(request: HttpServletRequest): Seq[Node] = {
>   val state = getMasterState
>   val showResourceColumn = 
> state.workers.filter(_.resourcesInfoUsed.nonEmpty).nonEmpty{code}
> We can use `exists` instead of 
> `workers.filter(_.resourcesInfoUsed.nonEmpty).nonEmpty`.






[jira] [Updated] (SPARK-46562) Remove retrieval of `keytabFile` from `UserGroupInformation` in `HiveAuthFactory`

2024-01-01 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46562?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46562:
-
Summary: Remove retrieval of `keytabFile` from `UserGroupInformation` in 
`HiveAuthFactory`  (was: Remove the process of obtaining `keytabFile` from 
`UserGroupInformation` in  `HiveAuthFactory`)

> Remove retrieval of `keytabFile` from `UserGroupInformation` in 
> `HiveAuthFactory`
> -
>
> Key: SPARK-46562
> URL: https://issues.apache.org/jira/browse/SPARK-46562
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>







[jira] [Created] (SPARK-46562) Remove the process of obtaining `keytabFile` from `UserGroupInformation` in `HiveAuthFactory`

2024-01-01 Thread Yang Jie (Jira)
Yang Jie created SPARK-46562:


 Summary: Remove the process of obtaining `keytabFile` from 
`UserGroupInformation` in  `HiveAuthFactory`
 Key: SPARK-46562
 URL: https://issues.apache.org/jira/browse/SPARK-46562
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46561) Use `exists` instead of `filter + nonEmpty` to get `showResourceColumn` in `MasterPage.scala`

2024-01-01 Thread Yang Jie (Jira)
Yang Jie created SPARK-46561:


 Summary: Use `exists` instead of `filter + nonEmpty` to get 
`showResourceColumn` in `MasterPage.scala`
 Key: SPARK-46561
 URL: https://issues.apache.org/jira/browse/SPARK-46561
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core, Web UI
Affects Versions: 4.0.0
Reporter: Yang Jie


{code:java}
def render(request: HttpServletRequest): Seq[Node] = {
  val state = getMasterState

  val showResourceColumn = 
state.workers.filter(_.resourcesInfoUsed.nonEmpty).nonEmpty{code}
We can use `exists` instead of 
`workers.filter(_.resourcesInfoUsed.nonEmpty).nonEmpty`.
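The refactor can be sketched outside Spark; `Worker` and `resourcesInfoUsed` below are minimal stand-ins for the real Spark deploy types:

```scala
object ExistsDemo {
  // Stand-in for the Spark worker state referenced in the snippet above.
  final case class Worker(resourcesInfoUsed: Map[String, Int])

  def main(args: Array[String]): Unit = {
    val workers = Seq(Worker(Map.empty), Worker(Map("gpu" -> 2)))

    // Before: materializes a filtered Seq only to test whether it is empty.
    val before = workers.filter(_.resourcesInfoUsed.nonEmpty).nonEmpty
    // After: `exists` short-circuits on the first match and allocates nothing.
    val after = workers.exists(_.resourcesInfoUsed.nonEmpty)

    assert(before && after)
  }
}
```

Both forms compute the same boolean; `exists` simply avoids building the intermediate collection.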






[jira] [Created] (SPARK-46559) Wrap the `export` in the package name with backticks

2024-01-01 Thread Yang Jie (Jira)
Yang Jie created SPARK-46559:


 Summary: Wrap the `export` in the package name with backticks
 Key: SPARK-46559
 URL: https://issues.apache.org/jira/browse/SPARK-46559
 Project: Spark
  Issue Type: Improvement
  Components: MLlib
Affects Versions: 4.0.0
Reporter: Yang Jie


`export` will be a keyword in Scala 3; using it directly in a package name 
will cause a compilation error.
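The escape mechanism can be shown in miniature: backticks let any reserved word act as an ordinary identifier, and the same escape applies to package segments (e.g. a hypothetical `package org.example.`export``). The `val` below is purely illustrative; the Spark change concerns a package name.

```scala
object BacktickDemo {
  def main(args: Array[String]): Unit = {
    // Without backticks this would be a syntax error once `export`
    // becomes a keyword; with them it parses as a plain identifier.
    val `export`: String = "escaped"
    assert(`export` == "escaped")
  }
}
```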






[jira] [Created] (SPARK-46558) Extract a helper method to eliminate the duplicate code in `GrpcExceptionConverter` that retrieves `MessageParameters` from `ErrorParams`

2024-01-01 Thread Yang Jie (Jira)
Yang Jie created SPARK-46558:


 Summary: Extract a helper method to eliminate the duplicate code 
in `GrpcExceptionConverter` that retrieves `MessageParameters` from 
`ErrorParams`
 Key: SPARK-46558
 URL: https://issues.apache.org/jira/browse/SPARK-46558
 Project: Spark
  Issue Type: Improvement
  Components: Connect
Affects Versions: 4.0.0
Reporter: Yang Jie


{code:java}
params.errorClass match {
  case Some(_) => params.messageParameters
  case None => Map("message" -> params.message)
} {code}
The above code pattern appears 7 times in `GrpcExceptionConverter`.
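A sketch of the proposed extraction; `ErrorParams` here is a minimal stand-in for the Spark Connect type, and the helper name `messageParams` is hypothetical:

```scala
object HelperDemo {
  // Stand-in for the Spark Connect ErrorParams type.
  final case class ErrorParams(
      message: String,
      errorClass: Option[String],
      messageParameters: Map[String, String])

  // One shared helper replaces the match repeated seven times
  // in GrpcExceptionConverter.
  def messageParams(params: ErrorParams): Map[String, String] =
    params.errorClass match {
      case Some(_) => params.messageParameters
      case None    => Map("message" -> params.message)
    }

  def main(args: Array[String]): Unit = {
    assert(messageParams(ErrorParams("boom", None, Map.empty)) ==
      Map("message" -> "boom"))
    assert(messageParams(ErrorParams("x", Some("ERR"), Map("k" -> "v"))) ==
      Map("k" -> "v"))
  }
}
```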






[jira] [Resolved] (SPARK-46554) Upgrade slf4j to 2.0.10

2024-01-01 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46554?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46554.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44544
[https://github.com/apache/spark/pull/44544]

> Upgrade slf4j to 2.0.10
> ---
>
> Key: SPARK-46554
> URL: https://issues.apache.org/jira/browse/SPARK-46554
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-46551) Refine docstring of `flatten/sequence/shuffle`

2024-01-01 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46551?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46551.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44548
[https://github.com/apache/spark/pull/44548]

> Refine docstring of `flatten/sequence/shuffle`
> --
>
> Key: SPARK-46551
> URL: https://issues.apache.org/jira/browse/SPARK-46551
> Project: Spark
>  Issue Type: Sub-task
>  Components: PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-46551) Refine docstring of `flatten/sequence/shuffle`

2024-01-01 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46551?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46551:


Assignee: Yang Jie

> Refine docstring of `flatten/sequence/shuffle`
> --
>
> Key: SPARK-46551
> URL: https://issues.apache.org/jira/browse/SPARK-46551
> Project: Spark
>  Issue Type: Sub-task
>  Components: PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-46551) Refine docstring of `flatten/sequence/shuffle`

2023-12-31 Thread Yang Jie (Jira)
Yang Jie created SPARK-46551:


 Summary: Refine docstring of `flatten/sequence/shuffle`
 Key: SPARK-46551
 URL: https://issues.apache.org/jira/browse/SPARK-46551
 Project: Spark
  Issue Type: Sub-task
  Components: PySpark
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46550) Upgrade `datasketches-java` to 5.0.0

2023-12-31 Thread Yang Jie (Jira)
Yang Jie created SPARK-46550:


 Summary: Upgrade `datasketches-java` to 5.0.0
 Key: SPARK-46550
 URL: https://issues.apache.org/jira/browse/SPARK-46550
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


* [https://github.com/apache/datasketches-java/releases/tag/4.0.0]
 * [https://github.com/apache/datasketches-java/releases/tag/4.1.0]
 * [https://github.com/apache/datasketches-java/releases/tag/4.2.0]
 * https://github.com/apache/datasketches-java/releases/tag/5.0.0






[jira] [Resolved] (SPARK-46548) Refine docstring of `get/array_zip/sort_array`

2023-12-30 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46548?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46548.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44545
[https://github.com/apache/spark/pull/44545]

> Refine docstring of `get/array_zip/sort_array`
> --
>
> Key: SPARK-46548
> URL: https://issues.apache.org/jira/browse/SPARK-46548
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-46548) Refine docstring of `get/array_zip/sort_array`

2023-12-30 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46548?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46548:


Assignee: Yang Jie

> Refine docstring of `get/array_zip/sort_array`
> --
>
> Key: SPARK-46548
> URL: https://issues.apache.org/jira/browse/SPARK-46548
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-46548) Refine docstring of `get/array_zip/sort_array`

2023-12-30 Thread Yang Jie (Jira)
Yang Jie created SPARK-46548:


 Summary: Refine docstring of `get/array_zip/sort_array`
 Key: SPARK-46548
 URL: https://issues.apache.org/jira/browse/SPARK-46548
 Project: Spark
  Issue Type: Sub-task
  Components: Documentation, PySpark
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46542) Remove the check for `c>=0` from `ExternalCatalogUtils#needsEscaping`

2023-12-28 Thread Yang Jie (Jira)
Yang Jie created SPARK-46542:


 Summary: Remove the check for `c>=0` from 
`ExternalCatalogUtils#needsEscaping`
 Key: SPARK-46542
 URL: https://issues.apache.org/jira/browse/SPARK-46542
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 4.0.0
Reporter: Yang Jie


 
{code:java}
def needsEscaping(c: Char): Boolean = {
  c >= 0 && c < charToEscape.size() && charToEscape.get(c)
} {code}
The numeric range of `Char` in Scala is 0 to 65,535, so `c >= 0` is always 
true and can be removed.
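This can be checked directly: Scala's `Char` is unsigned, so every value already satisfies `c >= 0`. The `charToEscape` bit set below is a stand-in for the one in `ExternalCatalogUtils`.

```scala
object CharRangeDemo {
  def main(args: Array[String]): Unit = {
    // Char is unsigned: its full range is 0 to 65535.
    assert(Char.MinValue.toInt == 0)
    assert(Char.MaxValue.toInt == 65535)

    val charToEscape = new java.util.BitSet()
    charToEscape.set('%')
    // Simplified form with the redundant `c >= 0` guard removed.
    def needsEscaping(c: Char): Boolean =
      c < charToEscape.size() && charToEscape.get(c)
    assert(needsEscaping('%'))
    assert(!needsEscaping('a'))
  }
}
```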






[jira] [Assigned] (SPARK-46533) Refine docstring of `array_min/array_max/array_size/array_repeat`

2023-12-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46533?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46533:


Assignee: Yang Jie

> Refine docstring of `array_min/array_max/array_size/array_repeat`
> -
>
> Key: SPARK-46533
> URL: https://issues.apache.org/jira/browse/SPARK-46533
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-46533) Refine docstring of `array_min/array_max/array_size/array_repeat`

2023-12-28 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46533?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46533.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44522
[https://github.com/apache/spark/pull/44522]

> Refine docstring of `array_min/array_max/array_size/array_repeat`
> -
>
> Key: SPARK-46533
> URL: https://issues.apache.org/jira/browse/SPARK-46533
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Created] (SPARK-46533) Refine docstring of `array_min/array_max/array_size/array_repeat`

2023-12-27 Thread Yang Jie (Jira)
Yang Jie created SPARK-46533:


 Summary: Refine docstring of 
`array_min/array_max/array_size/array_repeat`
 Key: SPARK-46533
 URL: https://issues.apache.org/jira/browse/SPARK-46533
 Project: Spark
  Issue Type: Sub-task
  Components: Documentation, PySpark
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46531) Move the version management of 'datasketches-java' from 'catalyst' to 'parent'

2023-12-27 Thread Yang Jie (Jira)
Yang Jie created SPARK-46531:


 Summary: Move the version management of 'datasketches-java' from 
'catalyst' to 'parent'
 Key: SPARK-46531
 URL: https://issues.apache.org/jira/browse/SPARK-46531
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46521) Refine docstring of `array_compact/array_distinct/array_remove`

2023-12-26 Thread Yang Jie (Jira)
Yang Jie created SPARK-46521:


 Summary: Refine docstring of 
`array_compact/array_distinct/array_remove`
 Key: SPARK-46521
 URL: https://issues.apache.org/jira/browse/SPARK-46521
 Project: Spark
  Issue Type: Sub-task
  Components: Documentation, PySpark
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Assigned] (SPARK-46506) Refine docstring of `array_intersect/array_union/array_except`

2023-12-26 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46506?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46506:


Assignee: Yang Jie

> Refine docstring of `array_intersect/array_union/array_except`
> --
>
> Key: SPARK-46506
> URL: https://issues.apache.org/jira/browse/SPARK-46506
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-46506) Refine docstring of `array_intersect/array_union/array_except`

2023-12-26 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46506?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46506.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44490
[https://github.com/apache/spark/pull/44490]

> Refine docstring of `array_intersect/array_union/array_except`
> --
>
> Key: SPARK-46506
> URL: https://issues.apache.org/jira/browse/SPARK-46506
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Created] (SPARK-46508) Upgrade Jackson to 2.16.1

2023-12-25 Thread Yang Jie (Jira)
Yang Jie created SPARK-46508:


 Summary: Upgrade Jackson to 2.16.1
 Key: SPARK-46508
 URL: https://issues.apache.org/jira/browse/SPARK-46508
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


https://github.com/FasterXML/jackson/wiki/Jackson-Release-2.16.1






[jira] [Created] (SPARK-46506) Refine docstring of `array_intersect/array_union/array_except`

2023-12-25 Thread Yang Jie (Jira)
Yang Jie created SPARK-46506:


 Summary: Refine docstring of 
`array_intersect/array_union/array_except`
 Key: SPARK-46506
 URL: https://issues.apache.org/jira/browse/SPARK-46506
 Project: Spark
  Issue Type: Sub-task
  Components: Documentation, PySpark
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Updated] (SPARK-46498) Remove an unused local variable from `o.a.spark.util.Utils#getConfiguredLocalDirs`

2023-12-25 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46498?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46498:
-
Summary: Remove an unused local variable from 
`o.a.spark.util.Utils#getConfiguredLocalDirs`  (was: Clean up unused methods 
and local variables in `o.a.s.util.Utils`)

> Remove an unused local variable from 
> `o.a.spark.util.Utils#getConfiguredLocalDirs`
> ---
>
> Key: SPARK-46498
> URL: https://issues.apache.org/jira/browse/SPARK-46498
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-46498) Clean up unused methods and local variables in `o.a.s.util.Utils`

2023-12-24 Thread Yang Jie (Jira)
Yang Jie created SPARK-46498:


 Summary: Clean up unused methods and local variables in 
`o.a.s.util.Utils`
 Key: SPARK-46498
 URL: https://issues.apache.org/jira/browse/SPARK-46498
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46497) Re-enable the test cases that were ignored in SPARK-45309

2023-12-24 Thread Yang Jie (Jira)
Yang Jie created SPARK-46497:


 Summary: Re-enable the test cases that were ignored in SPARK-45309
 Key: SPARK-46497
 URL: https://issues.apache.org/jira/browse/SPARK-46497
 Project: Spark
  Issue Type: Improvement
  Components: SQL, Tests
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46493) Upgrade `apache-rat` to 0.15

2023-12-23 Thread Yang Jie (Jira)
Yang Jie created SPARK-46493:


 Summary: Upgrade `apache-rat` to 0.15
 Key: SPARK-46493
 URL: https://issues.apache.org/jira/browse/SPARK-46493
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Updated] (SPARK-46492) Simplify the Java version check in `SparkBuild.scala`

2023-12-23 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46492?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46492:
-
Summary: Simplify the Java version check in `SparkBuild.scala`  (was: Clean 
up Java version check in `SparkBuild.scala`)

> Simplify the Java version check in `SparkBuild.scala`
> -
>
> Key: SPARK-46492
> URL: https://issues.apache.org/jira/browse/SPARK-46492
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>







[jira] [Created] (SPARK-46492) Clean up Java version check in `SparkBuild.scala`

2023-12-23 Thread Yang Jie (Jira)
Yang Jie created SPARK-46492:


 Summary: Clean up Java version check in `SparkBuild.scala`
 Key: SPARK-46492
 URL: https://issues.apache.org/jira/browse/SPARK-46492
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Assigned] (SPARK-46479) Change Utils.isJavaVersionAtLeast21 to use the utility method from commons-lang3

2023-12-23 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46479?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46479:


Assignee: Yang Jie

> Change Utils.isJavaVersionAtLeast21 to use the utility method from 
> commons-lang3
> 
>
> Key: SPARK-46479
> URL: https://issues.apache.org/jira/browse/SPARK-46479
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>
> commons-lang3 added support for checking `JAVA_21` after version 3.13.0, so 
> we can directly use the utility methods provided by commons-lang3.






[jira] [Resolved] (SPARK-46479) Change Utils.isJavaVersionAtLeast21 to use the utility method from commons-lang3

2023-12-23 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46479?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46479.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 4
[https://github.com/apache/spark/pull/4]

> Change Utils.isJavaVersionAtLeast21 to use the utility method from 
> commons-lang3
> 
>
> Key: SPARK-46479
> URL: https://issues.apache.org/jira/browse/SPARK-46479
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> commons-lang3 added support for checking `JAVA_21` after version 3.13.0, so 
> we can directly use the utility methods provided by commons-lang3.






[jira] [Assigned] (SPARK-46483) Exclude `apache-rat-*.jar` from `ScalaUnidoc / unidoc / unidocAllClasspaths`

2023-12-22 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46483?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46483:


Assignee: Yang Jie

> Exclude `apache-rat-*.jar` from `ScalaUnidoc / unidoc / unidocAllClasspaths`
> 
>
> Key: SPARK-46483
> URL: https://issues.apache.org/jira/browse/SPARK-46483
> Project: Spark
>  Issue Type: Improvement
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-46483) Exclude `apache-rat-*.jar` from `ScalaUnidoc / unidoc / unidocAllClasspaths`

2023-12-22 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46483?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46483.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44453
[https://github.com/apache/spark/pull/44453]

> Exclude `apache-rat-*.jar` from `ScalaUnidoc / unidoc / unidocAllClasspaths`
> 
>
> Key: SPARK-46483
> URL: https://issues.apache.org/jira/browse/SPARK-46483
> Project: Spark
>  Issue Type: Improvement
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Created] (SPARK-46483) Exclude `apache-rat-*.jar` from `ScalaUnidoc / unidoc / unidocAllClasspaths`

2023-12-21 Thread Yang Jie (Jira)
Yang Jie created SPARK-46483:


 Summary: Exclude `apache-rat-*.jar` from `ScalaUnidoc / unidoc / 
unidocAllClasspaths`
 Key: SPARK-46483
 URL: https://issues.apache.org/jira/browse/SPARK-46483
 Project: Spark
  Issue Type: Improvement
  Components: Project Infra
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Assigned] (SPARK-46472) Refine docstring of `array_prepend/array_append/array_insert`

2023-12-21 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46472?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46472:


Assignee: Yang Jie

> Refine docstring of `array_prepend/array_append/array_insert`
> -
>
> Key: SPARK-46472
> URL: https://issues.apache.org/jira/browse/SPARK-46472
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-46472) Refine docstring of `array_prepend/array_append/array_insert`

2023-12-21 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46472?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46472.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44436
[https://github.com/apache/spark/pull/44436]

> Refine docstring of `array_prepend/array_append/array_insert`
> -
>
> Key: SPARK-46472
> URL: https://issues.apache.org/jira/browse/SPARK-46472
> Project: Spark
>  Issue Type: Sub-task
>  Components: Documentation, PySpark
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Resolved] (SPARK-46476) Move `IvyTestUtils` back to test dir

2023-12-21 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46476?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46476.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 0
[https://github.com/apache/spark/pull/0]

> Move `IvyTestUtils` back to test dir
> 
>
> Key: SPARK-46476
> URL: https://issues.apache.org/jira/browse/SPARK-46476
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-46476) Move `IvyTestUtils` back to test dir

2023-12-21 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46476?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46476:


Assignee: Yang Jie

> Move `IvyTestUtils` back to test dir
> 
>
> Key: SPARK-46476
> URL: https://issues.apache.org/jira/browse/SPARK-46476
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-46479) Change Utils.isJavaVersionAtLeast21 to use the utility method from commons-lang3

2023-12-21 Thread Yang Jie (Jira)
Yang Jie created SPARK-46479:


 Summary: Change Utils.isJavaVersionAtLeast21 to use the utility 
method from commons-lang3
 Key: SPARK-46479
 URL: https://issues.apache.org/jira/browse/SPARK-46479
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie


commons-lang3 added support for checking `JAVA_21` after version 3.13.0, so we 
can directly use the utility methods provided by commons-lang3.
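The intended change can be sketched as follows; the enclosing object and method name here simply mirror the existing `Utils.isJavaVersionAtLeast21` helper for illustration, while the `SystemUtils`/`JavaVersion` API is the one commons-lang3 provides from 3.13.0 onward:

{code:java}
import org.apache.commons.lang3.{JavaVersion, SystemUtils}

object JavaVersionCheck {
  // Instead of hand-parsing the "java.version" system property, delegate to
  // commons-lang3, which gained JavaVersion.JAVA_21 in release 3.13.0.
  def isJavaVersionAtLeast21: Boolean =
    SystemUtils.isJavaVersionAtLeast(JavaVersion.JAVA_21)
}
{code}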






[jira] [Created] (SPARK-46476) Move `IvyTestUtils` back to test dir

2023-12-21 Thread Yang Jie (Jira)
Yang Jie created SPARK-46476:


 Summary: Move `IvyTestUtils` back to test dir
 Key: SPARK-46476
 URL: https://issues.apache.org/jira/browse/SPARK-46476
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46472) Refine docstring of `array_prepend/array_append/array_insert`

2023-12-20 Thread Yang Jie (Jira)
Yang Jie created SPARK-46472:


 Summary: Refine docstring of 
`array_prepend/array_append/array_insert`
 Key: SPARK-46472
 URL: https://issues.apache.org/jira/browse/SPARK-46472
 Project: Spark
  Issue Type: Sub-task
  Components: Documentation, PySpark
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46469) Clean up useless local variables in `InsertIntoHiveTable`

2023-12-20 Thread Yang Jie (Jira)
Yang Jie created SPARK-46469:


 Summary: Clean up useless local variables in `InsertIntoHiveTable`
 Key: SPARK-46469
 URL: https://issues.apache.org/jira/browse/SPARK-46469
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Commented] (SPARK-46461) The `sbt console` command is not available

2023-12-20 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-46461?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17798975#comment-17798975
 ] 

Yang Jie commented on SPARK-46461:
--

cc [~srowen] [~dongjoon] [~gurwls223] 

I'm not certain whether the `sbt console` command needs to be fixed, but I've found 
configurations related to the `console` command in `SparkBuild.scala` that 
appear to have been unusable for quite some time.

 

[https://github.com/apache/spark/blob/c9cfaac90fd423c3a38e295234e24744b946cb02/project/SparkBuild.scala#L1126-L1148]

 
{code:java}
object SQL {
  lazy val settings = Seq(
    (console / initialCommands) :=
      """
        |import org.apache.spark.SparkContext
        |import org.apache.spark.sql.SQLContext
        |import org.apache.spark.sql.catalyst.analysis._
        |import org.apache.spark.sql.catalyst.dsl._
        |import org.apache.spark.sql.catalyst.errors._
        |import org.apache.spark.sql.catalyst.expressions._
        |import org.apache.spark.sql.catalyst.plans.logical._
        |import org.apache.spark.sql.catalyst.rules._
        |import org.apache.spark.sql.catalyst.util._
        |import org.apache.spark.sql.execution
        |import org.apache.spark.sql.functions._
        |import org.apache.spark.sql.types._
        |
        |val sc = new SparkContext("local[*]", "dev-shell")
        |val sqlContext = new SQLContext(sc)
        |import sqlContext.implicits._
        |import sqlContext._
      """.stripMargin,
    (console / cleanupCommands) := "sc.stop()"
  )
} {code}
 

[https://github.com/apache/spark/blob/c9cfaac90fd423c3a38e295234e24744b946cb02/project/SparkBuild.scala#L1164-L1180]

 
{code:java}
    (console / initialCommands) :=
      """
        |import org.apache.spark.SparkContext
        |import org.apache.spark.sql.catalyst.analysis._
        |import org.apache.spark.sql.catalyst.dsl._
        |import org.apache.spark.sql.catalyst.errors._
        |import org.apache.spark.sql.catalyst.expressions._
        |import org.apache.spark.sql.catalyst.plans.logical._
        |import org.apache.spark.sql.catalyst.rules._
        |import org.apache.spark.sql.catalyst.util._
        |import org.apache.spark.sql.execution
        |import org.apache.spark.sql.functions._
        |import org.apache.spark.sql.hive._
        |import org.apache.spark.sql.hive.test.TestHive._
        |import org.apache.spark.sql.hive.test.TestHive.implicits._
        |import org.apache.spark.sql.types._""".stripMargin,
    (console / cleanupCommands) := "sparkContext.stop()", {code}

> The `sbt console` command is not available
> --
>
> Key: SPARK-46461
> URL: https://issues.apache.org/jira/browse/SPARK-46461
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>
> # Unable to define expressions after executing the `build/sbt console` command
> {code:java}
> scala> val i = 1 // show
> package $line3 {
>   sealed class $read extends _root_.scala.Serializable {
>     def <init>() = {
>       super.<init>;
>       ()
>     };
>     sealed class $iw extends _root_.java.io.Serializable {
>       def <init>() = {
>         super.<init>;
>         ()
>       };
>       val i = 1
>     };
>     val $iw = new $iw.<init>
>   };
>   object $read extends scala.AnyRef {
>     def <init>() = {
>       super.<init>;
>       ()
>     };
>     val INSTANCE = new $read.<init>
>   }
> }
> warning: -target is deprecated: Use -release instead to compile against the 
> correct platform API.
> Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
>        ^
>        error: expected class or object definition {code}
> 2.  Due to the default unused imports check, the error "unused imports" will 
> be reported after executing the `build/sbt sql/console` command
> {code:java}
> Welcome to Scala 2.13.12 (OpenJDK 64-Bit Server VM, Java 17.0.9).
> Type in expressions for evaluation. Or try :help.
> warning: -target is deprecated: Use -release instead to compile against the 
> correct platform API.
> Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
>        import org.apache.spark.sql.catalyst.errors._
>                                             ^
> On line 6: error: object errors is not a member of package 
> org.apache.spark.sql.catalyst
>        import org.apache.spark.sql.catalyst.analysis._
>                                                      ^
> On line 4: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.dsl._
>                                                 ^
> On line 5: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        

[jira] [Updated] (SPARK-46461) The `sbt console` command is not available

2023-12-19 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46461:
-
Issue Type: Bug  (was: Improvement)

> The `sbt console` command is not available
> --
>
> Key: SPARK-46461
> URL: https://issues.apache.org/jira/browse/SPARK-46461
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>
> # Unable to define expressions after executing the `build/sbt console` command
> {code:java}
> scala> val i = 1 // show
> package $line3 {
>   sealed class $read extends _root_.scala.Serializable {
>     def <init>() = {
>       super.<init>;
>       ()
>     };
>     sealed class $iw extends _root_.java.io.Serializable {
>       def <init>() = {
>         super.<init>;
>         ()
>       };
>       val i = 1
>     };
>     val $iw = new $iw.<init>
>   };
>   object $read extends scala.AnyRef {
>     def <init>() = {
>       super.<init>;
>       ()
>     };
>     val INSTANCE = new $read.<init>
>   }
> }
> warning: -target is deprecated: Use -release instead to compile against the 
> correct platform API.
> Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
>        ^
>        error: expected class or object definition {code}
> 2.  Due to the default unused imports check, the error "unused imports" will 
> be reported after executing the `build/sbt sql/console` command
> {code:java}
> Welcome to Scala 2.13.12 (OpenJDK 64-Bit Server VM, Java 17.0.9).
> Type in expressions for evaluation. Or try :help.
> warning: -target is deprecated: Use -release instead to compile against the 
> correct platform API.
> Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
>        import org.apache.spark.sql.catalyst.errors._
>                                             ^
> On line 6: error: object errors is not a member of package 
> org.apache.spark.sql.catalyst
>        import org.apache.spark.sql.catalyst.analysis._
>                                                      ^
> On line 4: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.dsl._
>                                                 ^
> On line 5: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.errors._
>                                                    ^
> On line 6: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.expressions._
>                                                         ^
> On line 7: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.plans.logical._
>                                                           ^
> On line 8: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.rules._
>                                                   ^
> On line 9: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.util._
>                                                  ^
> On line 10: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.execution
>                                    ^
> On line 11: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.functions._
>                                              ^
> On line 12: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.types._
>                                          ^
> On line 13: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import sqlContext.implicits._
>                                    ^
> On line 17: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import sqlContext._
>                          ^
> On line 18: error: Unused 

[jira] [Updated] (SPARK-46461) The `sbt console` command is not available

2023-12-19 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46461:
-
Priority: Minor  (was: Major)

> The `sbt console` command is not available
> --
>
> Key: SPARK-46461
> URL: https://issues.apache.org/jira/browse/SPARK-46461
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>
> # Unable to define expressions after executing the `build/sbt console` command
> {code:java}
> scala> val i = 1 // show
> package $line3 {
>   sealed class $read extends _root_.scala.Serializable {
>     def <init>() = {
>       super.<init>;
>       ()
>     };
>     sealed class $iw extends _root_.java.io.Serializable {
>       def <init>() = {
>         super.<init>;
>         ()
>       };
>       val i = 1
>     };
>     val $iw = new $iw.<init>
>   };
>   object $read extends scala.AnyRef {
>     def <init>() = {
>       super.<init>;
>       ()
>     };
>     val INSTANCE = new $read.<init>
>   }
> }
> warning: -target is deprecated: Use -release instead to compile against the 
> correct platform API.
> Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
>        ^
>        error: expected class or object definition {code}
> 2.  Due to the default unused imports check, the error "unused imports" will 
> be reported after executing the `build/sbt sql/console` command
> {code:java}
> Welcome to Scala 2.13.12 (OpenJDK 64-Bit Server VM, Java 17.0.9).
> Type in expressions for evaluation. Or try :help.
> warning: -target is deprecated: Use -release instead to compile against the 
> correct platform API.
> Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
>        import org.apache.spark.sql.catalyst.errors._
>                                             ^
> On line 6: error: object errors is not a member of package 
> org.apache.spark.sql.catalyst
>        import org.apache.spark.sql.catalyst.analysis._
>                                                      ^
> On line 4: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.dsl._
>                                                 ^
> On line 5: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.errors._
>                                                    ^
> On line 6: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.expressions._
>                                                         ^
> On line 7: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.plans.logical._
>                                                           ^
> On line 8: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.rules._
>                                                   ^
> On line 9: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.catalyst.util._
>                                                  ^
> On line 10: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.execution
>                                    ^
> On line 11: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.functions._
>                                              ^
> On line 12: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import org.apache.spark.sql.types._
>                                          ^
> On line 13: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import sqlContext.implicits._
>                                    ^
> On line 17: error: Unused import
>        Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
>        import sqlContext._
>                          ^
> On line 18: error: Unused import
>   

[jira] [Created] (SPARK-46461) The `sbt console` command is not available

2023-12-19 Thread Yang Jie (Jira)
Yang Jie created SPARK-46461:


 Summary: The `sbt console` command is not available
 Key: SPARK-46461
 URL: https://issues.apache.org/jira/browse/SPARK-46461
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


# Unable to define expressions after executing the `build/sbt console` command

{code:java}
scala> val i = 1 // show
package $line3 {
  sealed class $read extends _root_.scala.Serializable {
    def <init>() = {
      super.<init>;
      ()
    };
    sealed class $iw extends _root_.java.io.Serializable {
      def <init>() = {
        super.<init>;
        ()
      };
      val i = 1
    };
    val $iw = new $iw.<init>
  };
  object $read extends scala.AnyRef {
    def <init>() = {
      super.<init>;
      ()
    };
    val INSTANCE = new $read.<init>
  }
}
warning: -target is deprecated: Use -release instead to compile against the 
correct platform API.
Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
       ^
       error: expected class or object definition {code}
2.  Due to the default unused imports check, the error "unused imports" will be 
reported after executing the `build/sbt sql/console` command
{code:java}
Welcome to Scala 2.13.12 (OpenJDK 64-Bit Server VM, Java 17.0.9).
Type in expressions for evaluation. Or try :help.
warning: -target is deprecated: Use -release instead to compile against the 
correct platform API.
Applicable -Wconf / @nowarn filters for this warning: msg=<part of the message>, cat=deprecation
       import org.apache.spark.sql.catalyst.errors._
                                            ^
On line 6: error: object errors is not a member of package 
org.apache.spark.sql.catalyst
       import org.apache.spark.sql.catalyst.analysis._
                                                     ^
On line 4: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
       import org.apache.spark.sql.catalyst.dsl._
                                                ^
On line 5: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
       import org.apache.spark.sql.catalyst.errors._
                                                   ^
On line 6: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
       import org.apache.spark.sql.catalyst.expressions._
                                                        ^
On line 7: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
       import org.apache.spark.sql.catalyst.plans.logical._
                                                          ^
On line 8: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
       import org.apache.spark.sql.catalyst.rules._
                                                  ^
On line 9: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
       import org.apache.spark.sql.catalyst.util._
                                                 ^
On line 10: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=<part of the message>, cat=unused-imports, site=
       import org.apache.spark.sql.execution
                                   ^
On line 11: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=, cat=unused-imports, site=
       import org.apache.spark.sql.functions._
                                             ^
On line 12: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=, cat=unused-imports, site=
       import org.apache.spark.sql.types._
                                         ^
On line 13: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=, cat=unused-imports, site=
       import sqlContext.implicits._
                                   ^
On line 17: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=, cat=unused-imports, site=
       import sqlContext._
                         ^
On line 18: error: Unused import
       Applicable -Wconf / @nowarn filters for this fatal warning: msg=, cat=unused-imports, site=


scala>  {code}
To avoid this error, `-Wunused:imports` must be deleted from both 
SparkBuild.scala and pom.xml.
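The fix above can be sketched as an sbt-style settings fragment. This is illustrative only: the `scalacOptions` value below is a placeholder, not Spark's actual build definition, though the `-Wconf:cat=unused-imports` filter is the one the compiler log itself prints.

```scala
// Illustrative placeholder for a scalac options setting (not Spark's real build):
var scalacOptions = Seq("-deprecation", "-Wunused:imports")

// Option 1: drop the flag entirely, so unused imports are never reported.
scalacOptions = scalacOptions.filterNot(_ == "-Wunused:imports")

// Option 2: keep unused-import checking, but demote it from fatal error to a
// plain warning using the -Wconf filter suggested by the compiler log:
scalacOptions = scalacOptions :+ "-Wconf:cat=unused-imports:w"
```

Either variant stops the spark-shell's auto-injected imports from failing compilation as fatal unused-import warnings.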



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Assigned] (SPARK-46302) Fix maven daily testing

2023-12-19 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46302:


Assignee: BingKun Pan

> Fix maven daily testing
> ---
>
> Key: SPARK-46302
> URL: https://issues.apache.org/jira/browse/SPARK-46302
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-46302) Fix maven daily testing

2023-12-19 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46302.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44208
[https://github.com/apache/spark/pull/44208]

> Fix maven daily testing
> ---
>
> Key: SPARK-46302
> URL: https://issues.apache.org/jira/browse/SPARK-46302
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: BingKun Pan
>Assignee: BingKun Pan
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Updated] (SPARK-46454) Remove redundant `headOption`

2023-12-19 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46454?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46454:
-
Priority: Trivial  (was: Major)

> Remove redundant `headOption`
> -
>
> Key: SPARK-46454
> URL: https://issues.apache.org/jira/browse/SPARK-46454
> Project: Spark
>  Issue Type: Improvement
>  Components: DStreams, SQL
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Trivial
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-46454) Remove redundant `headOption`

2023-12-19 Thread Yang Jie (Jira)
Yang Jie created SPARK-46454:


 Summary: Remove redundant `headOption`
 Key: SPARK-46454
 URL: https://issues.apache.org/jira/browse/SPARK-46454
 Project: Spark
  Issue Type: Improvement
  Components: DStreams, SQL
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Updated] (SPARK-46441) Upgrade to use `org.scalatestplus:mockito-5`

2023-12-17 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46441?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46441:
-
Summary: Upgrade to use `org.scalatestplus:mockito-5`  (was: Migrating from 
Mockito 4 to 5)

> Upgrade to use `org.scalatestplus:mockito-5`
> 
>
> Key: SPARK-46441
> URL: https://issues.apache.org/jira/browse/SPARK-46441
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>







[jira] [Created] (SPARK-46441) Migrating from Mockito 4 to 5

2023-12-17 Thread Yang Jie (Jira)
Yang Jie created SPARK-46441:


 Summary: Migrating from Mockito 4 to 5
 Key: SPARK-46441
 URL: https://issues.apache.org/jira/browse/SPARK-46441
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Assigned] (SPARK-46435) Exclude `branch-3.3` from `publish_snapshot.yml`

2023-12-17 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46435:


Assignee: Dongjoon Hyun

> Exclude `branch-3.3` from `publish_snapshot.yml`
> 
>
> Key: SPARK-46435
> URL: https://issues.apache.org/jira/browse/SPARK-46435
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-46435) Exclude `branch-3.3` from `publish_snapshot.yml`

2023-12-17 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46435.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44390
[https://github.com/apache/spark/pull/44390]

> Exclude `branch-3.3` from `publish_snapshot.yml`
> 
>
> Key: SPARK-46435
> URL: https://issues.apache.org/jira/browse/SPARK-46435
> Project: Spark
>  Issue Type: Task
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Created] (SPARK-46436) Clean up compatibility configurations related to branch-3.3 daily testing in build_and_test.yml

2023-12-17 Thread Yang Jie (Jira)
Yang Jie created SPARK-46436:


 Summary: Clean up compatibility configurations related to 
branch-3.3 daily testing in build_and_test.yml
 Key: SPARK-46436
 URL: https://issues.apache.org/jira/browse/SPARK-46436
 Project: Spark
  Issue Type: Improvement
  Components: Project Infra
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Resolved] (SPARK-46416) Add @tailrec to HadoopFSUtils#shouldFilterOutPath

2023-12-14 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46416.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44345
[https://github.com/apache/spark/pull/44345]

> Add @tailrec to HadoopFSUtils#shouldFilterOutPath
> -
>
> Key: SPARK-46416
> URL: https://issues.apache.org/jira/browse/SPARK-46416
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Assigned] (SPARK-46416) Add @tailrec to HadoopFSUtils#shouldFilterOutPath

2023-12-14 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46416?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46416:


Assignee: Yang Jie

> Add @tailrec to HadoopFSUtils#shouldFilterOutPath
> -
>
> Key: SPARK-46416
> URL: https://issues.apache.org/jira/browse/SPARK-46416
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Minor
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-46416) Add @tailrec to HadoopFSUtils#shouldFilterOutPath

2023-12-14 Thread Yang Jie (Jira)
Yang Jie created SPARK-46416:


 Summary: Add @tailrec to HadoopFSUtils#shouldFilterOutPath
 Key: SPARK-46416
 URL: https://issues.apache.org/jira/browse/SPARK-46416
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie
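As a hedged illustration of what the annotation in the title adds (the recursion below is a simplified stand-in, not the real `HadoopFSUtils#shouldFilterOutPath` logic): `@tailrec` makes the compiler verify that the recursive call is in tail position and compile it to a loop, so a later refactor that breaks tail recursion fails the build instead of silently risking stack growth on deep paths.

```scala
import scala.annotation.tailrec

// Simplified stand-in for a path-filtering recursion; the predicate and
// signature are illustrative, not Spark's actual implementation.
@tailrec
def shouldFilterOut(segments: List[String]): Boolean = segments match {
  case Nil => false
  case head :: tail =>
    if (head.startsWith("_") || head.startsWith(".")) true
    else shouldFilterOut(tail) // tail position: checked by @tailrec
}
```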









[jira] [Created] (SPARK-46411) Change to use bcprov/bcpkix-jdk18on for test

2023-12-14 Thread Yang Jie (Jira)
Yang Jie created SPARK-46411:


 Summary: Change to use bcprov/bcpkix-jdk18on for test
 Key: SPARK-46411
 URL: https://issues.apache.org/jira/browse/SPARK-46411
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Assigned] (SPARK-46401) Should use `!isEmpty()` on RoaringBitmap instead of `getCardinality() > 0`

2023-12-14 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46401?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46401:


Assignee: Yang Jie

> Should use `!isEmpty()` on RoaringBitmap instead of `getCardinality() > 0`
> --
>
> Key: SPARK-46401
> URL: https://issues.apache.org/jira/browse/SPARK-46401
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Resolved] (SPARK-46401) Should use `!isEmpty()` on RoaringBitmap instead of `getCardinality() > 0`

2023-12-14 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46401?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46401.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44347
[https://github.com/apache/spark/pull/44347]

> Should use `!isEmpty()` on RoaringBitmap instead of `getCardinality() > 0`
> --
>
> Key: SPARK-46401
> URL: https://issues.apache.org/jira/browse/SPARK-46401
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Commented] (SPARK-42307) Assign name to _LEGACY_ERROR_TEMP_2232

2023-12-13 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17796561#comment-17796561
 ] 

Yang Jie commented on SPARK-42307:
--

Sorry, I mistakenly responded to this Jira, please ignore the previous message.
 
 
 
 
 

> Assign name to _LEGACY_ERROR_TEMP_2232
> --
>
> Key: SPARK-42307
> URL: https://issues.apache.org/jira/browse/SPARK-42307
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 3.4.0
>Reporter: Haejoon Lee
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Comment Edited] (SPARK-44172) Use Jackson API Instead of Json4s

2023-12-13 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-44172?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17796560#comment-17796560
 ] 

Yang Jie edited comment on SPARK-44172 at 12/14/23 5:30 AM:


[~hannahkamundson] 
Thank you very much for your interest in this work, but it would be best to 
initiate a formal discussion in the dev mailing list before starting work.

 

A long time ago, I submitted a [PR|https://github.com/apache/spark/pull/37604], 
but didn't get much of a response. I created this Jira, but I'm also not sure 
if now is the right time to drop the dependency on Json4s.


was (Author: luciferyang):
[~hannahkamundson] 
Thank you very much for your interest in this ticket, but it would be best to 
initiate a formal discussion in the dev mailing list before starting work.

 

A long time ago, I submitted a [PR|https://github.com/apache/spark/pull/37604], 
but didn't get much of a response. I created this Jira, but I'm also not sure 
if now is the right time to drop the dependency on Json4s.

> Use Jackson API Instead of Json4s
> -
>
> Key: SPARK-44172
> URL: https://issues.apache.org/jira/browse/SPARK-44172
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, MLlib, Spark Core, SQL, Structured Streaming
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>







[jira] (SPARK-42307) Assign name to _LEGACY_ERROR_TEMP_2232

2023-12-13 Thread Yang Jie (Jira)


[ https://issues.apache.org/jira/browse/SPARK-42307 ]


Yang Jie deleted comment on SPARK-42307:
--

was (Author: luciferyang):
[~hannahkamundson] 
Thank you very much for your interest in this ticket, but it would be best to 
initiate a formal discussion in the dev mailing list before starting work.

 

A long time ago, I submitted a [PR|https://github.com/apache/spark/pull/37604], 
but didn't get much of a response. I created this Jira, but I'm also not sure 
if now is the right time to drop the dependency on Json4s.

> Assign name to _LEGACY_ERROR_TEMP_2232
> --
>
> Key: SPARK-42307
> URL: https://issues.apache.org/jira/browse/SPARK-42307
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 3.4.0
>Reporter: Haejoon Lee
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Commented] (SPARK-44172) Use Jackson API Instead of Json4s

2023-12-13 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-44172?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17796560#comment-17796560
 ] 

Yang Jie commented on SPARK-44172:
--

[~hannahkamundson] 
Thank you very much for your interest in this ticket, but it would be best to 
initiate a formal discussion in the dev mailing list before starting work.

 

A long time ago, I submitted a [PR|https://github.com/apache/spark/pull/37604], 
but didn't get much of a response. I created this Jira, but I'm also not sure 
if now is the right time to drop the dependency on Json4s.

> Use Jackson API Instead of Json4s
> -
>
> Key: SPARK-44172
> URL: https://issues.apache.org/jira/browse/SPARK-44172
> Project: Spark
>  Issue Type: Sub-task
>  Components: Connect, MLlib, Spark Core, SQL, Structured Streaming
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>







[jira] [Comment Edited] (SPARK-42307) Assign name to _LEGACY_ERROR_TEMP_2232

2023-12-13 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17796559#comment-17796559
 ] 

Yang Jie edited comment on SPARK-42307 at 12/14/23 5:30 AM:


[~hannahkamundson] 
Thank you very much for your interest in this ticket, but it would be best to 
initiate a formal discussion in the dev mailing list before starting work.

 

A long time ago, I submitted a [PR|https://github.com/apache/spark/pull/37604], 
but didn't get much of a response. I created this Jira, but I'm also not sure 
if now is the right time to drop the dependency on Json4s.


was (Author: luciferyang):
[~hannahkamundson] 
Thank you very much for your interest in this ticket, but it would be best to 
initiate a formal discussion in the dev mailing list before starting work.

 

A long time ago, I submitted a [PR|https://github.com/apache/spark/pull/37604], 
but didn't get much of a response. I created this Jira, but I'm also not sure 
if now is the right time to drop the dependency on Json4s.
 
 
 
 
 
 
 
 
 
 

> Assign name to _LEGACY_ERROR_TEMP_2232
> --
>
> Key: SPARK-42307
> URL: https://issues.apache.org/jira/browse/SPARK-42307
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 3.4.0
>Reporter: Haejoon Lee
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Commented] (SPARK-42307) Assign name to _LEGACY_ERROR_TEMP_2232

2023-12-13 Thread Yang Jie (Jira)


[ 
https://issues.apache.org/jira/browse/SPARK-42307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=17796559#comment-17796559
 ] 

Yang Jie commented on SPARK-42307:
--

[~hannahkamundson] 
Thank you very much for your interest in this ticket, but it would be best to 
initiate a formal discussion in the dev mailing list before starting work.

 

A long time ago, I submitted a [PR|https://github.com/apache/spark/pull/37604], 
but didn't get much of a response. I created this Jira, but I'm also not sure 
if now is the right time to drop the dependency on Json4s.
 
 
 
 
 
 
 
 
 
 

> Assign name to _LEGACY_ERROR_TEMP_2232
> --
>
> Key: SPARK-42307
> URL: https://issues.apache.org/jira/browse/SPARK-42307
> Project: Spark
>  Issue Type: Sub-task
>  Components: SQL
>Affects Versions: 3.4.0
>Reporter: Haejoon Lee
>Priority: Major
>  Labels: pull-request-available
>







[jira] [Created] (SPARK-46401) Should use `!isEmpty()` on RoaringBitmap instead of `getCardinality() > 0`

2023-12-13 Thread Yang Jie (Jira)
Yang Jie created SPARK-46401:


 Summary: Should use `!isEmpty()` on RoaringBitmap instead of 
`getCardinality() > 0`
 Key: SPARK-46401
 URL: https://issues.apache.org/jira/browse/SPARK-46401
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Created] (SPARK-46389) Should manually close the RocksDB/LevelDB when checkVersion fails

2023-12-12 Thread Yang Jie (Jira)
Yang Jie created SPARK-46389:


 Summary: Should manually close the RocksDB/LevelDB when 
checkVersion fails
 Key: SPARK-46389
 URL: https://issues.apache.org/jira/browse/SPARK-46389
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 4.0.0
Reporter: Yang Jie
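A self-contained sketch of the pattern the title describes. All names below are illustrative placeholders, not the actual Spark code: the point is that a handle opened before the version check must be closed explicitly on the failure path, otherwise the native resource leaks.

```scala
// Hypothetical helper: open a store, verify its version, and make sure the
// handle is closed if verification throws.
def openAndCheck[DB <: AutoCloseable](open: () => DB)(checkVersion: DB => Unit): DB = {
  val db = open()
  try {
    checkVersion(db)
    db
  } catch {
    case e: Throwable =>
      db.close() // without this, a failed version check leaks the handle
      throw e
  }
}
```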









[jira] [Updated] (SPARK-46376) Simplify the code to generate the Spark tarball `filename` in the `HiveExternalCatalogVersionsSuite`.

2023-12-11 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46376?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46376:
-
Summary: Simplify the code to generate the Spark tarball `filename` in the 
`HiveExternalCatalogVersionsSuite`.  (was: Simplify the way to generate the 
Spark tarball `filename` in the `HiveExternalCatalogVersionsSuite`.)

> Simplify the code to generate the Spark tarball `filename` in the 
> `HiveExternalCatalogVersionsSuite`.
> -
>
> Key: SPARK-46376
> URL: https://issues.apache.org/jira/browse/SPARK-46376
> Project: Spark
>  Issue Type: Improvement
>  Components: SQL, Tests
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>  Labels: pull-request-available
>
> {code:java}
> val filename = VersionUtils.majorMinorPatchVersion(version) match {
>   case Some((major, _, _)) if major > 3 => s"spark-$version-bin-hadoop3.tgz"
>   case Some((3, minor, _)) if minor >= 3 => s"spark-$version-bin-hadoop3.tgz"
>   case Some((3, minor, _)) if minor < 3 => s"spark-$version-bin-hadoop3.2.tgz"
>   case Some((_, _, _)) => s"spark-$version-bin-hadoop2.7.tgz"
>   case None => s"spark-$version-bin-hadoop2.7.tgz"
> } {code}
> Currently, the minimum tested version is Spark 3.3, so there is no need for 
> complex case matching anymore.






[jira] [Created] (SPARK-46376) Simplify the way to generate the Spark tarball `filename` in the `HiveExternalCatalogVersionsSuite`.

2023-12-11 Thread Yang Jie (Jira)
Yang Jie created SPARK-46376:


 Summary: Simplify the way to generate the Spark tarball `filename` 
in the `HiveExternalCatalogVersionsSuite`.
 Key: SPARK-46376
 URL: https://issues.apache.org/jira/browse/SPARK-46376
 Project: Spark
  Issue Type: Improvement
  Components: SQL, Tests
Affects Versions: 4.0.0
Reporter: Yang Jie


{code:java}
val filename = VersionUtils.majorMinorPatchVersion(version) match {
  case Some((major, _, _)) if major > 3 => s"spark-$version-bin-hadoop3.tgz"
  case Some((3, minor, _)) if minor >= 3 => s"spark-$version-bin-hadoop3.tgz"
  case Some((3, minor, _)) if minor < 3 => s"spark-$version-bin-hadoop3.2.tgz"
  case Some((_, _, _)) => s"spark-$version-bin-hadoop2.7.tgz"
  case None => s"spark-$version-bin-hadoop2.7.tgz"
} {code}
Currently, the minimum tested version is Spark 3.3, so there is no need for 
complex case matching anymore.
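Given that constraint, the whole match can presumably collapse to a single template; a sketch (the helper name is illustrative):

```scala
// With Spark 3.3 as the minimum tested version, every branch that still
// matters resolves to the hadoop3 suffix.
def tarballName(version: String): String = s"spark-$version-bin-hadoop3.tgz"
```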






[jira] [Created] (SPARK-46358) Minor refactor of `ResponseValidator#verifyResponse`

2023-12-10 Thread Yang Jie (Jira)
Yang Jie created SPARK-46358:


 Summary: Minor refactor of `ResponseValidator#verifyResponse`
 Key: SPARK-46358
 URL: https://issues.apache.org/jira/browse/SPARK-46358
 Project: Spark
  Issue Type: Improvement
  Components: Connect
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Resolved] (SPARK-46335) Upgrade Maven to 3.9.6 for MNG-7913

2023-12-08 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46335?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46335.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44267
[https://github.com/apache/spark/pull/44267]

> Upgrade Maven to 3.9.6 for MNG-7913
> ---
>
> Key: SPARK-46335
> URL: https://issues.apache.org/jira/browse/SPARK-46335
> Project: Spark
>  Issue Type: Bug
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Assignee: Dongjoon Hyun
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>







[jira] [Created] (SPARK-46325) Remove unnecessary override functions when constructing `WrappedCloseableIterator` in `ResponseValidator#wrapIterator`

2023-12-07 Thread Yang Jie (Jira)
Yang Jie created SPARK-46325:


 Summary: Remove unnecessary override functions when constructing 
`WrappedCloseableIterator` in `ResponseValidator#wrapIterator`
 Key: SPARK-46325
 URL: https://issues.apache.org/jira/browse/SPARK-46325
 Project: Spark
  Issue Type: Improvement
  Components: Connect
Affects Versions: 4.0.0
Reporter: Yang Jie


Should reuse functions defined in {{WrappedCloseableIterator}} instead of 
overriding them

 
*ResponseValidator#wrapIterator*
 
{code:java}
def wrapIterator[T <: GeneratedMessageV3, V <: CloseableIterator[T]](
inner: V): WrappedCloseableIterator[T] = {
  new WrappedCloseableIterator[T] {

override def innerIterator: Iterator[T] = inner

override def hasNext: Boolean = {
  innerIterator.hasNext
}

override def next(): T = {
  verifyResponse {
innerIterator.next()
  }
}

override def close(): Unit = {
  innerIterator match {
case it: CloseableIterator[T] => it.close()
case _ => // nothing
  }
}
  }
} {code}
*WrappedCloseableIterator*
 
{code:java}
private[sql] abstract class WrappedCloseableIterator[E] extends 
CloseableIterator[E] {

  def innerIterator: Iterator[E]

  override def next(): E = innerIterator.next()

  override def hasNext: Boolean = innerIterator.hasNext

  override def close(): Unit = innerIterator match {
case it: CloseableIterator[E] => it.close()
case _ => // nothing
  }
} {code}
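A self-contained sketch of the proposed reuse, with simplified stand-in types (`CloseableIterator` and `verifyResponse` below are placeholders, not the Connect client's real definitions): only `innerIterator` and the response-verifying `next()` need to be supplied, while `hasNext` and `close()` are inherited from the base class unchanged.

```scala
trait CloseableIterator[E] extends Iterator[E] { def close(): Unit }

abstract class WrappedCloseableIterator[E] extends CloseableIterator[E] {
  def innerIterator: Iterator[E]
  override def next(): E = innerIterator.next()
  override def hasNext: Boolean = innerIterator.hasNext
  override def close(): Unit = innerIterator match {
    case it: CloseableIterator[_] => it.close()
    case _ => // nothing
  }
}

// Placeholder for the client's response validation logic.
def verifyResponse[T](body: => T): T = body

// After the refactor only two members are overridden; the hasNext/close
// forwarding comes from WrappedCloseableIterator instead of being redeclared.
def wrapIterator[T](inner: CloseableIterator[T]): WrappedCloseableIterator[T] =
  new WrappedCloseableIterator[T] {
    override def innerIterator: Iterator[T] = inner
    override def next(): T = verifyResponse { innerIterator.next() }
  }
```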






[jira] [Updated] (SPARK-46275) Protobuf: Permissive mode should return null rather than struct with null fields

2023-12-07 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46275?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46275:
-
Affects Version/s: 3.4.2
   (was: 3.4.0)

> Protobuf: Permissive mode should return null rather than struct with null 
> fields
> 
>
> Key: SPARK-46275
> URL: https://issues.apache.org/jira/browse/SPARK-46275
> Project: Spark
>  Issue Type: Bug
>  Components: Protobuf, Structured Streaming
>Affects Versions: 3.4.2, 3.5.0, 4.0.0
>Reporter: Raghu Angadi
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.1
>
>
> Consider a protobuf with two fields: {{message Person { string name = 1; int 
> id = 2; }}}
>  * The struct returned by {{from_protobuf("Person")}} looks like this:
>  ** STRUCT
>  * If the underlying binary record fails to deserialize, it results in an 
> exception and the query fails.
>  * But if the option {{mode}} is set to {{PERMISSIVE}}, malformed records 
> are tolerated and {{null}} is returned.
>  ** {*}BUT{*}: the returned struct looks like this \{"name": null, "id": null}
>  ** This is not convenient to the user. *Ideally,* {{from_protobuf()}} 
> *should return* {{null}}.
>  ** {{from_protobuf()}} borrowed the current behavior from the {{from_avro()}} 
> implementation. It is not clear what the motivation was.
> I think we should update the implementation to return {{null}} rather than a 
> struct with null fields inside.






[jira] [Updated] (SPARK-46275) Protobuf: Permissive mode should return null rather than struct with null fields

2023-12-07 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46275?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46275:
-
Affects Version/s: 3.5.0
   4.0.0

> Protobuf: Permissive mode should return null rather than struct with null 
> fields
> 
>
> Key: SPARK-46275
> URL: https://issues.apache.org/jira/browse/SPARK-46275
> Project: Spark
>  Issue Type: Bug
>  Components: Protobuf, Structured Streaming
>Affects Versions: 3.4.0, 3.5.0, 4.0.0
>Reporter: Raghu Angadi
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0, 3.5.1
>
>
> Consider a protobuf with two fields: {{message Person { string name = 1; int 
> id = 2; }}}
>  * The struct returned by {{from_protobuf("Person")}} looks like this:
>  ** STRUCT
>  * If the underlying binary record fails to deserialize, it results in an 
> exception and the query fails.
>  * But if the option {{mode}} is set to {{PERMISSIVE}}, malformed records 
> are tolerated and {{null}} is returned.
>  ** {*}BUT{*}: the returned struct looks like this \{"name": null, "id": null}
>  ** This is not convenient to the user. *Ideally,* {{from_protobuf()}} 
> *should return* {{null}}.
>  ** {{from_protobuf()}} borrowed the current behavior from the {{from_avro()}} 
> implementation. It is not clear what the motivation was.
> I think we should update the implementation to return {{null}} rather than a 
> struct with null fields inside.






[jira] [Updated] (SPARK-46305) Remove the special Zookeeper version in the `streaming-kafka-0-10` and `sql-kafka-0-10` modules

2023-12-07 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46305?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46305:
-
Component/s: Build
 (was: SQL)

>  Remove the special Zookeeper version in the `streaming-kafka-0-10` and 
> `sql-kafka-0-10` modules
> 
>
> Key: SPARK-46305
> URL: https://issues.apache.org/jira/browse/SPARK-46305
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Minor
>







[jira] [Created] (SPARK-46305) Remove the special Zookeeper version in the `streaming-kafka-0-10` and `sql-kafka-0-10` modules

2023-12-07 Thread Yang Jie (Jira)
Yang Jie created SPARK-46305:


 Summary:  Remove the special Zookeeper version in the 
`streaming-kafka-0-10` and `sql-kafka-0-10` modules
 Key: SPARK-46305
 URL: https://issues.apache.org/jira/browse/SPARK-46305
 Project: Spark
  Issue Type: Improvement
  Components: SQL
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Updated] (SPARK-46283) Avoid testing the `streaming-kinesis-asl` module in the daily tests of branch-3.x.

2023-12-06 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46283?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46283:
-
Description: 
After the merge of https://github.com/apache/spark/pull/43736, the master 
branch began testing the `streaming-kinesis-asl` module. 

At the same time, because the daily tests reuse `build_and_test.yml`, the 
daily tests of branch-3.x also began testing `streaming-kinesis-asl`. 

However, in branch-3.x, the environment variable `ENABLE_KINESIS_TESTS` is 
hard-coded to 1 in `dev/sparktestsupport/modules.py`:

https://github.com/apache/spark/blob/1321b4e64deaa1e58bf297c25b72319083056568/dev/sparktestsupport/modules.py#L332-L346

which leads to failures in the daily tests of branch-3.x:

- branch-3.3: https://github.com/apache/spark/actions/runs/7111246311
- branch-3.4: https://github.com/apache/spark/actions/runs/7098435892
- branch-3.5: https://github.com/apache/spark/actions/runs/7099811235

```
[info] org.apache.spark.streaming.kinesis.WithoutAggregationKinesisStreamSuite 
*** ABORTED *** (1 second, 14 milliseconds)
[info]   java.lang.Exception: Kinesis tests enabled using environment variable 
ENABLE_KINESIS_TESTS
[info] but could not find AWS credentials. Please follow instructions in AWS 
documentation
[info] to set the credentials in your system such that the 
DefaultAWSCredentialsProviderChain
[info] can find the credentials.
[info]   at 
org.apache.spark.streaming.kinesis.KinesisTestUtils$.getAWSCredentials(KinesisTestUtils.scala:258)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisTestUtils.kinesisClient$lzycompute(KinesisTestUtils.scala:58)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisTestUtils.kinesisClient(KinesisTestUtils.scala:57)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisTestUtils.describeStream(KinesisTestUtils.scala:168)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisTestUtils.findNonExistentStreamName(KinesisTestUtils.scala:181)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisTestUtils.createStream(KinesisTestUtils.scala:84)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisStreamTests.$anonfun$beforeAll$1(KinesisStreamSuite.scala:61)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisFunSuite.runIfTestsEnabled(KinesisFunSuite.scala:41)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisFunSuite.runIfTestsEnabled$(KinesisFunSuite.scala:39)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisStreamTests.runIfTestsEnabled(KinesisStreamSuite.scala:42)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisStreamTests.beforeAll(KinesisStreamSuite.scala:59)
[info]   at 
org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:212)
[info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
[info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisStreamTests.org$scalatest$BeforeAndAfter$$super$run(KinesisStreamSuite.scala:42)
[info]   at org.scalatest.BeforeAndAfter.run(BeforeAndAfter.scala:273)
[info]   at org.scalatest.BeforeAndAfter.run$(BeforeAndAfter.scala:271)
[info]   at 
org.apache.spark.streaming.kinesis.KinesisStreamTests.run(KinesisStreamSuite.scala:42)
[info]   at 
org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)
[info]   at 
org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)
[info]   at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)
[info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info]   at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[info]   at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[info]   at java.lang.Thread.run(Thread.java:750)
[info] Test run 
org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite started
[info] Test 
org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite.testJavaKinesisDStreamBuilderOldApi
 started
[info] Test 
org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite.testJavaKinesisDStreamBuilder
 started
[info] Test run 
org.apache.spark.streaming.kinesis.JavaKinesisInputDStreamBuilderSuite 
finished: 0 failed, 0 ignored, 2 total, 0.244s
[info] ScalaTest
[info] Run completed in 8 seconds, 542 milliseconds.
[info] Total number of tests run: 31
[info] Suites: completed 4, aborted 4
[info] Tests: succeeded 31, failed 0, canceled 0, ignored 0, pending 0
[info] *** 4 SUITES ABORTED ***
[error] Error: Total 37, Failed 0, Errors 4, Passed 33
[error] Error during tests:
[error]   
org.apache.spark.streaming.kinesis.WithoutAggregationKinesisBackedBlockRDDSuite
[error]   
org.apache.spark.streaming.kinesis.WithAggregationKinesisBackedBlockRDDSuite
[error]   org.apache.spark.streaming.kinesis.WithAggregationKinesisStreamSuite
[error]   

[jira] [Updated] (SPARK-46283) Avoid testing the `streaming-kinesis-asl` module in the daily tests of branch-3.x.

2023-12-06 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46283?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46283:
-
Summary: Avoid testing the `streaming-kinesis-asl` module in the daily 
tests of branch-3.x.  (was: "Avoid testing the `streaming-kinesis-asl` module 
in the daily tests of branch-3.x.")

> Avoid testing the `streaming-kinesis-asl` module in the daily tests of 
> branch-3.x.
> --
>
> Key: SPARK-46283
> URL: https://issues.apache.org/jira/browse/SPARK-46283
> Project: Spark
>  Issue Type: Improvement
>  Components: Project Infra
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>







[jira] [Created] (SPARK-46283) "Avoid testing the `streaming-kinesis-asl` module in the daily tests of branch-3.x."

2023-12-06 Thread Yang Jie (Jira)
Yang Jie created SPARK-46283:


 Summary: "Avoid testing the `streaming-kinesis-asl` module in the 
daily tests of branch-3.x."
 Key: SPARK-46283
 URL: https://issues.apache.org/jira/browse/SPARK-46283
 Project: Spark
  Issue Type: Improvement
  Components: Project Infra
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Resolved] (SPARK-46257) Upgrade Derby to 10.16.1.1

2023-12-05 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-46257.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44174
[https://github.com/apache/spark/pull/44174]

> Upgrade Derby to 10.16.1.1
> --
>
> Key: SPARK-46257
> URL: https://issues.apache.org/jira/browse/SPARK-46257
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> https://db.apache.org/derby/releases/release-10_16_1_1.cgi






[jira] [Assigned] (SPARK-46257) Upgrade Derby to 10.16.1.1

2023-12-05 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie reassigned SPARK-46257:


Assignee: Yang Jie

> Upgrade Derby to 10.16.1.1
> --
>
> Key: SPARK-46257
> URL: https://issues.apache.org/jira/browse/SPARK-46257
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Assignee: Yang Jie
>Priority: Major
>  Labels: pull-request-available
>
> https://db.apache.org/derby/releases/release-10_16_1_1.cgi






[jira] [Resolved] (SPARK-44977) Upgrade Derby to 10.15.1.3+

2023-12-05 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-44977?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie resolved SPARK-44977.
--
Fix Version/s: 4.0.0
   Resolution: Fixed

Issue resolved by pull request 44174
[https://github.com/apache/spark/pull/44174]

> Upgrade Derby to 10.15.1.3+
> ---
>
> Key: SPARK-44977
> URL: https://issues.apache.org/jira/browse/SPARK-44977
> Project: Spark
>  Issue Type: Sub-task
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Dongjoon Hyun
>Priority: Minor
>  Labels: pull-request-available
> Fix For: 4.0.0
>
>
> Currently, Derby is fixed at 10.14.2.0 because 10.15.1.3 requires JDK9 at 
> least.
> This issue aims to upgrade it after dropping Java 8.






[jira] [Created] (SPARK-46263) Clean up unnecessary `SeqOps.view` and `ArrayOps.view` conversions.

2023-12-04 Thread Yang Jie (Jira)
Yang Jie created SPARK-46263:


 Summary: Clean up unnecessary `SeqOps.view` and `ArrayOps.view` 
conversions.
 Key: SPARK-46263
 URL: https://issues.apache.org/jira/browse/SPARK-46263
 Project: Spark
  Issue Type: Improvement
  Components: MLlib, Spark Core, SQL
Affects Versions: 4.0.0
Reporter: Yang Jie









[jira] [Updated] (SPARK-46257) Upgrade Derby to 10.16.1.1

2023-12-04 Thread Yang Jie (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-46257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yang Jie updated SPARK-46257:
-
Summary: Upgrade Derby to 10.16.1.1  (was: Upgrade Derby to 10.17.1.0)

> Upgrade Derby to 10.16.1.1
> --
>
> Key: SPARK-46257
> URL: https://issues.apache.org/jira/browse/SPARK-46257
> Project: Spark
>  Issue Type: Improvement
>  Components: Build
>Affects Versions: 4.0.0
>Reporter: Yang Jie
>Priority: Major
>
> https://db.apache.org/derby/releases/release-10_17_1_0.cgi#New+Features






[jira] [Created] (SPARK-46257) Upgrade Derby to 10.17.1.0

2023-12-04 Thread Yang Jie (Jira)
Yang Jie created SPARK-46257:


 Summary: Upgrade Derby to 10.17.1.0
 Key: SPARK-46257
 URL: https://issues.apache.org/jira/browse/SPARK-46257
 Project: Spark
  Issue Type: Improvement
  Components: Build
Affects Versions: 4.0.0
Reporter: Yang Jie


https://db.apache.org/derby/releases/release-10_17_1_0.cgi#New+Features





