[ https://issues.apache.org/jira/browse/SPARK-49844?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18045416#comment-18045416 ]
Andreas Thum commented on SPARK-49844:
--------------------------------------
As far as I understand, the ZooKeeper upgrade will not be backported to
Spark 3.x, see
[https://github.com/apache/spark/pull/43844#issuecomment-3468731350]
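For reference, a minimal Python sketch to check which ZooKeeper jar an installed PySpark build actually bundles (it assumes the usual pip-install layout, where the jars sit in a "jars" directory next to the pyspark package; the expected output is taken from the report below):
{code:python}
import glob
import os

import pyspark

# Locate the jars directory shipped inside the installed pyspark package
# (pip-install layout assumed) and list any bundled ZooKeeper jars.
jars_dir = os.path.join(os.path.dirname(pyspark.__file__), "jars")
for jar in sorted(glob.glob(os.path.join(jars_dir, "zookeeper*.jar"))):
    print(os.path.basename(jar))
# On the affected 3.5.x releases this is reported to print zookeeper-3.6.3.jar.
{code}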
> PySpark requiring vulnerable Apache Zookeeper version 3.6.3
> -----------------------------------------------------------
>
> Key: SPARK-49844
> URL: https://issues.apache.org/jira/browse/SPARK-49844
> Project: Spark
> Issue Type: Bug
> Components: Java API
> Affects Versions: 3.5.3
> Reporter: Zach Barnett
> Priority: Major
>
> When installing pyspark<4,
> the package includes /deps/jars/zookeeper-3.6.3.jar.
> This is being flagged as a high-severity vulnerability in applications
> that require pyspark 3.x as a dependency:
> [CVE-2023-44981|https://zookeeper.apache.org/security.html#CVE-2023-44981]
> This is despite the [pom.xml specifying zookeeper version
> 3.9.2|https://github.com/apache/spark/blob/3093ad68d2a3c6bab9c1605381d27e700766be22/pom.xml#L130].
> It seems there is a bug in how this zookeeper dependency is being resolved.