FWIW, here is the Databricks statement on it. Not the same as Spark, but it of course includes Spark.
https://databricks.com/blog/2021/12/13/log4j2-vulnerability-cve-2021-44228-research-and-assessment.html

Yes, the question is almost surely more whether user apps are affected, not Spark itself.

On Tue, Dec 14, 2021, 7:55 AM Steve Loughran <ste...@cloudera.com.invalid> wrote:

> log4j 1.2.17 is not vulnerable. There is an existing CVE there from a log
> aggregation servlet; Cloudera products ship a patched release with that
> servlet stripped... ASF projects are not allowed to do that.
>
> But: some recent Cloudera products do include log4j 2.x, so colleagues of
> mine are busy patching and retesting everything. If anyone replaces the
> vulnerable jars themselves, remember to look in spark.tar.gz on HDFS to
> make sure it is safe.
>
> Hadoop stayed on log4j 1.2.17 because 2.x
> * would have broken all cluster management tools which configured
>   log4j.properties files
> * wouldn't let us use system properties to configure logging... that is
>   really useful when you want to run a job with debug logging
> * didn't support the log capture we use in Mockito and functional tests
>
> But: SLF4J is used throughout; Spark doesn't need to be held back by
> that choice and can use any backend you want.
>
> I don't know what we will do now; Akira has just suggested logback:
> https://issues.apache.org/jira/browse/HADOOP-12956
>
> Had I not just broken a collar bone, leaving me unable to code, I would
> have added a new command to audit the Hadoop classpath to verify it wasn't
> vulnerable. Someone could do the same for Spark, where you would want an
> RDD so that the probe also takes place in worker tasks, validating the
> cluster's safety more broadly, including the tarball.
>
> Meanwhile, if your product is not exposed, it's probably worth mentioning
> on the users mailing list so as to help people focus their attention. It's
> probably best to work with everyone who produces Spark-based products so
> that you can have a single summary.
> On Tue, 14 Dec 2021 at 01:31, Qian Sun <qian.sun2...@gmail.com> wrote:
>
>> My understanding is that we don't need to do anything. log4j2-core is not
>> used in Spark.
>>
>>> On Dec 13, 2021, at 12:45 PM, Pralabh Kumar <pralabhku...@gmail.com> wrote:
>>>
>>> Hi developers, users
>>>
>>> Spark is built using log4j 1.2.17. Is there a plan to upgrade based on
>>> the recent CVE detected?
>>>
>>> Regards
>>> Pralabh Kumar
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
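The classpath-audit idea Steve sketches above could look something like the following. This is a hypothetical sketch, not an existing tool: the function names (`is_vulnerable`, `audit_classpath`) are invented, and the check is a crude filename match against the log4j-core 2.x versions affected by CVE-2021-44228 (2.0-beta9 through 2.14.1; 2.15.0 shipped the initial fix).

```python
# Hypothetical classpath audit sketch (names invented, not an existing tool).
# Flags jar filenames that look like a log4j-core 2.x affected by
# CVE-2021-44228. log4j 1.x is NOT affected by this particular CVE.
import re

# Matches e.g. "log4j-core-2.14.1.jar" or "log4j-core-2.0-beta9.jar",
# capturing the 2.x minor version number.
LOG4J2_CORE = re.compile(r"log4j-core-2\.(\d+)(?:\.\d+)?.*\.jar$")

def is_vulnerable(jar_name: str) -> bool:
    """Return True if the filename looks like a log4j-core 2.x build
    affected by CVE-2021-44228."""
    m = LOG4J2_CORE.search(jar_name)
    if not m:
        return False  # log4j 1.x and unrelated jars: not this CVE
    minor = int(m.group(1))
    return minor <= 14  # 2.15.0 and later contain the fix

def audit_classpath(jars):
    """Return the subset of jar names that look vulnerable."""
    return [j for j in jars if is_vulnerable(j)]
```

To realise the "probe in worker tasks" part, the same function could be run inside a Spark action (e.g. mapping it over a small RDD so each executor scans the jars on its own local filesystem, including those unpacked from the distribution tarball), so that the whole cluster is checked rather than just the driver; that wiring is left out here since it depends on the deployment.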