[jira] [Reopened] (SPARK-48505) Simplify the implementation of Utils#isG1GC
[ https://issues.apache.org/jira/browse/SPARK-48505?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reopened SPARK-48505: -- > Simplify the implementation of Utils#isG1GC > --- > > Key: SPARK-48505 > URL: https://issues.apache.org/jira/browse/SPARK-48505 > Project: Spark > Issue Type: Improvement > Components: Spark Core >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48522) Update Stream Library to 2.9.8
[ https://issues.apache.org/jira/browse/SPARK-48522?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48522: Assignee: Kent Yao > Update Stream Library to 2.9.8 > -- > > Key: SPARK-48522 > URL: https://issues.apache.org/jira/browse/SPARK-48522 > Project: Spark > Issue Type: Dependency upgrade > Components: Build >Affects Versions: 4.0.0 >Reporter: Kent Yao >Assignee: Kent Yao >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48522) Update Stream Library to 2.9.8
[ https://issues.apache.org/jira/browse/SPARK-48522?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48522. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46861 [https://github.com/apache/spark/pull/46861] > Update Stream Library to 2.9.8 > -- > > Key: SPARK-48522 > URL: https://issues.apache.org/jira/browse/SPARK-48522 > Project: Spark > Issue Type: Dependency upgrade > Components: Build >Affects Versions: 4.0.0 >Reporter: Kent Yao >Assignee: Kent Yao >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48506) Compression codec short names are case insensitive except for event logging
[ https://issues.apache.org/jira/browse/SPARK-48506?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48506: Assignee: Kent Yao > Compression codec short names are case insensitive except for event logging > --- > > Key: SPARK-48506 > URL: https://issues.apache.org/jira/browse/SPARK-48506 > Project: Spark > Issue Type: Improvement > Components: Spark Core >Affects Versions: 3.0.3, 3.1.3, 3.2.4, 3.5.1, 3.3.4, 3.4.3 >Reporter: Kent Yao >Assignee: Kent Yao >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48506) Compression codec short names are case insensitive except for event logging
[ https://issues.apache.org/jira/browse/SPARK-48506?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48506. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46847 [https://github.com/apache/spark/pull/46847] > Compression codec short names are case insensitive except for event logging > --- > > Key: SPARK-48506 > URL: https://issues.apache.org/jira/browse/SPARK-48506 > Project: Spark > Issue Type: Improvement > Components: Spark Core >Affects Versions: 3.0.3, 3.1.3, 3.2.4, 3.5.1, 3.3.4, 3.4.3 >Reporter: Kent Yao >Assignee: Kent Yao >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
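The fix above makes event-log codec short names behave like the rest of Spark, which already resolves names such as "lz4" or "LZ4" case-insensitively. A minimal sketch of that normalization pattern, assuming a hypothetical `resolve` helper and a hand-written name map (Spark keeps a similar map in `CompressionCodec`; this is not the actual Spark code):

```java
import java.util.Locale;
import java.util.Map;

public class CodecNames {
    // Illustrative mapping of codec short names to implementation classes.
    private static final Map<String, String> SHORT_NAMES = Map.of(
            "lz4", "org.apache.spark.io.LZ4CompressionCodec",
            "lzf", "org.apache.spark.io.LZFCompressionCodec",
            "snappy", "org.apache.spark.io.SnappyCompressionCodec",
            "zstd", "org.apache.spark.io.ZStdCompressionCodec");

    // Lower-casing with a fixed locale makes "LZ4", "Lz4" and "lz4" equivalent,
    // regardless of the JVM's default locale.
    static String resolve(String shortName) {
        String cls = SHORT_NAMES.get(shortName.toLowerCase(Locale.ROOT));
        if (cls == null) {
            throw new IllegalArgumentException("Unknown codec: " + shortName);
        }
        return cls;
    }

    public static void main(String[] args) {
        System.out.println(resolve("ZSTD"));
    }
}
```

Using `Locale.ROOT` (rather than the default locale) is the important detail: it avoids surprises such as the Turkish dotless-i when lower-casing configuration values.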
[jira] [Resolved] (SPARK-48519) Upgrade jetty to 11.0.21
[ https://issues.apache.org/jira/browse/SPARK-48519?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48519. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46843 [https://github.com/apache/spark/pull/46843] > Upgrade jetty to 11.0.21 > > > Key: SPARK-48519 > URL: https://issues.apache.org/jira/browse/SPARK-48519 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > > * https://github.com/jetty/jetty.project/releases/tag/jetty-11.0.21 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48519) Upgrade jetty to 11.0.21
[ https://issues.apache.org/jira/browse/SPARK-48519?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48519: Assignee: Yang Jie > Upgrade jetty to 11.0.21 > > > Key: SPARK-48519 > URL: https://issues.apache.org/jira/browse/SPARK-48519 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > > * https://github.com/jetty/jetty.project/releases/tag/jetty-11.0.21 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-48519) Upgrade jetty to 11.0.21
Yang Jie created SPARK-48519: Summary: Upgrade jetty to 11.0.21 Key: SPARK-48519 URL: https://issues.apache.org/jira/browse/SPARK-48519 Project: Spark Issue Type: Improvement Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie * https://github.com/jetty/jetty.project/releases/tag/jetty-11.0.21 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-48505) Simplify the implementation of Utils#isG1GC
Yang Jie created SPARK-48505: Summary: Simplify the implementation of Utils#isG1GC Key: SPARK-48505 URL: https://issues.apache.org/jira/browse/SPARK-48505 Project: Spark Issue Type: Improvement Components: Spark Core Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
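SPARK-48505 above is about simplifying how `Utils#isG1GC` detects the active collector. One common way to check for G1 from inside the JVM (a sketch only; not necessarily the exact approach taken in the Spark PR) is to inspect the registered `GarbageCollectorMXBean` names, since G1 collectors report names like "G1 Young Generation":

```java
import java.lang.management.ManagementFactory;

public class G1Check {
    // True if any registered collector name indicates G1.
    static boolean isG1GC() {
        return ManagementFactory.getGarbageCollectorMXBeans().stream()
                .anyMatch(bean -> bean.getName().startsWith("G1"));
    }

    public static void main(String[] args) {
        System.out.println("G1 in use: " + isG1GC());
    }
}
```

This avoids reflection against internal HotSpot classes and works on any JVM that exposes the standard `java.lang.management` beans.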
[jira] [Resolved] (SPARK-48433) Upgrade `checkstyle` to 10.17.0
[ https://issues.apache.org/jira/browse/SPARK-48433?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48433. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46763 [https://github.com/apache/spark/pull/46763] > Upgrade `checkstyle` to 10.17.0 > --- > > Key: SPARK-48433 > URL: https://issues.apache.org/jira/browse/SPARK-48433 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48484) V2Write uses the same TaskAttemptId for different task attempts
[ https://issues.apache.org/jira/browse/SPARK-48484?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48484: Assignee: Jackey Lee > V2Write uses the same TaskAttemptId for different task attempts > -- > > Key: SPARK-48484 > URL: https://issues.apache.org/jira/browse/SPARK-48484 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 4.0.0, 3.5.1, 3.4.3 >Reporter: Yang Jie >Assignee: Jackey Lee >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0, 3.5.2, 3.4.4 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48484) V2Write uses the same TaskAttemptId for different task attempts
[ https://issues.apache.org/jira/browse/SPARK-48484?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48484. -- Fix Version/s: 3.4.4 3.5.2 4.0.0 Resolution: Fixed Issue resolved by pull request 46811 [https://github.com/apache/spark/pull/46811] > V2Write uses the same TaskAttemptId for different task attempts > -- > > Key: SPARK-48484 > URL: https://issues.apache.org/jira/browse/SPARK-48484 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 4.0.0, 3.5.1, 3.4.3 >Reporter: Yang Jie >Priority: Major > Labels: pull-request-available > Fix For: 3.4.4, 3.5.2, 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-48484) V2Write uses the same TaskAttemptId for different task attempts
Yang Jie created SPARK-48484: Summary: V2Write uses the same TaskAttemptId for different task attempts Key: SPARK-48484 URL: https://issues.apache.org/jira/browse/SPARK-48484 Project: Spark Issue Type: Bug Components: SQL Affects Versions: 3.4.3, 3.5.1, 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48439) Derby: Retain as many significant digits as possible when decimal precision greater than 31
[ https://issues.apache.org/jira/browse/SPARK-48439?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48439: Assignee: Kent Yao > Derby: Retain as many significant digits as possible when decimal precision > greater than 31 > > > Key: SPARK-48439 > URL: https://issues.apache.org/jira/browse/SPARK-48439 > Project: Spark > Issue Type: Sub-task > Components: SQL >Affects Versions: 4.0.0 >Reporter: Kent Yao >Assignee: Kent Yao >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48439) Derby: Retain as many significant digits as possible when decimal precision greater than 31
[ https://issues.apache.org/jira/browse/SPARK-48439?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48439. -- Resolution: Fixed Resolved by https://github.com/apache/spark/pull/46776 > Derby: Retain as many significant digits as possible when decimal precision > greater than 31 > > > Key: SPARK-48439 > URL: https://issues.apache.org/jira/browse/SPARK-48439 > Project: Spark > Issue Type: Sub-task > Components: SQL >Affects Versions: 4.0.0 >Reporter: Kent Yao >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48420) Upgrade netty to `4.1.110.Final`
[ https://issues.apache.org/jira/browse/SPARK-48420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48420. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46744 [https://github.com/apache/spark/pull/46744] > Upgrade netty to `4.1.110.Final` > > > Key: SPARK-48420 > URL: https://issues.apache.org/jira/browse/SPARK-48420 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48420) Upgrade netty to `4.1.110.Final`
[ https://issues.apache.org/jira/browse/SPARK-48420?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48420: Assignee: BingKun Pan > Upgrade netty to `4.1.110.Final` > > > Key: SPARK-48420 > URL: https://issues.apache.org/jira/browse/SPARK-48420 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-48427) Upgrade scala-parser-combinators to 2.4
Yang Jie created SPARK-48427: Summary: Upgrade scala-parser-combinators to 2.4 Key: SPARK-48427 URL: https://issues.apache.org/jira/browse/SPARK-48427 Project: Spark Issue Type: Improvement Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie https://github.com/scala/scala-parser-combinators/releases/tag/v2.4.0 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48384) Exclude `io.netty:netty-tcnative-boringssl-static` from `zookeeper`
[ https://issues.apache.org/jira/browse/SPARK-48384?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48384. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46695 [https://github.com/apache/spark/pull/46695] > Exclude `io.netty:netty-tcnative-boringssl-static` from `zookeeper` > --- > > Key: SPARK-48384 > URL: https://issues.apache.org/jira/browse/SPARK-48384 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48384) Exclude `io.netty:netty-tcnative-boringssl-static` from `zookeeper`
[ https://issues.apache.org/jira/browse/SPARK-48384?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48384: Assignee: BingKun Pan > Exclude `io.netty:netty-tcnative-boringssl-static` from `zookeeper` > --- > > Key: SPARK-48384 > URL: https://issues.apache.org/jira/browse/SPARK-48384 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-48406) Upgrade commons-cli to 1.8.0
Yang Jie created SPARK-48406: Summary: Upgrade commons-cli to 1.8.0 Key: SPARK-48406 URL: https://issues.apache.org/jira/browse/SPARK-48406 Project: Spark Issue Type: Improvement Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie * [https://commons.apache.org/proper/commons-cli/changes-report.html#a1.7.0] * [https://commons.apache.org/proper/commons-cli/changes-report.html#a1.8.0] -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48386) Replace JVM assert with JUnit Assert in tests
[ https://issues.apache.org/jira/browse/SPARK-48386?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48386. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46698 [https://github.com/apache/spark/pull/46698] > Replace JVM assert with JUnit Assert in tests > - > > Key: SPARK-48386 > URL: https://issues.apache.org/jira/browse/SPARK-48386 > Project: Spark > Issue Type: Improvement > Components: Tests >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-48366) Simplify statements related to `filter`
Yang Jie created SPARK-48366: Summary: Simplify statements related to `filter` Key: SPARK-48366 URL: https://issues.apache.org/jira/browse/SPARK-48366 Project: Spark Issue Type: Improvement Components: Connect, Spark Core, SQL Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48238) Spark fails to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter
[ https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48238. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46611 [https://github.com/apache/spark/pull/46611] > Spark fail to start due to class > o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter > --- > > Key: SPARK-48238 > URL: https://issues.apache.org/jira/browse/SPARK-48238 > Project: Spark > Issue Type: Bug > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Blocker > Labels: pull-request-available > Fix For: 4.0.0 > > > I tested the latest master branch, it failed to start on YARN mode > {code:java} > dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code} > > {code:java} > $ bin/spark-sql --master yarn > WARNING: Using incubator modules: jdk.incubator.vector > Setting default log level to "WARN". > To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use > setLogLevel(newLevel). > 2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop > library for your platform... using builtin-java classes where applicable > 2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor > spark.yarn.archive} is set, falling back to uploading libraries under > SPARK_HOME. > 2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext. 
> org.sparkproject.jetty.util.MultiException: Multiple exceptions > at > org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) > ~[scala-library-2.13.13.jar:?] > at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) > ~[scala-library-2.13.13.jar:?] > at scala.collection.AbstractIterable.foreach(Iterable.scala:935) > ~[scala-library-2.13.13.jar:?] 
> at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?] > at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?] > at org.apache.spark.SparkContext.(SparkContext.scala:690) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1118) > ~[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.Option.getOrElse(Option.scala:201) [scala-library-2.13.13.jar:?] > at > org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1112) > [spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apach
[jira] [Assigned] (SPARK-48238) Spark fails to start due to class o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter
[ https://issues.apache.org/jira/browse/SPARK-48238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48238: Assignee: Cheng Pan > Spark fail to start due to class > o.a.h.yarn.server.webproxy.amfilter.AmIpFilter is not a jakarta.servlet.Filter > --- > > Key: SPARK-48238 > URL: https://issues.apache.org/jira/browse/SPARK-48238 > Project: Spark > Issue Type: Bug > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Blocker > Labels: pull-request-available > > I tested the latest master branch, it failed to start on YARN mode > {code:java} > dev/make-distribution.sh --tgz -Phive,hive-thriftserver,yarn{code} > > {code:java} > $ bin/spark-sql --master yarn > WARNING: Using incubator modules: jdk.incubator.vector > Setting default log level to "WARN". > To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use > setLogLevel(newLevel). > 2024-05-10 17:58:17 WARN NativeCodeLoader: Unable to load native-hadoop > library for your platform... using builtin-java classes where applicable > 2024-05-10 17:58:18 WARN Client: Neither spark.yarn.jars nor > spark.yarn.archive} is set, falling back to uploading libraries under > SPARK_HOME. > 2024-05-10 17:58:25 ERROR SparkContext: Error initializing SparkContext. 
> org.sparkproject.jetty.util.MultiException: Multiple exceptions > at > org.sparkproject.jetty.util.MultiException.ifExceptionThrow(MultiException.java:117) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.servlet.ServletHandler.initialize(ServletHandler.java:751) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:392) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:902) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:306) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:93) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at org.apache.spark.ui.ServerInfo.addHandler(JettyUtils.scala:514) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2(SparkUI.scala:81) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$2$adapted(SparkUI.scala:81) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.collection.IterableOnceOps.foreach(IterableOnce.scala:619) > ~[scala-library-2.13.13.jar:?] > at scala.collection.IterableOnceOps.foreach$(IterableOnce.scala:617) > ~[scala-library-2.13.13.jar:?] > at scala.collection.AbstractIterable.foreach(Iterable.scala:935) > ~[scala-library-2.13.13.jar:?] 
> at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1(SparkUI.scala:81) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.ui.SparkUI.$anonfun$attachAllHandlers$1$adapted(SparkUI.scala:79) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?] > at org.apache.spark.ui.SparkUI.attachAllHandlers(SparkUI.scala:79) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at org.apache.spark.SparkContext.$anonfun$new$31(SparkContext.scala:690) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.SparkContext.$anonfun$new$31$adapted(SparkContext.scala:690) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.Option.foreach(Option.scala:437) ~[scala-library-2.13.13.jar:?] > at org.apache.spark.SparkContext.(SparkContext.scala:690) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2963) > ~[spark-core_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:1118) > ~[spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at scala.Option.getOrElse(Option.scala:201) [scala-library-2.13.13.jar:?] > at > org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:1112) > [spark-sql_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHOT] > at > org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:64) > [spark-hive-thriftserver_2.13-4.0.0-SNAPSHOT.jar:4.0.0-SNAPSHO
[jira] [Resolved] (SPARK-48242) Upgrade extra-enforcer-rules to 1.8.0
[ https://issues.apache.org/jira/browse/SPARK-48242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48242. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46538 [https://github.com/apache/spark/pull/46538] > Upgrade extra-enforcer-rules to 1.8.0 > - > > Key: SPARK-48242 > URL: https://issues.apache.org/jira/browse/SPARK-48242 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48242) Upgrade extra-enforcer-rules to 1.8.0
[ https://issues.apache.org/jira/browse/SPARK-48242?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48242: Assignee: BingKun Pan > Upgrade extra-enforcer-rules to 1.8.0 > - > > Key: SPARK-48242 > URL: https://issues.apache.org/jira/browse/SPARK-48242 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-48299) Upgrade scala-maven-plugin to 4.9.1
Yang Jie created SPARK-48299: Summary: Upgrade scala-maven-plugin to 4.9.1 Key: SPARK-48299 URL: https://issues.apache.org/jira/browse/SPARK-48299 Project: Spark Issue Type: Improvement Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48274) Upgrade GenJavadoc to 0.19
[ https://issues.apache.org/jira/browse/SPARK-48274?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48274. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46579 [https://github.com/apache/spark/pull/46579] > Upgrade GenJavadoc to 0.19 > --- > > Key: SPARK-48274 > URL: https://issues.apache.org/jira/browse/SPARK-48274 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: BingKun Pan >Assignee: BingKun Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-48257) Polish POM for Hive dependencies
[ https://issues.apache.org/jira/browse/SPARK-48257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48257. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46558 [https://github.com/apache/spark/pull/46558] > Polish POM for Hive dependencies > > > Key: SPARK-48257 > URL: https://issues.apache.org/jira/browse/SPARK-48257 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48257) Polish POM for Hive dependencies
[ https://issues.apache.org/jira/browse/SPARK-48257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48257: Assignee: Cheng Pan > Polish POM for Hive dependencies > > > Key: SPARK-48257 > URL: https://issues.apache.org/jira/browse/SPARK-48257 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-48171) Clean up the use of deprecated APIs related to `o.rocksdb.Logger`
Yang Jie created SPARK-48171: Summary: Clean up the use of deprecated APIs related to `o.rocksdb.Logger` Key: SPARK-48171 URL: https://issues.apache.org/jira/browse/SPARK-48171 Project: Spark Issue Type: Improvement Components: Spark Core Affects Versions: 4.0.0 Reporter: Yang Jie {code:java} /** * AbstractLogger constructor. * * Important: the log level set within * the {@link org.rocksdb.Options} instance will be used as * maximum log level of RocksDB. * * @param options {@link org.rocksdb.Options} instance. * * @deprecated Use {@link Logger#Logger(InfoLogLevel)} instead, e.g. {@code new * Logger(options.infoLogLevel())}. */ @Deprecated public Logger(final Options options) { this(options.infoLogLevel()); } {code} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
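The deprecation notice quoted in SPARK-48171 above spells out the migration itself: pass `options.infoLogLevel()` to `Logger(InfoLogLevel)` instead of passing the whole `Options` object to the deprecated constructor. A self-contained sketch of that before/after pattern; `RocksDbOptions`, `InfoLogLevel`, and `Logger` here are hypothetical stand-ins for the real `org.rocksdb` classes so the example compiles without the RocksDB jar:

```java
public class LoggerMigration {
    enum InfoLogLevel { DEBUG_LEVEL, INFO_LEVEL, WARN_LEVEL }

    // Stand-in for org.rocksdb.Options.
    static class RocksDbOptions {
        private final InfoLogLevel level;
        RocksDbOptions(InfoLogLevel level) { this.level = level; }
        InfoLogLevel infoLogLevel() { return level; }
    }

    // Stand-in for org.rocksdb.Logger.
    static class Logger {
        final InfoLogLevel level;
        // Deprecated pattern: derive the level from the whole options object.
        @Deprecated
        Logger(RocksDbOptions options) { this(options.infoLogLevel()); }
        // Replacement pattern: the caller passes the level explicitly.
        Logger(InfoLogLevel level) { this.level = level; }
    }

    public static void main(String[] args) {
        RocksDbOptions options = new RocksDbOptions(InfoLogLevel.INFO_LEVEL);
        // Before: new Logger(options)                    -- deprecated
        // After:  new Logger(options.infoLogLevel())     -- explicit level
        Logger logger = new Logger(options.infoLogLevel());
        System.out.println(logger.level);
    }
}
```

The call-site change is mechanical, which is why the ticket is filed as a cleanup rather than a behavior change.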
[jira] [Resolved] (SPARK-48138) Disable a flaky `SparkSessionE2ESuite.interrupt tag` test
[ https://issues.apache.org/jira/browse/SPARK-48138?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-48138. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46396 [https://github.com/apache/spark/pull/46396] > Disable a flaky `SparkSessionE2ESuite.interrupt tag` test > - > > Key: SPARK-48138 > URL: https://issues.apache.org/jira/browse/SPARK-48138 > Project: Spark > Issue Type: Sub-task > Components: Connect, Tests >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > > - https://github.com/apache/spark/actions/runs/8962353911/job/24611130573 > (Master, 5/5) > - https://github.com/apache/spark/actions/runs/8948176536/job/24581022674 > (Master, 5/4) -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-48138) Disable a flaky `SparkSessionE2ESuite.interrupt tag` test
[ https://issues.apache.org/jira/browse/SPARK-48138?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-48138: Assignee: Dongjoon Hyun > Disable a flaky `SparkSessionE2ESuite.interrupt tag` test > - > > Key: SPARK-48138 > URL: https://issues.apache.org/jira/browse/SPARK-48138 > Project: Spark > Issue Type: Sub-task > Components: Connect, Tests >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available > > - https://github.com/apache/spark/actions/runs/8962353911/job/24611130573 > (Master, 5/5) > - https://github.com/apache/spark/actions/runs/8948176536/job/24581022674 > (Master, 5/4) -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-48001) Remove unused `private implicit def arrayToArrayWritable` from `SparkContext`
Yang Jie created SPARK-48001: Summary: Remove unused `private implicit def arrayToArrayWritable` from `SparkContext` Key: SPARK-48001 URL: https://issues.apache.org/jira/browse/SPARK-48001 Project: Spark Issue Type: Improvement Components: Spark Core Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47984) Change `MetricsAggregate/V2Aggregator`'s `serialize/deserialize` to call `SparkSerDeUtils`'s `serialize/deserialize` methods.
Yang Jie created SPARK-47984: Summary: Change `MetricsAggregate/V2Aggregator`'s `serialize/deserialize` to call `SparkSerDeUtils`'s `serialize/deserialize` methods. Key: SPARK-47984 URL: https://issues.apache.org/jira/browse/SPARK-47984 Project: Spark Issue Type: Improvement Components: MLlib, SQL Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47928) Speed up test "Add jar support Ivy URI in SQL"
[ https://issues.apache.org/jira/browse/SPARK-47928?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47928. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46150 [https://github.com/apache/spark/pull/46150] > Speed up test "Add jar support Ivy URI in SQL" > -- > > Key: SPARK-47928 > URL: https://issues.apache.org/jira/browse/SPARK-47928 > Project: Spark > Issue Type: Test > Components: SQL >Affects Versions: 3.2.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-47928) Speed up test "Add jar support Ivy URI in SQL"
[ https://issues.apache.org/jira/browse/SPARK-47928?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-47928: Assignee: Cheng Pan > Speed up test "Add jar support Ivy URI in SQL" > -- > > Key: SPARK-47928 > URL: https://issues.apache.org/jira/browse/SPARK-47928 > Project: Spark > Issue Type: Test > Components: SQL >Affects Versions: 3.2.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47932) Avoid using legacy commons-lang
[ https://issues.apache.org/jira/browse/SPARK-47932?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47932. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46154 [https://github.com/apache/spark/pull/46154] > Avoid using legacy commons-lang > --- > > Key: SPARK-47932 > URL: https://issues.apache.org/jira/browse/SPARK-47932 > Project: Spark > Issue Type: Test > Components: SQL, Tests >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Minor > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-47901) Upgrade commons-text to 1.12.0
[ https://issues.apache.org/jira/browse/SPARK-47901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-47901: Assignee: Yang Jie > Upgrade commons-text to 1.12.0 > -- > > Key: SPARK-47901 > URL: https://issues.apache.org/jira/browse/SPARK-47901 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > > https://github.com/apache/commons-text/blob/rel/commons-text-1.12.0/RELEASE-NOTES.txt -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47901) Upgrade commons-text to 1.12.0
[ https://issues.apache.org/jira/browse/SPARK-47901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47901. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46127 [https://github.com/apache/spark/pull/46127] > Upgrade commons-text to 1.12.0 > -- > > Key: SPARK-47901 > URL: https://issues.apache.org/jira/browse/SPARK-47901 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > > https://github.com/apache/commons-text/blob/rel/commons-text-1.12.0/RELEASE-NOTES.txt -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47901) UPgrade commons-text to 1.12.0
Yang Jie created SPARK-47901: Summary: UPgrade commons-text to 1.12.0 Key: SPARK-47901 URL: https://issues.apache.org/jira/browse/SPARK-47901 Project: Spark Issue Type: Improvement Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie https://github.com/apache/commons-text/blob/rel/commons-text-1.12.0/RELEASE-NOTES.txt -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-47901) Upgrade commons-text to 1.12.0
[ https://issues.apache.org/jira/browse/SPARK-47901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-47901: - Summary: Upgrade commons-text to 1.12.0 (was: UPgrade commons-text to 1.12.0) > Upgrade commons-text to 1.12.0 > -- > > Key: SPARK-47901 > URL: https://issues.apache.org/jira/browse/SPARK-47901 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Major > > https://github.com/apache/commons-text/blob/rel/commons-text-1.12.0/RELEASE-NOTES.txt -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-47850) Support converting insert for unpartitioned Hive table
[ https://issues.apache.org/jira/browse/SPARK-47850?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-47850: Assignee: Cheng Pan > Support converting insert for unpartitioned Hive table > -- > > Key: SPARK-47850 > URL: https://issues.apache.org/jira/browse/SPARK-47850 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47850) Support converting insert for unpartitioned Hive table
[ https://issues.apache.org/jira/browse/SPARK-47850?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47850. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 46052 [https://github.com/apache/spark/pull/46052] > Support converting insert for unpartitioned Hive table > -- > > Key: SPARK-47850 > URL: https://issues.apache.org/jira/browse/SPARK-47850 > Project: Spark > Issue Type: Improvement > Components: SQL >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47887) Remove unused import `spark/connect/common.proto` from `spark/connect/relations.proto`
Yang Jie created SPARK-47887: Summary: Remove unused import `spark/connect/common.proto` from `spark/connect/relations.proto` Key: SPARK-47887 URL: https://issues.apache.org/jira/browse/SPARK-47887 Project: Spark Issue Type: Improvement Components: Connect Affects Versions: 4.0.0 Reporter: Yang Jie Fix compile warning: {code:java} spark/connect/relations.proto:26:1: warning: Import spark/connect/common.proto is unused. {code} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47834) Mark deprecated functions with `@deprecated` in `SQLImplicits`
Yang Jie created SPARK-47834: Summary: Mark deprecated functions with `@deprecated` in `SQLImplicits` Key: SPARK-47834 URL: https://issues.apache.org/jira/browse/SPARK-47834 Project: Spark Issue Type: Improvement Components: Connect, Spark Core Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47770) Fix `GenerateMIMAIgnore.isPackagePrivateModule` to return false instead of failing
[ https://issues.apache.org/jira/browse/SPARK-47770?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47770. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 45938 [https://github.com/apache/spark/pull/45938] > Fix `GenerateMIMAIgnore.isPackagePrivateModule` to return false instead of > failing > -- > > Key: SPARK-47770 > URL: https://issues.apache.org/jira/browse/SPARK-47770 > Project: Spark > Issue Type: Sub-task > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47709) Upgrade tink to 1.13.0
Yang Jie created SPARK-47709: Summary: Upgrade tink to 1.13.0 Key: SPARK-47709 URL: https://issues.apache.org/jira/browse/SPARK-47709 Project: Spark Issue Type: Improvement Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie [https://github.com/tink-crypto/tink-java/releases/tag/v1.13.0] * AES-GCM is now about 20% faster. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-47685) Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF`
[ https://issues.apache.org/jira/browse/SPARK-47685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-47685: Assignee: Yang Jie > Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF` > -- > > Key: SPARK-47685 > URL: https://issues.apache.org/jira/browse/SPARK-47685 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47685) Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF`
[ https://issues.apache.org/jira/browse/SPARK-47685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47685. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 45811 [https://github.com/apache/spark/pull/45811] > Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF` > -- > > Key: SPARK-47685 > URL: https://issues.apache.org/jira/browse/SPARK-47685 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47686) Use `=!=` instead of `!==` in `JoinHintSuite`
Yang Jie created SPARK-47686: Summary: Use `=!=` instead of `!==` in `JoinHintSuite` Key: SPARK-47686 URL: https://issues.apache.org/jira/browse/SPARK-47686 Project: Spark Issue Type: Improvement Components: SQL, Tests Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-47685) Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF`
[ https://issues.apache.org/jira/browse/SPARK-47685?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-47685: - Summary: Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF` (was: Restore the handling of `Stream` in `RelationalGroupedDataset#toDF`) > Should restore the handling of `Stream` in `RelationalGroupedDataset#toDF` > -- > > Key: SPARK-47685 > URL: https://issues.apache.org/jira/browse/SPARK-47685 > Project: Spark > Issue Type: Bug > Components: SQL >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47685) Restore the handling of `Stream` in `RelationalGroupedDataset#toDF`
Yang Jie created SPARK-47685: Summary: Restore the handling of `Stream` in `RelationalGroupedDataset#toDF` Key: SPARK-47685 URL: https://issues.apache.org/jira/browse/SPARK-47685 Project: Spark Issue Type: Bug Components: SQL Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-45593) Building a runnable distribution from master code running spark-sql raise error "java.lang.ClassNotFoundException: org.sparkproject.guava.util.concurrent.internal.Intern
[ https://issues.apache.org/jira/browse/SPARK-45593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-45593: - Affects Version/s: 3.5.1 > Building a runnable distribution from master code running spark-sql raise > error "java.lang.ClassNotFoundException: > org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess" > --- > > Key: SPARK-45593 > URL: https://issues.apache.org/jira/browse/SPARK-45593 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0, 3.5.1 >Reporter: yikaifei >Assignee: yikaifei >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0, 3.5.2 > > > Building a runnable distribution from master code running spark-sql raise > error "java.lang.ClassNotFoundException: > org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess"; > Reproducing steps, first, clone spark master code, then: > # Build runnable distribution from master code by : > `/dev/make-distribution.sh --name ui --pip --tgz -Phive -Phive-thriftserver > -Pyarn -Pconnect` > # Install runnable distribution package > # Run `bin/spark-sql` > Got error: > {code:java} > 23/10/18 20:51:46 WARN NativeCodeLoader: Unable to load native-hadoop > library for your platform... 
using builtin-java classes where applicable > Exception in thread "main" java.lang.NoClassDefFoundError: > org/sparkproject/guava/util/concurrent/internal/InternalFutureFailureAccess > at java.base/java.lang.ClassLoader.defineClass1(Native Method) > at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012) > at > java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150) > at > java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862) > at > java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639) > at > java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) > at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520) > at java.base/java.lang.ClassLoader.defineClass1(Native Method) > at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012) > at > java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150) > at > java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862) > at > java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639) > at > java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) > at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520) > at java.base/java.lang.ClassLoader.defineClass1(Native Method) > at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012) > at > 
java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150) > at > java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862) > at > java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639) > at > java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) > at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520) > at > org.sparkproject.guava.cache.LocalCache$LoadingValueReference.(LocalCache.java:3511) > at > org.sparkproject.guava.cache.LocalCache$LoadingValueReference.(LocalCache.java:3515) > at > org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2168) > at > org.sparkproject.guava.cache.LocalCache$Segment.get(LocalCache.java:2079) > at org.sparkproject.guava.cache.LocalCache.get(LocalCache.java:4011) > at org.sparkproject.guava.cache.LocalCache.getOrLoad(LocalCache.java:4034) > at > org.sparkproject.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) > at > org.a
[jira] [Updated] (SPARK-45593) Building a runnable distribution from master code running spark-sql raise error "java.lang.ClassNotFoundException: org.sparkproject.guava.util.concurrent.internal.Intern
[ https://issues.apache.org/jira/browse/SPARK-45593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-45593: - Fix Version/s: 3.5.2 > Building a runnable distribution from master code running spark-sql raise > error "java.lang.ClassNotFoundException: > org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess" > --- > > Key: SPARK-45593 > URL: https://issues.apache.org/jira/browse/SPARK-45593 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: yikaifei >Assignee: yikaifei >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0, 3.5.2 > > > Building a runnable distribution from master code running spark-sql raise > error "java.lang.ClassNotFoundException: > org.sparkproject.guava.util.concurrent.internal.InternalFutureFailureAccess"; > Reproducing steps, first, clone spark master code, then: > # Build runnable distribution from master code by : > `/dev/make-distribution.sh --name ui --pip --tgz -Phive -Phive-thriftserver > -Pyarn -Pconnect` > # Install runnable distribution package > # Run `bin/spark-sql` > Got error: > {code:java} > 23/10/18 20:51:46 WARN NativeCodeLoader: Unable to load native-hadoop > library for your platform... 
using builtin-java classes where applicable > Exception in thread "main" java.lang.NoClassDefFoundError: > org/sparkproject/guava/util/concurrent/internal/InternalFutureFailureAccess > at java.base/java.lang.ClassLoader.defineClass1(Native Method) > at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012) > at > java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150) > at > java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862) > at > java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639) > at > java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) > at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520) > at java.base/java.lang.ClassLoader.defineClass1(Native Method) > at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012) > at > java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150) > at > java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862) > at > java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639) > at > java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) > at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520) > at java.base/java.lang.ClassLoader.defineClass1(Native Method) > at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1012) > at > 
java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150) > at > java.base/jdk.internal.loader.BuiltinClassLoader.defineClass(BuiltinClassLoader.java:862) > at > java.base/jdk.internal.loader.BuiltinClassLoader.findClassOnClassPathOrNull(BuiltinClassLoader.java:760) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClassOrNull(BuiltinClassLoader.java:681) > at > java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:639) > at > java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) > at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520) > at > org.sparkproject.guava.cache.LocalCache$LoadingValueReference.(LocalCache.java:3511) > at > org.sparkproject.guava.cache.LocalCache$LoadingValueReference.(LocalCache.java:3515) > at > org.sparkproject.guava.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2168) > at > org.sparkproject.guava.cache.LocalCache$Segment.get(LocalCache.java:2079) > at org.sparkproject.guava.cache.LocalCache.get(LocalCache.java:4011) > at org.sparkproject.guava.cache.LocalCache.getOrLoad(LocalCache.java:4034) > at > org.sparkproject.guava.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5010) > at > org.apache.spark
[jira] [Updated] (SPARK-47645) Make Spark build with -release instead of -target
[ https://issues.apache.org/jira/browse/SPARK-47645?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-47645: - Description: https://github.com/scala/scala/pull/9982 > Make Spark build with -release instead of -target > -- > > Key: SPARK-47645 > URL: https://issues.apache.org/jira/browse/SPARK-47645 > Project: Spark > Issue Type: Improvement > Components: Build, Spark Core, SQL, YARN >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Major > > https://github.com/scala/scala/pull/9982 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
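For context on the linked scala/scala change: `-target` only sets the emitted bytecode version, while `-release` (like javac's `--release`) additionally compiles against the platform API signatures of that Java version, so accidental use of newer JDK APIs fails at compile time rather than at runtime on an older JVM. A hypothetical compiler-arguments fragment showing the shape of the switch (the plugin configuration and the Java version here are illustrative, not taken from Spark's actual pom.xml):

```xml
<!-- Before: bytecode version only; newer-JDK APIs still resolve at compile time -->
<args>
  <arg>-target:17</arg>
</args>

<!-- After: also checks API usage against the Java 17 platform -->
<args>
  <arg>-release</arg>
  <arg>17</arg>
</args>
```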
[jira] [Created] (SPARK-47645) Make Spark build with -release instead of -target
Yang Jie created SPARK-47645: Summary: Make Spark build with -release instead of -target Key: SPARK-47645 URL: https://issues.apache.org/jira/browse/SPARK-47645 Project: Spark Issue Type: Improvement Components: Build, Spark Core, SQL, YARN Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47629) Add `common/variant` and `connector/kinesis-asl` to maven daily test module list
[ https://issues.apache.org/jira/browse/SPARK-47629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47629. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 45754 [https://github.com/apache/spark/pull/45754] > Add `common/variant` and `connector/kinesis-asl` to maven daily test module > list > > > Key: SPARK-47629 > URL: https://issues.apache.org/jira/browse/SPARK-47629 > Project: Spark > Issue Type: Improvement > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-47629) Add `common/variant` and `connector/kinesis-asl` to maven daily test module list
[ https://issues.apache.org/jira/browse/SPARK-47629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-47629: Assignee: Yang Jie > Add `common/variant` and `connector/kinesis-asl` to maven daily test module > list > > > Key: SPARK-47629 > URL: https://issues.apache.org/jira/browse/SPARK-47629 > Project: Spark > Issue Type: Improvement > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-47642) Exclude `org.junit.jupiter` and `org.junit.platform` from `jmock-junit5`
[ https://issues.apache.org/jira/browse/SPARK-47642?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-47642: - Summary: Exclude `org.junit.jupiter` and `org.junit.platform` from `jmock-junit5` (was: Exclude `junit-jupiter-api` and `org.junit.platform` from `jmock-junit5`) > Exclude `org.junit.jupiter` and `org.junit.platform` from `jmock-junit5` > > > Key: SPARK-47642 > URL: https://issues.apache.org/jira/browse/SPARK-47642 > Project: Spark > Issue Type: Bug > Components: Build >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Major > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47642) Exclude `junit-jupiter-api` and `org.junit.platform` from `jmock-junit5`
Yang Jie created SPARK-47642: Summary: Exclude `junit-jupiter-api` and `org.junit.platform` from `jmock-junit5` Key: SPARK-47642 URL: https://issues.apache.org/jira/browse/SPARK-47642 Project: Spark Issue Type: Bug Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-47629) Add `common/variant` and `connector/kinesis-asl` to maven daily test module list
[ https://issues.apache.org/jira/browse/SPARK-47629?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-47629: - Summary: Add `common/variant` and `connector/kinesis-asl` to maven daily test module list (was: Add `common/variant` to maven daily test module list) > Add `common/variant` and `connector/kinesis-asl` to maven daily test module > list > > > Key: SPARK-47629 > URL: https://issues.apache.org/jira/browse/SPARK-47629 > Project: Spark > Issue Type: Improvement > Components: Project Infra >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47629) Add `common/variant` to maven daily test module list
Yang Jie created SPARK-47629: Summary: Add `common/variant` to maven daily test module list Key: SPARK-47629 URL: https://issues.apache.org/jira/browse/SPARK-47629 Project: Spark Issue Type: Improvement Components: Project Infra Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47610) Always set io.netty.tryReflectionSetAccessible=true
[ https://issues.apache.org/jira/browse/SPARK-47610?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47610. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 45733 [https://github.com/apache/spark/pull/45733] > Always set io.netty.tryReflectionSetAccessible=true > --- > > Key: SPARK-47610 > URL: https://issues.apache.org/jira/browse/SPARK-47610 > Project: Spark > Issue Type: Improvement > Components: Build, Spark Core >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-47610) Always set io.netty.tryReflectionSetAccessible=true
[ https://issues.apache.org/jira/browse/SPARK-47610?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-47610: Assignee: Cheng Pan > Always set io.netty.tryReflectionSetAccessible=true > --- > > Key: SPARK-47610 > URL: https://issues.apache.org/jira/browse/SPARK-47610 > Project: Spark > Issue Type: Improvement > Components: Build, Spark Core >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47536) Upgrade jmock-junit5 to 2.13.1
Yang Jie created SPARK-47536: Summary: Upgrade jmock-junit5 to 2.13.1 Key: SPARK-47536 URL: https://issues.apache.org/jira/browse/SPARK-47536 Project: Spark Issue Type: Improvement Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie https://github.com/jmock-developers/jmock-library/releases/tag/2.13.1 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47523) Replace Deprecated `JsonParser#getCurrentName` with `JsonParser#currentName`
Yang Jie created SPARK-47523: Summary: Replace Deprecated `JsonParser#getCurrentName` with `JsonParser#currentName` Key: SPARK-47523 URL: https://issues.apache.org/jira/browse/SPARK-47523 Project: Spark Issue Type: Improvement Components: SQL Affects Versions: 4.0.0 Reporter: Yang Jie [https://github.com/FasterXML/jackson-core/blob/8fba680579885bf9cdae72e93f16de557056d6e3/src/main/java/com/fasterxml/jackson/core/JsonParser.java#L1521-L1551] {code:java} /** * Deprecated alias of {@link #currentName()}. * * @return Name of the current field in the parsing context * * @throws IOException for low-level read issues, or * {@link JsonParseException} for decoding problems * * @deprecated Since 2.17 use {@link #currentName} instead. */ @Deprecated public abstract String getCurrentName() throws IOException; /** * Method that can be called to get the name associated with * the current token: for {@link JsonToken#FIELD_NAME}s it will * be the same as what {@link #getText} returns; * for field values it will be preceding field name; * and for others (array values, root-level values) null. * * @return Name of the current field in the parsing context * * @throws IOException for low-level read issues, or * {@link JsonParseException} for decoding problems * * @since 2.10 */ public String currentName() throws IOException { // !!! TODO: switch direction in 2.18 or later return getCurrentName(); } {code} -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
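The Jackson excerpt above shows that `getCurrentName()` is now just a deprecated alias of `currentName()`, so the migration is a mechanical rename at call sites. The following toy stand-in (not Jackson code; class and field names are invented for illustration) mirrors that deprecated-alias structure and the one-line change callers make:

```java
// Toy stand-in for the JsonParser deprecation pattern quoted above.
// ToyParser is a hypothetical class, NOT part of Jackson.
public class CurrentNameDemo {
    public static class ToyParser {
        private final String field;
        public ToyParser(String field) { this.field = field; }

        /** Deprecated alias of {@link #currentName()}, as in Jackson 2.17. */
        @Deprecated
        public String getCurrentName() { return currentName(); }

        /** Preferred accessor; the real method exists since Jackson 2.10. */
        public String currentName() { return field; }
    }

    public static void main(String[] args) {
        ToyParser p = new ToyParser("amount");
        // Before: p.getCurrentName()  (emits a deprecation warning)
        // After:
        System.out.println(p.currentName());
    }
}
```

Because the alias delegates to the new method, both accessors return the same value, which is why the Spark-side replacement is behavior-preserving.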
[jira] [Assigned] (SPARK-46920) Improve executor exit error message on YARN
[ https://issues.apache.org/jira/browse/SPARK-46920?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-46920: Assignee: Cheng Pan > Improve executor exit error message on YARN > --- > > Key: SPARK-46920 > URL: https://issues.apache.org/jira/browse/SPARK-46920 > Project: Spark > Issue Type: Improvement > Components: YARN >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-46920) Improve executor exit error message on YARN
[ https://issues.apache.org/jira/browse/SPARK-46920?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-46920. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 44951 [https://github.com/apache/spark/pull/44951] > Improve executor exit error message on YARN > --- > > Key: SPARK-46920 > URL: https://issues.apache.org/jira/browse/SPARK-46920 > Project: Spark > Issue Type: Improvement > Components: YARN >Affects Versions: 4.0.0 >Reporter: Cheng Pan >Assignee: Cheng Pan >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47486) Remove unused private method `getString` from `ArrowDeserializers`
Yang Jie created SPARK-47486: Summary: Remove unused private method `getString` from `ArrowDeserializers` Key: SPARK-47486 URL: https://issues.apache.org/jira/browse/SPARK-47486 Project: Spark Issue Type: Improvement Components: Connect Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47455) Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala
[ https://issues.apache.org/jira/browse/SPARK-47455?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47455. -- Fix Version/s: 3.4.3 3.5.2 4.0.0 Resolution: Fixed Issue resolved by pull request 45582 [https://github.com/apache/spark/pull/45582] > Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala > > > Key: SPARK-47455 > URL: https://issues.apache.org/jira/browse/SPARK-47455 > Project: Spark > Issue Type: Bug > Components: Project Infra >Affects Versions: 3.4.2, 4.0.0, 3.5.1 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Minor > Labels: pull-request-available > Fix For: 3.4.3, 3.5.2, 4.0.0 > > > [https://github.com/apache/spark/blob/e01ed0da22f24204fe23143032ff39be7f4b56af/project/SparkBuild.scala#L157-L173] > > {code:java} > val scalaStyleOnCompileConfig: String = { > val in = "scalastyle-config.xml" > val out = "scalastyle-on-compile.generated.xml" > val replacements = Map( > """customId="println" level="error"""" -> """customId="println" level="warn"""" > ) > var contents = Source.fromFile(in).getLines.mkString("\n") > for ((k, v) <- replacements) { > require(contents.contains(k), s"Could not rewrite '$k' in original scalastyle config.") > contents = contents.replace(k, v) > } > new PrintWriter(out) { > write(contents) > close() > } > out > } {code} > `Source.fromFile(in)` opens a `BufferedSource` resource handle, but it does > not close it. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Assigned] (SPARK-47455) Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala
[ https://issues.apache.org/jira/browse/SPARK-47455?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie reassigned SPARK-47455: Assignee: Yang Jie > Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala > > > Key: SPARK-47455 > URL: https://issues.apache.org/jira/browse/SPARK-47455 > Project: Spark > Issue Type: Bug > Components: Project Infra >Affects Versions: 3.4.2, 4.0.0, 3.5.1 >Reporter: Yang Jie >Assignee: Yang Jie >Priority: Minor > Labels: pull-request-available > > [https://github.com/apache/spark/blob/e01ed0da22f24204fe23143032ff39be7f4b56af/project/SparkBuild.scala#L157-L173] > > {code:java} > val scalaStyleOnCompileConfig: String = { > val in = "scalastyle-config.xml" > val out = "scalastyle-on-compile.generated.xml" > val replacements = Map( > """customId="println" level="error"""" -> """customId="println" level="warn"""" > ) > var contents = Source.fromFile(in).getLines.mkString("\n") > for ((k, v) <- replacements) { > require(contents.contains(k), s"Could not rewrite '$k' in original scalastyle config.") > contents = contents.replace(k, v) > } > new PrintWriter(out) { > write(contents) > close() > } > out > } {code} > `Source.fromFile(in)` opens a `BufferedSource` resource handle, but it does > not close it. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47474) Revert change of SPARK-47461 and add some comments
Yang Jie created SPARK-47474: Summary: Revert change of SPARK-47461 and add some comments Key: SPARK-47474 URL: https://issues.apache.org/jira/browse/SPARK-47474 Project: Spark Issue Type: Improvement Components: Spark Core Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-47461) Remove unused private function `totalRunningTasksPerResourceProfile` from `ExecutorAllocationManager`
[ https://issues.apache.org/jira/browse/SPARK-47461?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-47461: - Summary: Remove unused private function `totalRunningTasksPerResourceProfile` from `ExecutorAllocationManager` (was: Remove the unused private function `totalRunningTasksPerResourceProfile` from `ExecutorAllocationManager`) > Remove unused private function `totalRunningTasksPerResourceProfile` from > `ExecutorAllocationManager` > - > > Key: SPARK-47461 > URL: https://issues.apache.org/jira/browse/SPARK-47461 > Project: Spark > Issue Type: Improvement > Components: Spark Core >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Minor > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47461) Remove the unused private function `totalRunningTasksPerResourceProfile` from `ExecutorAllocationManager`
Yang Jie created SPARK-47461: Summary: Remove the unused private function `totalRunningTasksPerResourceProfile` from `ExecutorAllocationManager` Key: SPARK-47461 URL: https://issues.apache.org/jira/browse/SPARK-47461 Project: Spark Issue Type: Improvement Components: Spark Core Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47455) Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala
Yang Jie created SPARK-47455: Summary: Fix Resource Handling of `scalaStyleOnCompileConfig` in SparkBuild.scala Key: SPARK-47455 URL: https://issues.apache.org/jira/browse/SPARK-47455 Project: Spark Issue Type: Bug Components: Project Infra Affects Versions: 3.5.1, 3.4.2, 4.0.0 Reporter: Yang Jie [https://github.com/apache/spark/blob/e01ed0da22f24204fe23143032ff39be7f4b56af/project/SparkBuild.scala#L157-L173] {code:java} val scalaStyleOnCompileConfig: String = { val in = "scalastyle-config.xml" val out = "scalastyle-on-compile.generated.xml" val replacements = Map( """customId="println" level="error"""" -> """customId="println" level="warn"""" ) var contents = Source.fromFile(in).getLines.mkString("\n") for ((k, v) <- replacements) { require(contents.contains(k), s"Could not rewrite '$k' in original scalastyle config.") contents = contents.replace(k, v) } new PrintWriter(out) { write(contents) close() } out } {code} `Source.fromFile(in)` opens a `BufferedSource` resource handle, but it does not close it. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
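The leak described above comes from opening a reader without a matching close. As a minimal sketch of the fix pattern (in Java rather than the build's Scala, and not the actual SPARK-47455 patch; file names and the helper name are placeholders), the same read-replace-write flow can be written so the file handle is closed deterministically:

```java
import java.io.IOException;
import java.io.PrintWriter;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Sketch of the read-replace-write flow from scalaStyleOnCompileConfig,
// with the reader closed via try-with-resources (the analogue of closing
// the BufferedSource that Source.fromFile(in) leaves open).
public class StyleConfigRewrite {
    public static void rewrite(Path in, Path out, String from, String to) throws IOException {
        String contents;
        // The stream (and the underlying file handle) is closed when this
        // block exits, even if reading throws.
        try (Stream<String> lines = Files.lines(in)) {
            contents = lines.collect(Collectors.joining("\n"));
        }
        if (!contents.contains(from)) {
            throw new IllegalArgumentException("Could not rewrite '" + from + "' in original config.");
        }
        // The writer is likewise closed deterministically.
        try (PrintWriter writer = new PrintWriter(out.toFile())) {
            writer.write(contents.replace(from, to));
        }
    }
}
```

In Scala the same guarantee is usually obtained with `scala.util.Using` or an explicit `try`/`finally` around `Source.fromFile(in)`.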
[jira] [Commented] (SPARK-47369) Fix performance regression in JDK 17 caused from RocksDB logging
[ https://issues.apache.org/jira/browse/SPARK-47369?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17825945#comment-17825945 ] Yang Jie commented on SPARK-47369: -- From the current Spark code, it appears that the {{Logger}} is only set for the {{RocksDB}} instance built for the external shuffle db (in RocksDBProvider), and not for other parts. However, it seems that the Spark code does not actively print RocksDB-related logs (perhaps my confirmation method is incorrect; could you provide a way to confirm it? [~neilramaswamy]) > Fix performance regression in JDK 17 caused from RocksDB logging > > > Key: SPARK-47369 > URL: https://issues.apache.org/jira/browse/SPARK-47369 > Project: Spark > Issue Type: Bug > Components: Structured Streaming >Affects Versions: 3.3.0, 3.3.1, 3.3.3, 3.4.2, 3.3.2, 3.4.0, 3.4.1, 3.5.0, > 3.5.1, 3.3.4 >Reporter: Neil Ramaswamy >Priority: Major > > JDK 17 has a performance regression in the JNI's AttachCurrentThread and > DetachCurrentThread calls, as reported here: > [https://bugs.openjdk.org/browse/JDK-8314859]. You can find a minimal > reproduction of the JDK issue in that bug report. I have marked as affected > versions 3.3.0^ since that is when JDK 17 started being offered in Spark. > For context, every time RocksDB logs, it currently [attaches itself to the > JVM|https://github.com/facebook/rocksdb/blob/main/java/rocksjni/loggerjnicallback.cc#L140], > invokes the RocksDB [logging callback that we > specify|https://github.com/apache/spark/blob/8fcef1657a02189f91d5485eabb5b165706cdce9/sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/state/RocksDB.scala#L839], > and then [detaches itself from the > JVM|https://github.com/facebook/rocksdb/blob/main/java/rocksjni/loggerjnicallback.cc#L170]. > These attach/detach calls regressed, causing JDK 17 SS queries to run up to > 10-15% slower than their respective JDK 8 queries. 
> For example, a 100K record/second dropDuplicates had a p95 latency regression > of 12%. A regression of 12% and 21% (at the p95) was observed for a query > with 1M record/second, 100K keys, 10 second windows, and 0 second watermark. > Because the Hotspot folks marked this as "Won't fix," one way to fix this is > to avoid the JNI entirely and write the RocksDB to stderr. RocksDB [8.11.3 > natively supports > this|https://github.com/facebook/rocksdb/wiki/Logging-in-RocksJava#configuring-a-native-logger] > (I implemented that feature in RocksJava). We can configure our RocksDB > logger to do its logging this way. -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47278) Upgrade rocksdbjni to 8.11.3
Yang Jie created SPARK-47278: Summary: Upgrade rocksdbjni to 8.11.3 Key: SPARK-47278 URL: https://issues.apache.org/jira/browse/SPARK-47278 Project: Spark Issue Type: Sub-task Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47269) Upgrade jetty to 11.0.20
Yang Jie created SPARK-47269: Summary: Upgrade jetty to 11.0.20 Key: SPARK-47269 URL: https://issues.apache.org/jira/browse/SPARK-47269 Project: Spark Issue Type: Improvement Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie fix * [CVE-2024-22201|https://github.com/advisories/GHSA-rggv-cv7r-mw98] - HTTP/2 connection not closed after idle timeout when TCP congested -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-44173) Make Spark an sbt build only project
[ https://issues.apache.org/jira/browse/SPARK-44173?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17822747#comment-17822747 ] Yang Jie commented on SPARK-44173: -- Hi, [~dongjoon] ~ Sorry, I missed the previous message. This Jira was created based on some discussions in https://github.com/apache/spark/pull/40317. With the establishment of the Maven daily test pipeline, we now have a way to discover problems in Maven tests in a timely manner, so the rationale in this Jira's description has become less critical. I agree with your point; thank you for converting this to a normal Jira :) > Make Spark an sbt build only project > > > Key: SPARK-44173 > URL: https://issues.apache.org/jira/browse/SPARK-44173 > Project: Spark > Issue Type: Improvement > Components: Build >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Minor > > Supporting both Maven and SBT always brings various testing problems and > increases the complexity of writing test code > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47243) Correct the package name of `StateMetadataSource.scala`
Yang Jie created SPARK-47243: Summary: Correct the package name of `StateMetadataSource.scala` Key: SPARK-47243 URL: https://issues.apache.org/jira/browse/SPARK-47243 Project: Spark Issue Type: Improvement Components: Structured Streaming Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47229) Change the never changed 'var' to 'val'
Yang Jie created SPARK-47229: Summary: Change the never changed 'var' to 'val' Key: SPARK-47229 URL: https://issues.apache.org/jira/browse/SPARK-47229 Project: Spark Issue Type: Improvement Components: Spark Core, SQL, YARN Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-46919) Upgrade `grpcio*` and `grpc-java` to 1.62
[ https://issues.apache.org/jira/browse/SPARK-46919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-46919: - Parent: SPARK-47046 Issue Type: Sub-task (was: Improvement) > Upgrade `grpcio*` and `grpc-java` to 1.62 > - > > Key: SPARK-46919 > URL: https://issues.apache.org/jira/browse/SPARK-46919 > Project: Spark > Issue Type: Sub-task > Components: Build, Connect >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-46919) Upgrade `grpcio*` and `grpc-java` to 1.62
[ https://issues.apache.org/jira/browse/SPARK-46919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie updated SPARK-46919: - Summary: Upgrade `grpcio*` and `grpc-java` to 1.62 (was: Upgrade `grpcio*` to 1.60.0 and `grpc-java` to 1.61.0) > Upgrade `grpcio*` and `grpc-java` to 1.62 > - > > Key: SPARK-46919 > URL: https://issues.apache.org/jira/browse/SPARK-46919 > Project: Spark > Issue Type: Improvement > Components: Build, Connect >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47209) Upgrade slf4j to 2.0.12
Yang Jie created SPARK-47209: Summary: Upgrade slf4j to 2.0.12 Key: SPARK-47209 URL: https://issues.apache.org/jira/browse/SPARK-47209 Project: Spark Issue Type: Sub-task Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie https://www.slf4j.org/news.html#2.0.12 -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Comment Edited] (SPARK-47194) Upgrade log4j2 to 2.23.0
[ https://issues.apache.org/jira/browse/SPARK-47194?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17821543#comment-17821543 ] Yang Jie edited comment on SPARK-47194 at 2/28/24 7:19 AM: --- It seems that the `-Dlog4j2.debug` option may not be working in 2.23.0, so perhaps we should skip this upgrade. I have tested the following scenarios: 1. run `dev/make-distribution.sh --tgz` to build a Spark Client 2. add `log4j2.properties` and `spark-defaults.conf` with the same content as test case `Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties` ``` log4j2.properties # This log4j config file is for integration test SparkConfPropagateSuite. rootLogger.level = debug rootLogger.appenderRef.stdout.ref = console appender.console.type = Console appender.console.name = console appender.console.target = SYSTEM_ERR appender.console.layout.type = PatternLayout appender.console.layout.pattern = %d\{HH:mm:ss.SSS} %p %c: %maxLen\{%m} {512} %n%ex\{8}%n ``` ``` spark-defaults.conf spark.driver.extraJavaOptions -Dlog4j2.debug spark.executor.extraJavaOptions -Dlog4j2.debug spark.kubernetes.executor.deleteOnTermination false ``` 3. run `bin/run-example SparkPi` When using log4j 2.22.1, we get the following log: ``` ... TRACE StatusLogger DefaultConfiguration cleaning Appenders from 1 LoggerConfigs. DEBUG StatusLogger Stopped org.apache.logging.log4j.core.config.DefaultConfiguration@384ad17b OK TRACE StatusLogger Reregistering MBeans after reconfigure. 
Selector=org.apache.logging.log4j.core.selector.ClassLoaderContextSelector@5852c06f TRACE StatusLogger Reregistering context (1/1): '5ffd2b27' org.apache.logging.log4j.core.LoggerContext@31190526 TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=StatusLogger' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=ContextSelector' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=Appenders,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=AsyncAppenders,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=AsyncLoggerRingBuffer' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=*,subtype=RingBuffer' DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27 DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27,component=StatusLogger DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27,component=ContextSelector DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name= DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27,component=Appenders,name=console TRACE StatusLogger Using default SystemClock for timestamps. DEBUG StatusLogger org.apache.logging.log4j.core.util.SystemClock supports precise timestamps. TRACE StatusLogger Using DummyNanoClock for nanosecond timestamps. 
DEBUG StatusLogger Reconfiguration complete for context[name=5ffd2b27] at URI /Users/yangjie01/Tools/4.0/spark-4.0.0-SNAPSHOT-bin-3.3.6/conf/log4j2.properties (org.apache.logging.log4j.core.LoggerContext@31190526) with optional ClassLoader: null DEBUG StatusLogger Shutdown hook enabled. Registering a new one. ... ``` But when using log4j 2.23.0, no logs related to `StatusLogger` are printed. So let's skip this upgrade was (Author: luciferyang): It seems that the `-Dlog4j2.debug` option may not be working in 2.23.0, perhaps we should skip this upgrade. I have tested the following scenarios: 1. run `dev/make-distribution.sh --tgz` to build a Spark Client 2. add `log4j2.properties` and `spark-defaults.conf` with the same content as test case `Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties` ``` log4j2.properties # This log4j config file is for integration test SparkConfPropagateSuite. rootLogger.level = debug rootLogger.appenderRef.stdout.ref = console appender.console.type = Console appender.console.name = console appender.console.target = SYSTEM_ERR appender.console.layout.type = PatternLayout appender.console.layout.pattern = %d\{HH:mm:ss.SSS} %p %c: %maxLen\{%m}{512}%n%ex\{8}%n ``` ``` spark-defaults.conf spark.driver.extraJavaOptions -Dlog4j2.debug spark.executor.extraJavaOptions -Dlog4j2.debug spark.kubernetes.executor.deleteOnT
[jira] [Resolved] (SPARK-47194) Upgrade log4j2 to 2.23.0
[ https://issues.apache.org/jira/browse/SPARK-47194?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47194. -- Resolution: Won't Fix It seems that the `-Dlog4j2.debug` option may not be working in 2.23.0, so perhaps we should skip this upgrade. I have tested the following scenarios: 1. run `dev/make-distribution.sh --tgz` to build a Spark Client 2. add `log4j2.properties` and `spark-defaults.conf` with the same content as test case `Verify logging configuration is picked from the provided SPARK_CONF_DIR/log4j2.properties` ``` log4j2.properties # This log4j config file is for integration test SparkConfPropagateSuite. rootLogger.level = debug rootLogger.appenderRef.stdout.ref = console appender.console.type = Console appender.console.name = console appender.console.target = SYSTEM_ERR appender.console.layout.type = PatternLayout appender.console.layout.pattern = %d\{HH:mm:ss.SSS} %p %c: %maxLen\{%m}{512}%n%ex\{8}%n ``` ``` spark-defaults.conf spark.driver.extraJavaOptions -Dlog4j2.debug spark.executor.extraJavaOptions -Dlog4j2.debug spark.kubernetes.executor.deleteOnTermination false ``` 3. run `bin/run-example SparkPi` When using log4j 2.22.1, we get the following log: ``` ... TRACE StatusLogger DefaultConfiguration cleaning Appenders from 1 LoggerConfigs. DEBUG StatusLogger Stopped org.apache.logging.log4j.core.config.DefaultConfiguration@384ad17b OK TRACE StatusLogger Reregistering MBeans after reconfigure. 
Selector=org.apache.logging.log4j.core.selector.ClassLoaderContextSelector@5852c06f TRACE StatusLogger Reregistering context (1/1): '5ffd2b27' org.apache.logging.log4j.core.LoggerContext@31190526 TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=StatusLogger' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=ContextSelector' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=Appenders,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=AsyncAppenders,name=*' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=AsyncLoggerRingBuffer' TRACE StatusLogger Unregistering but no MBeans found matching 'org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name=*,subtype=RingBuffer' DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27 DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27,component=StatusLogger DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27,component=ContextSelector DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27,component=Loggers,name= DEBUG StatusLogger Registering MBean org.apache.logging.log4j2:type=5ffd2b27,component=Appenders,name=console TRACE StatusLogger Using default SystemClock for timestamps. DEBUG StatusLogger org.apache.logging.log4j.core.util.SystemClock supports precise timestamps. TRACE StatusLogger Using DummyNanoClock for nanosecond timestamps. 
DEBUG StatusLogger Reconfiguration complete for context[name=5ffd2b27] at URI /Users/yangjie01/Tools/4.0/spark-4.0.0-SNAPSHOT-bin-3.3.6/conf/log4j2.properties (org.apache.logging.log4j.core.LoggerContext@31190526) with optional ClassLoader: null DEBUG StatusLogger Shutdown hook enabled. Registering a new one. ... ``` But when using log4j 2.23.0, no logs related to `StatusLogger` are printed. cc @dongjoon-hyun > Upgrade log4j2 to 2.23.0 > > > Key: SPARK-47194 > URL: https://issues.apache.org/jira/browse/SPARK-47194 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Yang Jie >Priority: Major > Labels: pull-request-available > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Created] (SPARK-47194) Upgrade log4j2 to 2.23.0
Yang Jie created SPARK-47194: Summary: Upgrade log4j2 to 2.23.0 Key: SPARK-47194 URL: https://issues.apache.org/jira/browse/SPARK-47194 Project: Spark Issue Type: Sub-task Components: Build Affects Versions: 4.0.0 Reporter: Yang Jie -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Resolved] (SPARK-47100) Upgrade netty to 4.1.107.Final and netty-tcnative to 2.0.62.Final
[ https://issues.apache.org/jira/browse/SPARK-47100?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ] Yang Jie resolved SPARK-47100. -- Fix Version/s: 4.0.0 Resolution: Fixed Issue resolved by pull request 45178 [https://github.com/apache/spark/pull/45178] > Upgrade netty to 4.1.107.Final and netty-tcnative to 2.0.62.Final > - > > Key: SPARK-47100 > URL: https://issues.apache.org/jira/browse/SPARK-47100 > Project: Spark > Issue Type: Sub-task > Components: Build >Affects Versions: 4.0.0 >Reporter: Dongjoon Hyun >Assignee: Dongjoon Hyun >Priority: Major > Labels: pull-request-available > Fix For: 4.0.0 > > -- This message was sent by Atlassian Jira (v8.20.10#820010) - To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-47089) Migrate mockito 4 to mockito5
[ https://issues.apache.org/jira/browse/SPARK-47089?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17818338#comment-17818338 ]

Yang Jie commented on SPARK-47089:
----------------------------------

Thanks [~panbingkun]

> Migrate mockito 4 to mockito5
> -----------------------------
>
>                 Key: SPARK-47089
>                 URL: https://issues.apache.org/jira/browse/SPARK-47089
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build, Tests
>    Affects Versions: 4.0.0
>            Reporter: Yang Jie
>            Priority: Major
>              Labels: pull-request-available
[jira] [Created] (SPARK-47089) Migrate mockito 4 to mockito5
Yang Jie created SPARK-47089:
--------------------------------

             Summary: Migrate mockito 4 to mockito5
                 Key: SPARK-47089
                 URL: https://issues.apache.org/jira/browse/SPARK-47089
             Project: Spark
          Issue Type: Improvement
          Components: Build, Tests
    Affects Versions: 4.0.0
            Reporter: Yang Jie
[jira] [Resolved] (SPARK-47084) Upgrade joda-time to 2.12.7
[ https://issues.apache.org/jira/browse/SPARK-47084?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yang Jie resolved SPARK-47084.
------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 45153
[https://github.com/apache/spark/pull/45153]

> Upgrade joda-time to 2.12.7
> ---------------------------
>
>                 Key: SPARK-47084
>                 URL: https://issues.apache.org/jira/browse/SPARK-47084
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 4.0.0
>            Reporter: BingKun Pan
>            Assignee: BingKun Pan
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
[jira] [Resolved] (SPARK-47073) Upgrade several Maven plugins to the latest versions
[ https://issues.apache.org/jira/browse/SPARK-47073?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yang Jie resolved SPARK-47073.
------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 45136
[https://github.com/apache/spark/pull/45136]

> Upgrade several Maven plugins to the latest versions
> ----------------------------------------------------
>
>                 Key: SPARK-47073
>                 URL: https://issues.apache.org/jira/browse/SPARK-47073
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
>
>  * {{versions-maven-plugin}} from 2.16.0 to 2.16.2.
>  * {{maven-enforcer-plugin}} from 3.3.0 to 3.4.1.
>  * {{maven-compiler-plugin}} from 3.11.0 to 3.12.1.
>  * {{maven-surefire-plugin}} from 3.1.2 to 3.2.5.
>  * {{maven-clean-plugin}} from 3.3.1 to 3.3.2.
>  * {{maven-javadoc-plugin}} from 3.5.0 to 3.6.3.
>  * {{maven-shade-plugin}} from 3.5.0 to 3.5.1.
>  * {{maven-dependency-plugin}} from 3.6.0 to 3.6.1.
>  * {{maven-checkstyle-plugin}} from 3.3.0 to 3.3.1.
[jira] [Assigned] (SPARK-47025) Switch `Guava 19.0` dependency scope from `provided` to `test`
[ https://issues.apache.org/jira/browse/SPARK-47025?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yang Jie reassigned SPARK-47025:
--------------------------------

    Assignee: Dongjoon Hyun

> Switch `Guava 19.0` dependency scope from `provided` to `test`
> --------------------------------------------------------------
>
>                 Key: SPARK-47025
>                 URL: https://issues.apache.org/jira/browse/SPARK-47025
>             Project: Spark
>          Issue Type: Test
>          Components: Build, SQL, Tests
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>              Labels: pull-request-available
[jira] [Resolved] (SPARK-47025) Switch `Guava 19.0` dependency scope from `provided` to `test`
[ https://issues.apache.org/jira/browse/SPARK-47025?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yang Jie resolved SPARK-47025.
------------------------------
    Fix Version/s: 4.0.0
       Resolution: Fixed

Issue resolved by pull request 45088
[https://github.com/apache/spark/pull/45088]

> Switch `Guava 19.0` dependency scope from `provided` to `test`
> --------------------------------------------------------------
>
>                 Key: SPARK-47025
>                 URL: https://issues.apache.org/jira/browse/SPARK-47025
>             Project: Spark
>          Issue Type: Test
>          Components: Build, SQL, Tests
>    Affects Versions: 4.0.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Minor
>              Labels: pull-request-available
>             Fix For: 4.0.0
[jira] [Commented] (SPARK-47016) Upgrade scalatest related dependencies to the 3.2.18 series
[ https://issues.apache.org/jira/browse/SPARK-47016?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17816030#comment-17816030 ]

Yang Jie commented on SPARK-47016:
----------------------------------

It seems that the `org.scalatestplus:mockito` artifact in the 3.2.18 series is only published for Mockito 5, so we may need to migrate to Mockito 5 first in order to upgrade this test dependency as a whole.

> Upgrade scalatest related dependencies to the 3.2.18 series
> -----------------------------------------------------------
>
>                 Key: SPARK-47016
>                 URL: https://issues.apache.org/jira/browse/SPARK-47016
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 4.0.0
>            Reporter: Yang Jie
>            Priority: Major
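The coupling described in the comment can be sketched as a build fragment. Note the scalatestplus artifacts encode the targeted Mockito line in the artifact id; the exact id for the 3.2.18 series (`mockito-5-10` here) is an assumption based on that naming scheme, not a verified coordinate:

```scala
// build.sbt sketch: upgrading scalatest to 3.2.18 pulls in a
// scalatestplus mockito bridge that only exists for Mockito 5,
// which is why the Mockito 4 -> 5 migration has to land first.
// Artifact name "mockito-5-10" is an assumption.
libraryDependencies ++= Seq(
  "org.scalatest"     %% "scalatest"    % "3.2.18"   % Test,
  "org.scalatestplus" %% "mockito-5-10" % "3.2.18.0" % Test
)
```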
[jira] [Created] (SPARK-47016) Upgrade scalatest related dependencies to the 3.2.18 series
Yang Jie created SPARK-47016:
--------------------------------

             Summary: Upgrade scalatest related dependencies to the 3.2.18 series
                 Key: SPARK-47016
                 URL: https://issues.apache.org/jira/browse/SPARK-47016
             Project: Spark
          Issue Type: Improvement
          Components: Build
    Affects Versions: 4.0.0
            Reporter: Yang Jie
[jira] [Updated] (SPARK-47006) Refactor refill() method to isExhausted() in NioBufferedFileInputStream
[ https://issues.apache.org/jira/browse/SPARK-47006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yang Jie updated SPARK-47006:
-----------------------------
    Description: 
Currently, in NioBufferedFileInputStream, the refill() method is always invoked in a negated context (!refill()), which can be confusing and counter-intuitive. We can refactor the method so that it's no longer necessary to invert the result of the method call.

> Refactor refill() method to isExhausted() in NioBufferedFileInputStream
> -----------------------------------------------------------------------
>
>                 Key: SPARK-47006
>                 URL: https://issues.apache.org/jira/browse/SPARK-47006
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 4.0.0
>            Reporter: Yang Jie
>            Priority: Minor
>
> Currently, in NioBufferedFileInputStream, the refill() method is always
> invoked in a negated context (!refill()), which can be confusing and
> counter-intuitive. We can refactor the method so that it's no longer
> necessary to invert the result of the method call.
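The proposed renaming can be sketched as follows. This is an illustrative in-memory buffered reader, not Spark's actual NioBufferedFileInputStream; the class and field names are assumptions for the sake of the example:

```java
import java.nio.ByteBuffer;

// Sketch of the SPARK-47006 idea: instead of a refill() whose result
// callers must negate (`if (!refill()) return -1;`), expose an
// isExhausted() check that reads positively at the call site.
public class ExhaustedCheckSketch {
    private final ByteBuffer byteBuffer;
    private final byte[] source; // stands in for the underlying file
    private int pos = 0;

    ExhaustedCheckSketch(byte[] source, int bufSize) {
        this.source = source;
        this.byteBuffer = ByteBuffer.allocate(bufSize);
        this.byteBuffer.flip(); // start with an empty (fully drained) buffer
    }

    // Refills the buffer if it is drained; returns true only when no
    // further data could be obtained, i.e. the stream is exhausted.
    private boolean isExhausted() {
        if (byteBuffer.hasRemaining()) {
            return false;
        }
        byteBuffer.clear();
        int n = Math.min(byteBuffer.remaining(), source.length - pos);
        byteBuffer.put(source, pos, n);
        pos += n;
        byteBuffer.flip();
        return !byteBuffer.hasRemaining();
    }

    public int read() {
        // Reads naturally, with no negation at the call site.
        if (isExhausted()) {
            return -1;
        }
        return byteBuffer.get() & 0xFF;
    }

    public static void main(String[] args) {
        ExhaustedCheckSketch in = new ExhaustedCheckSketch(new byte[]{1, 2, 3}, 2);
        assert in.read() == 1;
        assert in.read() == 2;
        assert in.read() == 3;
        assert in.read() == -1; // exhausted after three bytes
        System.out.println("ok");
    }
}
```

The behavior is unchanged; only the polarity of the helper's return value is flipped so that callers no longer invert it.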
[jira] [Created] (SPARK-47006) Refactor refill() method to isExhausted() in NioBufferedFileInputStream
Yang Jie created SPARK-47006:
--------------------------------

             Summary: Refactor refill() method to isExhausted() in NioBufferedFileInputStream
                 Key: SPARK-47006
                 URL: https://issues.apache.org/jira/browse/SPARK-47006
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 4.0.0
            Reporter: Yang Jie