LuciferYang commented on code in PR #43796:
URL: https://github.com/apache/spark/pull/43796#discussion_r1777165654
##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/expressions/aggregate/Aggregation.java:
##########
@@ -28,16 +28,7 @@
* @since 3.2.0
*/
@Evolving
-public final class Aggregation implements Serializable {
- private final AggregateFunc[] aggregateExpressions;
- private final Expression[] groupByExpressions;
-
-  public Aggregation(AggregateFunc[] aggregateExpressions, Expression[] groupByExpressions) {
- this.aggregateExpressions = aggregateExpressions;
- this.groupByExpressions = groupByExpressions;
- }
-
-  public AggregateFunc[] aggregateExpressions() { return aggregateExpressions; }
-
- public Expression[] groupByExpressions() { return groupByExpressions; }
+public record Aggregation(
Review Comment:
I wrote the following test code against Spark 3.5.3 (with Scala 2.13):
```java
package org.apache.spark.sql.catalyst.expressions;

import org.apache.spark.sql.connector.expressions.Expression;
import org.apache.spark.sql.connector.expressions.aggregate.Aggregation;
import org.apache.spark.sql.connector.expressions.aggregate.CountStar;

import java.util.Arrays;

public class AggregationTest {
  public void testCompatibility() {
    CountStar star = new CountStar();
    Aggregation aggregation =
      new Aggregation(new CountStar[]{star}, Expression.EMPTY_EXPRESSION);
    System.out.println(Arrays.toString(aggregation.aggregateExpressions()));
    System.out.println(Arrays.toString(aggregation.groupByExpressions()));
    System.out.println(aggregation.hashCode());
  }
}
```
I compiled it with Java 8 (packaging this class into spark-catalyst_2.13-3.5.3-tests.jar and placing that jar in the spark-4.0.0-preview1-bin-hadoop3/jars directory), then invoked the code on 4.0.0-preview1:
```
bin/spark-shell --master local
WARNING: Using incubator modules: jdk.incubator.vector
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 4.0.0-preview1
/_/
Using Scala version 2.13.14 (OpenJDK 64-Bit Server VM, Java 17.0.12)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context Web UI available at http://192.168.43.26:4040
Spark context available as 'sc' (master = local, app id =
local-1727359585773).
Spark session available as 'spark'.
scala> val aggTest = new
org.apache.spark.sql.catalyst.expressions.AggregationTest()
val aggTest: org.apache.spark.sql.catalyst.expressions.AggregationTest =
org.apache.spark.sql.catalyst.expressions.AggregationTest@7689b31
scala> aggTest.testCompatibility
[COUNT(*)]
[]
-2108932933
```
The old code, compiled against 3.5.3, runs on 4.0 without recompilation, so I believe this change is binary compatible.
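For context, here is a minimal sketch (using hypothetical `String[]` stand-ins rather than the real `AggregateFunc`/`Expression` types) of why the class-to-record conversion can stay binary compatible: a record's canonical constructor and component accessors have the same names and method descriptors as the hand-written constructor and `foo()`-style accessors they replace, so old call sites still link.

```java
// Minimal sketch, NOT Spark code: the record below auto-generates
// aggregateExpressions() and groupByExpressions() accessors with the same
// signatures as the hand-written methods in the old final class, which is
// why callers compiled against the old class keep working.
public class RecordCompat {
  // Hypothetical stand-in for the Spark Aggregation class/record.
  record Aggregation(String[] aggregateExpressions, String[] groupByExpressions) {}

  public static void main(String[] args) {
    Aggregation agg = new Aggregation(new String[]{"COUNT(*)"}, new String[0]);
    // Generated accessors match the old method names and descriptors:
    System.out.println(java.util.Arrays.toString(agg.aggregateExpressions()));
    System.out.println(java.util.Arrays.toString(agg.groupByExpressions()));
  }
}
```

Note this only covers the constructor and accessor methods; the superclass does change from `java.lang.Object` to `java.lang.Record`, which is the kind of difference MiMa would normally surface if it mattered.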
Additionally, this PR did not modify the MiMa exclude file; if there were binary compatibility issues, the MiMa check should have reported an error, right?
@cloud-fan Did you encounter any compatibility issues? Do you have any more detailed information? Thanks ~
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]