slfan1989 commented on code in PR #1587:
URL: https://github.com/apache/auron/pull/1587#discussion_r2484234531
##########
native-engine/auron-serde/proto/auron.proto:
##########
@@ -267,6 +267,8 @@ enum ScalarFunction {
Factorial=65;
Hex=66;
Power=67;
+
Review Comment:
Avoid blank lines.
##########
spark-extension-shims-spark/src/test/scala/org/apache/spark/sql/auron/AuronFunctionSuite.scala:
##########
@@ -319,4 +319,21 @@ class AuronFunctionSuite
val row = df.collect().head
assert(row.isNullAt(0) && row.isNullAt(1) && row.isNullAt(2))
}
+
+ test("test function make_date") {
+ withTable("t1") {
+ sql(
+ "create table t1 using parquet as select '2025'" +
+ " as year, '03' as month, '01' as day")
+ val functions =
+ """
+ |select
+ | make_date(year, month, day)
+ |from t1
+ """.stripMargin
+
+ val df = sql(functions)
+ checkAnswer(df, Seq(Row("2025-03-01")))
Review Comment:
Spark’s `make_date(year, month, day)` returns a `DateType` value (invalid inputs
return `NULL` when ANSI mode is disabled). The current test uses `checkAnswer(df,
Seq(Row("2025-03-01")))` to assert a `String` literal; a more robust approach
is to assert the typed value, e.g. expect `Date.valueOf("2025-03-01")`, or at
least verify `df.schema.fields(0).dataType == DateType`.
https://spark.apache.org/docs/latest/api/sql/index.html#make_date
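To make the suggestion concrete, here is a hedged sketch (plain JVM Scala, not the suite itself; the `Row`/`checkAnswer` lines are shown only as comments since they require the Spark test harness):

```scala
import java.sql.Date

// `Date.valueOf` is the java.sql.Date factory that Spark maps DateType
// values to in Row results, so the suite's assertion could become:
//   checkAnswer(df, Seq(Row(Date.valueOf("2025-03-01"))))
//   assert(df.schema.fields(0).dataType == DateType)
val expected = Date.valueOf("2025-03-01")
println(expected) // prints 2025-03-01
```

Asserting the typed value catches a regression where the native engine returns a string instead of a date, which a string-literal comparison can silently accept.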
##########
native-engine/auron-serde/src/from_proto.rs:
##########
@@ -786,6 +786,7 @@ impl From<protobuf::ScalarFunction> for Arc<ScalarUDF> {
ScalarFunction::NullIf => f::core::nullif(),
ScalarFunction::DatePart => f::datetime::date_part(),
ScalarFunction::DateTrunc => f::datetime::date_trunc(),
+
Review Comment:
Avoid blank lines.
##########
spark-extension-shims-spark/src/test/scala/org/apache/spark/sql/auron/AuronFunctionSuite.scala:
##########
@@ -319,4 +319,21 @@ class AuronFunctionSuite
val row = df.collect().head
assert(row.isNullAt(0) && row.isNullAt(1) && row.isNullAt(2))
}
+
+ test("test function make_date") {
+ withTable("t1") {
+ sql(
+ "create table t1 using parquet as select '2025'" +
+ " as year, '03' as month, '01' as day")
+ val functions =
+ """
+ |select
+ | make_date(year, month, day)
+ |from t1
+ """.stripMargin
+
+ val df = sql(functions)
+ checkAnswer(df, Seq(Row("2025-03-01")))
Review Comment:
Cover invalid inputs and NULL semantics (suggest adding test cases).
Additional cases: `make_date(2019, 13, 1) -> NULL`, `make_date(2019, 2, 30)
-> NULL`, and any argument being `NULL` -> `NULL`. These are consistent with
the official Spark and Databricks documentation.
##########
spark-extension-shims-spark/src/test/scala/org/apache/spark/sql/auron/AuronFunctionSuite.scala:
##########
@@ -319,4 +319,21 @@ class AuronFunctionSuite
val row = df.collect().head
assert(row.isNullAt(0) && row.isNullAt(1) && row.isNullAt(2))
}
+
+ test("test function make_date") {
+ withTable("t1") {
+ sql(
+ "create table t1 using parquet as select '2025'" +
+ " as year, '03' as month, '01' as day")
+ val functions =
+ """
+ |select
+ | make_date(year, month, day)
+ |from t1
+ """.stripMargin
+
+ val df = sql(functions)
+ checkAnswer(df, Seq(Row("2025-03-01")))
Review Comment:
Nit:
Avoid using aliases that have the same names as Spark functions. It’s better
to write:
```
select '2025' as `year`, '03' as `month`, '01' as `day`
```
instead of:
```
select '2025' as year, '03' as month, '01' as day
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]