zabetak commented on code in PR #3733:
URL: https://github.com/apache/calcite/pull/3733#discussion_r1602910060
##########
druid/src/test/java/org/apache/calcite/test/DruidAdapter2IT.java:
##########
@@ -1912,7 +1912,11 @@ private void checkGroupBySingleSortLimit(boolean approx) {
         + "'expression':'case_searched((\\'$f3\\' == 0),1.0,CAST(\\'$f3\\'";
     final String plan = "PLAN="
         + "EnumerableInterpreter\n"
-        + " DruidQuery(table=[[foodmart, foodmart]], intervals=[[1900-01-09T00:00:00.000Z/2992-01-10T00:00:00.000Z]], projects=[[$63, $90, $91, $89]], groups=[{0}], aggs=[[SUM($1), SUM($2), SUM($3)]], post_projects=[[$0, /($1, $2), CASE(=($3, 0), 1.0:DECIMAL(19, 0), CAST($3):DECIMAL(19, 0))]], sort0=[1], dir0=[DESC])\n";
+        + " DruidQuery(table=[[foodmart, foodmart]],"
+        + " intervals=[[1900-01-09T00:00:00.000Z/2992-01-10T00:00:00.000Z]],"
+        + " projects=[[$63, $90, $91, $89]], groups=[{0}], aggs=[[SUM($1), SUM($2), SUM($3)]],"
+        + " post_projects=[[$0, /($1, $2), CASE(=($3, 0), 1.0:DECIMAL(19, 0),"
+        + " CAST($3):DECIMAL(19, 0))]], sort0=[1], dir0=[DESC])\n";
Review Comment:
    Apart from formatting, is there anything else that changed? If not,
please restore the original formatting to avoid polluting the git history
with unnecessary changes.
##########
druid/src/test/java/org/apache/calcite/test/DruidAdapter2IT.java:
##########
@@ -1909,10 +1909,14 @@ private void checkGroupBySingleSortLimit(boolean approx) {
         + "end as b from \"foodmart\" group by \"store_state\" order by a desc";
     final String postAggString =
         "'postAggregations':[{'type':'expression','name':'A',"
         + "'expression':'(\\'$f1\\' / \\'$f2\\')'},{'type':'expression','name':'B',"
-        + "'expression':'case_searched((\\'$f3\\' == 0),1.0,CAST(\\'$f3\\'";
+        + "'expression':'case_searched((\\'$f3\\' == 0),1,CAST(\\'$f3\\'";
     final String plan = "PLAN="
         + "EnumerableInterpreter\n"
-        + " DruidQuery(table=[[foodmart, foodmart]], intervals=[[1900-01-09T00:00:00.000Z/2992-01-10T00:00:00.000Z]], projects=[[$63, $90, $91, $89]], groups=[{0}], aggs=[[SUM($1), SUM($2), SUM($3)]], post_projects=[[$0, /($1, $2), CASE(=($3, 0), 1.0:DECIMAL(19, 0), CAST($3):DECIMAL(19, 0))]], sort0=[1], dir0=[DESC])\n";
+        + " DruidQuery(table=[[foodmart, foodmart]],"
+        + " intervals=[[1900-01-09T00:00:00.000Z/2992-01-10T00:00:00.000Z]],"
+        + " projects=[[$63, $90, $91, $89]], groups=[{0}], aggs=[[SUM($1), SUM($2), SUM($3)]],"
+        + " post_projects=[[$0, /($1, $2), CASE(=($3, 0), CAST(1:DECIMAL(19, 0)):DECIMAL(19, 0),"
Review Comment:
    I suppose this was addressed by other PRs? If that's the case, you can
mark this discussion as resolved.
##########
babel/src/test/resources/sql/redshift.iq:
##########
@@ -523,78 +523,78 @@ select deptno, ratio_to_report(sal) over (partition by deptno) from emp;
 !}
 # STDDEV_POP
-select empno, stddev_pop(comm) over (order by empno rows unbounded preceding) from emp where deptno = 30 order by 1;
+select empno, stddev_pop(CAST(comm AS DECIMAL(12, 4))) over (order by empno rows unbounded preceding) from emp where deptno = 30 order by 1;
Review Comment:
    Requiring an explicit CAST in order to make the query work is not user
friendly, and I would be a bit surprised if other databases behave the same
way. Since there is no documentation about the types expected by aggregate
functions, it is not completely accurate to say that the old result was wrong.
    How about relaxing the precision-overflow check until a more complete
solution is found? In fact, I recall that there are some nuances across
engines in what they consider "loss of precision" and "significant digits".
    I am a bit worried that this kind of breaking change (requiring a CAST)
will make some users unhappy. Nevertheless, I don't feel strongly about it,
so I am leaving the final decision to you. Feel free to advance this as you
see fit.
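    The precision concern above can be sketched in plain Java (this is an
illustration of the general rewrite, not Calcite's actual implementation,
and the sample values are hypothetical): STDDEV_POP is commonly computed as
sqrt(sum(x*x)/n - (sum(x)/n)^2), and squaring a DECIMAL(p, s) value can
require up to DECIMAL(2p, 2s), which a strict precision-overflow check may
reject unless the user first CASTs the column to a wider type.

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class StddevPopSketch {
  /** Population standard deviation via the sum-of-squares rewrite. */
  static double stddevPop(BigDecimal[] xs) {
    BigDecimal n = BigDecimal.valueOf(xs.length);
    BigDecimal sum = BigDecimal.ZERO;
    BigDecimal sumSq = BigDecimal.ZERO;
    for (BigDecimal x : xs) {
      sum = sum.add(x);
      // The scale (and potentially the precision) doubles here:
      // DECIMAL(p, 2) inputs produce DECIMAL(2p, 4) intermediates.
      sumSq = sumSq.add(x.multiply(x));
    }
    MathContext mc = new MathContext(20);
    BigDecimal mean = sum.divide(n, mc);
    BigDecimal variance = sumSq.divide(n, mc).subtract(mean.multiply(mean));
    return Math.sqrt(variance.doubleValue());
  }

  public static void main(String[] args) {
    // Illustrative commission values (not tied to any particular dataset).
    BigDecimal[] comm = {
        new BigDecimal("300.00"),
        new BigDecimal("500.00"),
        new BigDecimal("1400.00"),
    };
    System.out.printf("stddev_pop = %.4f%n", stddevPop(comm));
  }
}
```

    The intermediate sumSq is where a fixed-precision engine hits the
overflow check that the comment suggests relaxing.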
##########
testkit/src/main/java/org/apache/calcite/test/SqlOperatorTest.java:
##########
@@ -1725,13 +1746,13 @@ void testCastToBoolean(CastType castType, SqlOperatorFixture f) {
f.checkString("case 1 when 1 then cast('a' as varchar(1)) "
+ "when 2 then cast('bcd' as varchar(3)) end",
"a", "VARCHAR(3)");
+ f.checkScalarExact("case 2 when 1 then 11.2 "
+ + "when 2 then 4.543 else null end",
+ "DECIMAL(5, 3)", "4.543");
if (DECIMAL) {
- f.checkScalarExact("case 2 when 1 then 11.2 "
- + "when 2 then 4.543 else null end",
- "DECIMAL(5, 3)", "4.543");
f.checkScalarExact("case 1 when 1 then 11.2 "
+ "when 2 then 4.543 else null end",
- "DECIMAL(5, 3)", "11.200");
+ "DECIMAL(5, 3)", "11.2");
Review Comment:
    This test is still under the DECIMAL flag. Is it not fixed?
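    The type expectation in these assertions can be illustrated with the
usual SQL rule for unifying CASE branch types (a sketch of the standard
rule, not Calcite's actual implementation): 11.2 is DECIMAL(3, 1) and
4.543 is DECIMAL(4, 3), so the common type takes the larger scale (3) plus
enough integer digits for both operands (2), giving DECIMAL(5, 3).

```java
import java.math.BigDecimal;

public class CaseDecimalType {
  /**
   * Common {precision, scale} of two decimal literals: max scale plus
   * max count of integer digits (sketch of the SQL unification rule).
   */
  static int[] commonType(BigDecimal a, BigDecimal b) {
    int scale = Math.max(a.scale(), b.scale());
    int intDigits = Math.max(a.precision() - a.scale(),
        b.precision() - b.scale());
    return new int[] {intDigits + scale, scale};
  }

  public static void main(String[] args) {
    BigDecimal a = new BigDecimal("11.2");   // precision 3, scale 1
    BigDecimal b = new BigDecimal("4.543");  // precision 4, scale 3
    int[] t = commonType(a, b);
    System.out.println("DECIMAL(" + t[0] + ", " + t[1] + ")"); // DECIMAL(5, 3)
    // Widening 11.2 to scale 3 renders it as 11.200, which is why the
    // expected string changes between "11.200" and "11.2" in the diff.
    System.out.println(a.setScale(3)); // 11.200
  }
}
```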
##########
testkit/src/main/java/org/apache/calcite/test/SqlOperatorTest.java:
##########
@@ -11861,8 +11874,8 @@ private static void checkArrayConcatAggFuncFails(SqlOperatorFixture t) {
f.checkScalarExact("ceil(cast(3 as integer))", "DOUBLE NOT NULL", "3.0");
f.checkScalarExact("ceil(cast(3 as bigint))", "DOUBLE NOT NULL", "3.0");
f.checkScalarExact("ceil(cast(3.5 as double))", "DOUBLE NOT NULL", "4.0");
- f.checkScalarExact("ceil(cast(3.45 as decimal))",
- "DECIMAL(19, 0) NOT NULL", "4");
+ f.checkScalarExact("ceil(cast(3.45 as decimal(19, 1)))",
+ "DECIMAL(19, 1) NOT NULL", "4");
Review Comment:
Why do we change the default type (`DECIMAL(19,0)`)?
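    The question matters because in Calcite an unparameterized DECIMAL
defaults to DECIMAL(19, 0), so the CAST discards the fraction before CEIL
runs, whereas DECIMAL(19, 1) keeps a fractional digit for CEIL to round up.
A sketch of that difference in plain Java, assuming half-up rounding for
the CAST (not Calcite code):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class CeilDecimalDefault {
  /** Cast to the target scale (assumed half-up), then apply CEIL. */
  static BigDecimal castThenCeil(BigDecimal v, int scale) {
    BigDecimal cast = v.setScale(scale, RoundingMode.HALF_UP);
    return cast.setScale(0, RoundingMode.CEILING);
  }

  public static void main(String[] args) {
    BigDecimal v = new BigDecimal("3.45");
    // DECIMAL(19, 0): the cast alone yields 3, so CEIL returns 3.
    System.out.println(castThenCeil(v, 0)); // 3
    // DECIMAL(19, 1): the cast yields 3.5, so CEIL returns 4.
    System.out.println(castThenCeil(v, 1)); // 4
  }
}
```

    Under these semantics the ceiling of the value depends on the scale
chosen in the CAST, which is presumably what the type change in the test
is probing.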
##########
druid/src/test/java/org/apache/calcite/test/DruidAdapterIT.java:
##########
@@ -2220,7 +2220,8 @@ private void checkGroupBySingleSortLimit(boolean approx) {
         "DruidQuery(table=[[foodmart, foodmart]], intervals=[[1900-01-09T00:00:00.000Z/"
             + "2992-01-10T00:00:00.000Z]], projects=[[$63, $90, $91, $89]], groups=[{0}], "
             + "aggs=[[SUM($1), SUM($2), SUM($3)]], post_projects=[[$0, /($1, $2), "
-            + "CASE(=($3, 0), 1.0:DECIMAL(19, 0), CAST($3):DECIMAL(19, 0))]], sort0=[1], dir0=[DESC])";
+            + "CASE(=($3, 0), 1.0:DECIMAL(19, 0), "
+            + "CAST($3):DECIMAL(19, 0))]], sort0=[1], dir0=[DESC])";
Review Comment:
    If it is just formatting, please revert.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]