[jira] [Commented] (SPARK-28322) DIV support decimal type
[ https://issues.apache.org/jira/browse/SPARK-28322?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16881798#comment-16881798 ]

Marco Gaido commented on SPARK-28322:
-------------------------------------

Thanks for pinging me [~yumwang], I'll work on this over the weekend. Thanks!

> DIV support decimal type
> ------------------------
>
>                 Key: SPARK-28322
>                 URL: https://issues.apache.org/jira/browse/SPARK-28322
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> Spark SQL:
> {code:sql}
> spark-sql> SELECT DIV(CAST(10 AS DECIMAL), CAST(3 AS DECIMAL));
> Error in query: cannot resolve '(CAST(10 AS DECIMAL(10,0)) div CAST(3 AS DECIMAL(10,0)))' due to data type mismatch: '(CAST(10 AS DECIMAL(10,0)) div CAST(3 AS DECIMAL(10,0)))' requires integral type, not decimal(10,0); line 1 pos 7;
> 'Project [unresolvedalias((cast(10 as decimal(10,0)) div cast(3 as decimal(10,0))), None)]
> +- OneRowRelation
> {code}
> PostgreSQL:
> {code:sql}
> postgres=# SELECT DIV(CAST(10 AS DECIMAL), CAST(3 AS DECIMAL));
>  div
> -----
>    3
> (1 row)
> {code}

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
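[Editor's note, not part of the ticket] The PostgreSQL behavior the ticket asks Spark to match can be sketched with Python's `decimal` module: `div()` returns the integer quotient of two numerics, truncated toward zero, which matches `Decimal`'s `//` operator.

```python
from decimal import Decimal

# PostgreSQL's div() on numerics returns the integer quotient,
# truncated toward zero. Decimal's '//' has the same semantics:
print(Decimal("10") // Decimal("3"))   # 3, as in the PostgreSQL example above
print(Decimal("-10") // Decimal("3"))  # -3: truncation toward zero, not floor (-4)
```

Note that this differs from Python's integer `//`, which floors: `-10 // 3 == -4`.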
[jira] [Commented] (SPARK-28322) DIV support decimal type
[ https://issues.apache.org/jira/browse/SPARK-28322?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16881694#comment-16881694 ]

Yuming Wang commented on SPARK-28322:
-------------------------------------

{{DIV}} and {{/}} are a little different:
{code:sql}
select 12345678901234567890 / 123;
      ?column?
--------------------
 100371373180768845
(1 row)

select div(12345678901234567890, 123);
        div
--------------------
 100371373180768844
(1 row)
{code}
[https://github.com/postgres/postgres/blob/REL_12_BETA2/src/test/regress/expected/numeric.out#L1564-L1574]
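[Editor's note, not part of the ticket] The one-unit difference above comes from rounding vs. truncation: with scale-0 operands PostgreSQL's `/` rounds the quotient to the result scale, while `div()` truncates toward zero. Python's `decimal` module can reproduce both results (an illustration of the semantics, not Spark or PostgreSQL code):

```python
from decimal import Decimal, ROUND_HALF_UP

a = Decimal("12345678901234567890")
b = Decimal("123")

# Exact quotient is 100371373180768844.634..., so the two operators disagree:
rounded = (a / b).quantize(Decimal(1), rounding=ROUND_HALF_UP)  # like numeric '/'
truncated = a // b  # Decimal '//' truncates toward zero, like div()

print(rounded)    # 100371373180768845
print(truncated)  # 100371373180768844
```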
[jira] [Commented] (SPARK-28322) DIV support decimal type
[ https://issues.apache.org/jira/browse/SPARK-28322?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16881691#comment-16881691 ]

Yuming Wang commented on SPARK-28322:
-------------------------------------

cc [~mgaido]