[jira] [Commented] (SPARK-28324) The LOG function using 10 as the base, but Spark using E
[ https://issues.apache.org/jira/browse/SPARK-28324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17004653#comment-17004653 ]

Takeshi Yamamuro commented on SPARK-28324:
------------------------------------------

I'll close this based on the discussion above. Thanks, all.

> The LOG function using 10 as the base, but Spark using E
> --------------------------------------------------------
>
>                 Key: SPARK-28324
>                 URL: https://issues.apache.org/jira/browse/SPARK-28324
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> Spark SQL:
> {code:sql}
> spark-sql> select log(10);
> 2.302585092994046
> {code}
> PostgreSQL:
> {code:sql}
> postgres=# select log(10);
>  log
> -----
>    1
> (1 row)
> {code}

--
This message was sent by Atlassian Jira (v8.3.4#803005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Commented] (SPARK-28324) The LOG function using 10 as the base, but Spark using E
[ https://issues.apache.org/jira/browse/SPARK-28324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16884397#comment-16884397 ]

Marco Gaido commented on SPARK-28324:
-------------------------------------

+1 for [~srowen]'s opinion. I don't think it is a good idea to change the behavior here.
[jira] [Commented] (SPARK-28324) The LOG function using 10 as the base, but Spark using E
[ https://issues.apache.org/jira/browse/SPARK-28324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16882288#comment-16882288 ]

Sean Owen commented on SPARK-28324:
-----------------------------------

I don't think we should change this, as it will break code, and there isn't a 'standard' here AFAICT. As you say, Hive treats this as log base e, as do Java, Scala, etc. You can add log10() etc.
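The base-e default Sean Owen describes matches most general-purpose languages. As a quick illustration (not part of the thread, just a sketch of the behavior being discussed), Python's math module mirrors the split: math.log with one argument is the natural logarithm, like Spark SQL's LOG, while math.log10 is the base-10 logarithm, like PostgreSQL's LOG:

```python
import math

# math.log(x) defaults to the natural logarithm (base e),
# matching Spark SQL, Hive, Java, and Scala:
natural = math.log(10)    # ln(10) ~= 2.302585092994046

# math.log10(x) is the base-10 logarithm, matching what
# PostgreSQL returns for single-argument LOG(x):
base10 = math.log10(10)   # log10(10) = 1.0

print(natural, base10)
```

This is the same pair of results quoted in the issue description above: 2.302585092994046 from Spark and 1 from PostgreSQL, for the same input.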
[jira] [Commented] (SPARK-28324) The LOG function using 10 as the base, but Spark using E
[ https://issues.apache.org/jira/browse/SPARK-28324?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16881748#comment-16881748 ]

Yuming Wang commented on SPARK-28324:
-------------------------------------

PostgreSQL, Vertica, and Teradata use 10 as the base. DB2, SQL Server, Hive, and MySQL use E as the base.
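Given that engines disagree on the default base, portable code can avoid the ambiguity with the change-of-base identity, log_b(x) = ln(x) / ln(b). A minimal sketch in Python (the function name log_base is hypothetical, introduced only for this illustration):

```python
import math

def log_base(x: float, base: float) -> float:
    """Change-of-base identity: log_base(x) = ln(x) / ln(base)."""
    return math.log(x) / math.log(base)

print(log_base(10, 10))      # base 10 -> 1.0, what PostgreSQL's LOG(10) returns
print(log_base(10, math.e))  # base e  -> ~2.302585, what Spark's LOG(10) returns
```

Spelling the base out explicitly like this gives the same answer regardless of which default the underlying engine picked.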