[ https://issues.apache.org/jira/browse/SPARK-20581?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15997832#comment-15997832 ]
Xiao Li commented on SPARK-20581:
---------------------------------

Can you try the latest version? It works correctly in 2.2.

> Using AVG or SUM on an INT/BIGINT column with the fraction operator yields BIGINT instead of DOUBLE
> ---------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-20581
>                 URL: https://issues.apache.org/jira/browse/SPARK-20581
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.2
>            Reporter: Dominic Ricard
>
> We have stumbled on this multiple times, and every time we are baffled by the behavior of AVG and SUM.
> Given the following SQL (executed through Thrift):
> {noformat}
> SELECT SUM(col/2) FROM
> (SELECT 3 as `col`) t
> {noformat}
> The result is 1, when the expected and accurate result is 1.5.
> Here's the explain plan:
> {noformat}
> == Physical Plan ==
> TungstenAggregate(key=[], functions=[(sum(cast((cast(col#1519342 as double) / 2.0) as bigint)),mode=Final,isDistinct=false)], output=[_c0#1519344L])
> +- TungstenExchange SinglePartition, None
>    +- TungstenAggregate(key=[], functions=[(sum(cast((cast(col#1519342 as double) / 2.0) as bigint)),mode=Partial,isDistinct=false)], output=[sum#1519347L])
>       +- Project [3 AS col#1519342]
>          +- Scan OneRowRelation[]
> {noformat}
> Why the extra cast to BIGINT?

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
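The effect of the extra {{cast ... as bigint}} in the plan above can be reproduced outside Spark. This is a minimal Python sketch (not Spark code; the list stands in for the single-row relation) contrasting the evaluation order Spark 1.6 uses with the one the query implies:

```python
# The one-row relation from `SELECT 3 as col`
rows = [3]

# What the 1.6 plan does: divide as DOUBLE, then cast the per-row
# result back to BIGINT before it reaches SUM -- truncating 1.5 to 1.
buggy = sum(int(col / 2.0) for col in rows)

# What the query implies: keep the DOUBLE result through the aggregation.
expected = sum(col / 2.0 for col in rows)

print(buggy, expected)  # 1 1.5
```

The truncation happens per row, before aggregation, which is why no rounding at the end can recover the lost fraction.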