[ https://issues.apache.org/jira/browse/SPARK-27425?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chaerim Yeo updated SPARK-27425:
--------------------------------
    Description: 
Add an aggregation function that returns the number of records satisfying a given 
condition.

Presto already provides a 
[{{count_if}}|https://prestodb.github.io/docs/current/functions/aggregate.html] 
aggregate function, which lets such counts be written concisely.
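For example (using a hypothetical {{events}} table and {{status}} column, purely to illustrate the syntax):
{code:sql}
-- Presto: count only the rows where the condition is true
SELECT count_if(status = 'FAILED') AS failed_count
FROM events;
{code}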

However, Spark does not support it yet, so we have to write something like 
{{count(case when some_condition then 1 end)}} or 
{{sum(case when some_condition then 1 end)}}, which is painful.
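A minimal sketch of the current workaround next to the proposed form (same hypothetical table as above; the {{count_if}} call assumes the function gets added to Spark):
{code:sql}
-- Current Spark workaround: rows that fail the condition map to NULL,
-- which count() ignores
SELECT count(CASE WHEN status = 'FAILED' THEN 1 END) AS failed_count
FROM events;

-- Proposed: the same aggregation written with count_if
SELECT count_if(status = 'FAILED') AS failed_count
FROM events;
{code}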

  was:
Add aggregation function which returns the number of records satisfying a given 
condition.

For Presto, 
[{{count_if}}|https://prestodb.github.io/docs/current/functions/aggregate.html] 
function is supported, we can write concisely.

However, Spark does not support yet, we need to write like {{count(case when 
some_condition then 1)}} or {{sum(case when some_condition then 1 end)}}, which 
looks painful.


> Add count_if functions
> ----------------------
>
>                 Key: SPARK-27425
>                 URL: https://issues.apache.org/jira/browse/SPARK-27425
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.1
>            Reporter: Chaerim Yeo
>            Priority: Minor
>
> Add an aggregation function that returns the number of records satisfying a 
> given condition.
> Presto already provides a 
> [{{count_if}}|https://prestodb.github.io/docs/current/functions/aggregate.html] 
> aggregate function, which lets such counts be written concisely.
> However, Spark does not support it yet, so we have to write something like 
> {{count(case when some_condition then 1 end)}} or 
> {{sum(case when some_condition then 1 end)}}, which is painful.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
