[ https://issues.apache.org/jira/browse/SPARK-27297?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Takuya Ueshin reassigned SPARK-27297:
-------------------------------------

    Assignee: Nikolas Vanderhoof

> Add higher order functions to Scala API
> ---------------------------------------
>
>                 Key: SPARK-27297
>                 URL: https://issues.apache.org/jira/browse/SPARK-27297
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Nikolas Vanderhoof
>            Assignee: Nikolas Vanderhoof
>            Priority: Major
>
> There is currently no Scala API equivalent for the higher order functions introduced in Spark 2.4.0:
> * transform
> * aggregate
> * filter
> * exists
> * zip_with
> * map_zip_with
> * map_filter
> * transform_values
> * transform_keys
>
> Equivalent column-based functions should be added to the Scala API in org.apache.spark.sql.functions with the following signatures:
>
> {code:scala}
> def transform(column: Column, f: Column => Column): Column = ???
> def transform(column: Column, f: (Column, Column) => Column): Column = ???
> def exists(column: Column, f: Column => Column): Column = ???
> def filter(column: Column, f: Column => Column): Column = ???
> def aggregate(
>     expr: Column,
>     zero: Column,
>     merge: (Column, Column) => Column,
>     finish: Column => Column): Column = ???
> def aggregate(
>     expr: Column,
>     zero: Column,
>     merge: (Column, Column) => Column): Column = ???
> def zip_with(
>     left: Column,
>     right: Column,
>     f: (Column, Column) => Column): Column = ???
> def transform_keys(expr: Column, f: (Column, Column) => Column): Column = ???
> def transform_values(expr: Column, f: (Column, Column) => Column): Column = ???
> def map_filter(expr: Column, f: (Column, Column) => Column): Column = ???
> def map_zip_with(left: Column, right: Column, f: (Column, Column, Column) => Column): Column = ???
> {code}


--
This message was sent by Atlassian Jira
(v8.3.4#803005)
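The intended semantics of the proposed functions can be sketched on plain Scala collections. The object and method names below are illustrative analogues chosen for this sketch, not the actual Spark implementations, which operate on Catalyst expressions over `Column` values:

```scala
// Illustrative analogues of the proposed higher order functions,
// expressed on plain Scala collections rather than Spark Columns.
// These are sketches of the semantics only, not Spark code.
object HigherOrderSketch {
  // transform(array, f): apply f to each element
  def transform[A, B](xs: Seq[A], f: A => B): Seq[B] = xs.map(f)

  // exists(array, p): true if any element satisfies p
  def exists[A](xs: Seq[A], p: A => Boolean): Boolean = xs.exists(p)

  // filter(array, p): keep elements satisfying p
  def filter[A](xs: Seq[A], p: A => Boolean): Seq[A] = xs.filter(p)

  // aggregate(array, zero, merge, finish): fold with merge, then post-process with finish
  def aggregate[A, B, C](xs: Seq[A], zero: B, merge: (B, A) => B, finish: B => C): C =
    finish(xs.foldLeft(zero)(merge))

  // zip_with(left, right, f): combine two arrays element-wise
  def zipWith[A, B, C](left: Seq[A], right: Seq[B], f: (A, B) => C): Seq[C] =
    left.zip(right).map { case (a, b) => f(a, b) }

  // map_filter(map, p): keep entries whose (key, value) satisfy p
  def mapFilter[K, V](m: Map[K, V], p: (K, V) => Boolean): Map[K, V] =
    m.filter { case (k, v) => p(k, v) }

  // transform_values(map, f): rewrite each value as a function of (key, value)
  def transformValues[K, V, W](m: Map[K, V], f: (K, V) => W): Map[K, W] =
    m.map { case (k, v) => k -> f(k, v) }
}
```

For example, `aggregate(Seq(1, 2, 3), 0, _ + _, (a: Int) => a * 10)` folds to 6 and then applies the finisher, yielding 60; the Spark versions would express the same shape with `Column => Column` lambdas compiled into SQL lambda expressions.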