[ https://issues.apache.org/jira/browse/SPARK-39832?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-39832:
------------------------------------

    Assignee: Apache Spark

> regexp_replace should support column arguments
> ----------------------------------------------
>
>                 Key: SPARK-39832
>                 URL: https://issues.apache.org/jira/browse/SPARK-39832
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 3.3.0
>            Reporter: Brian Schaefer
>            Assignee: Apache Spark
>            Priority: Major
>              Labels: starter
>
> {{F.regexp_replace}} in PySpark currently supports only strings for the
> second and third arguments:
> [https://github.com/apache/spark/blob/1df6006ea977ae3b8c53fe33630e277e8c1bc49c/python/pyspark/sql/functions.py#L3265]
> In Scala, columns are also supported:
> [https://github.com/apache/spark/blob/1df6006ea977ae3b8c53fe33630e277e8c1bc49c/sql/core/src/main/scala/org/apache/spark/sql/functions.scala#L2836]
> The desire to use columns as arguments for this function has been raised
> previously on Stack Overflow:
> [https://stackoverflow.com/questions/64613761/in-pyspark-using-regexp-replace-how-to-replace-a-group-with-value-from-another]
> where the suggested fix was to use {{F.expr}}.
> It should be relatively straightforward to support in PySpark the two
> function signatures supported in Scala.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
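The semantics the issue asks for can be illustrated without a Spark cluster: a minimal pure-Python sketch (the helper name `regexp_replace_rowwise` is hypothetical, not a Spark API) of what `regexp_replace` with column-valued second and third arguments would compute, i.e. the pattern and replacement may differ per row instead of being fixed string literals.

```python
import re

def regexp_replace_rowwise(rows):
    """Mimic regexp_replace(value_col, pattern_col, replacement_col):
    each row carries its own (value, pattern, replacement) triple,
    so the regex and the substitution can vary row by row."""
    return [re.sub(pattern, replacement, value)
            for value, pattern, replacement in rows]

rows = [
    ("abc-123", r"\d+", "NUM"),      # replace digits in this row
    ("abc-123", r"[a-z]+", "WORD"),  # replace letters in this row
]
print(regexp_replace_rowwise(rows))  # ['abc-NUM', 'WORD-123']
```

Until the signature is extended, the workaround mentioned in the issue is to drop down to SQL via `F.expr`, e.g. something like `F.expr("regexp_replace(value, pattern, replacement)")` with the three column names spelled out in the expression string (column names here are illustrative).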