Hi,

As you know, there is no string "replace" function in pyspark.sql.functions for PySpark, nor in org.apache.spark.sql.functions for Scala/Java, and I was wondering why that is. I am aware of the alternatives: regexp_replace, na.replace, and SQL via expr.
I think it's one of the fundamental functions in a user's/developer's toolset and is available in almost every language. It takes time for new Spark developers to realise it's not there and to switch to the alternatives, so I think it would be nice to have one. I already have a prototype for Scala (just sugar over regexp_replace) and it works like a charm :) A rough sketch of the idea is included below. I'd like to know your opinion on whether a contribution would be welcome or whether it isn't needed.

Thanks,
Khalid
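
P.S. For illustration only, a minimal sketch of what such sugar could look like, assuming it is exposed as a standalone helper named "replace" (the name and placement are hypothetical, not my actual prototype). The search string and replacement are escaped so they are treated literally rather than as a regex:

    import java.util.regex.{Matcher, Pattern}
    import org.apache.spark.sql.Column
    import org.apache.spark.sql.functions.regexp_replace

    // Literal (non-regex) string replace, built as sugar over regexp_replace.
    // Pattern.quote escapes regex metacharacters in the search string;
    // Matcher.quoteReplacement escapes '\' and '$' in the replacement.
    def replace(col: Column, search: String, replacement: String): Column =
      regexp_replace(col, Pattern.quote(search), Matcher.quoteReplacement(replacement))

    // Example usage (hypothetical column names):
    //   df.withColumn("clean_price", replace(df("raw_price"), "$", ""))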