HyukjinKwon commented on a change in pull request #22227: [SPARK-25202] [SQL]
Implements split with limit sql function
URL: https://github.com/apache/spark/pull/22227#discussion_r386185496
##########
File path: sql/core/src/main/scala/org/apache/spark/sql/functions.scala
##########
@@ -2546,15 +2546,39 @@ object functions {
def soundex(e: Column): Column = withExpr { SoundEx(e.expr) }
/**
- * Splits str around pattern (pattern is a regular expression).
+ * Splits str around matches of the given regex.
*
- * @note Pattern is a string representation of the regular expression.
+ * @param str a string expression to split
+ * @param regex a string representing a regular expression. The regex string should be
+ * a Java regular expression.
*
* @group string_funcs
* @since 1.5.0
*/
- def split(str: Column, pattern: String): Column = withExpr {
- StringSplit(str.expr, lit(pattern).expr)
+ def split(str: Column, regex: String): Column = withExpr {
Review comment:
Okay, but for the record, such changes have already been made, not only on the
SQL side but also in SS, if I remember correctly, because users are expected to
edit their source anyway when they compile against Spark 3.0, and it doesn't
break existing compiled apps. I am not sure why this one is special, but sure,
it's easy to keep the compatibility with a minimal change.
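As a minimal sketch of what this PR is about (not the actual Spark implementation): the new `split(str, regex, limit)` overload follows Java's `String.split(regex, limit)` semantics, where a positive limit caps the number of resulting parts and a non-positive limit means no cap. The object and method names below are illustrative only.

```scala
// Hypothetical illustration of split-with-limit semantics, assuming the
// overload delegates to Java's regex split behavior (as SPARK-25202 discusses).
object SplitLimitDemo {
  // limit > 0: at most `limit` parts, the last keeps the remainder;
  // limit <= 0: split as many times as possible (Java semantics).
  def splitWithLimit(str: String, regex: String, limit: Int): Array[String] =
    str.split(regex, limit)

  def main(args: Array[String]): Unit = {
    println(splitWithLimit("a,b,c,d", ",", 2).mkString("|"))  // a|b,c,d
    println(splitWithLimit("a,b,c,d", ",", -1).mkString("|")) // a|b|c|d
  }
}
```

The existing two-argument `split(str, regex)` can then simply call the three-argument form with a non-positive limit, which is why the change need not break existing compiled apps.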
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
With regards,
Apache Git Services
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]