panbingkun commented on code in PR #41561:
URL: https://github.com/apache/spark/pull/41561#discussion_r1230879572


##########
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/functions.scala:
##########
@@ -3495,6 +3495,151 @@ object functions {
    */
   def to_number(e: Column, format: Column): Column = Column.fn("to_number", e, format)
 
+  /**
+   * Returns the ASCII character having the binary equivalent to `n`. If n is larger than 256 the
+   * result is equivalent to char(n % 256)
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  def char(n: Column): Column = Column.fn("char", n)
+
+  /**
+   * Removes the leading and trailing space characters from `str`.
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  def btrim(str: Column): Column = Column.fn("btrim", str)
+
+  /**
+   * Remove the leading and trailing `trim` characters from `str`.
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  def btrim(str: Column, trim: Column): Column = Column.fn("btrim", str, trim)
+
+  /**
+   * Returns the character length of string data or number of bytes of binary data. The length of
+   * string data includes the trailing spaces. The length of binary data includes binary zeros.
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  def char_length(str: Column): Column = Column.fn("char_length", str)
+
+  /**
+   * Returns the character length of string data or number of bytes of binary data. The length of
+   * string data includes the trailing spaces. The length of binary data includes binary zeros.
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  def character_length(str: Column): Column = Column.fn("character_length", str)
+
+  /**
+   * Returns the ASCII character having the binary equivalent to `n`. If n is larger than 256 the
+   * result is equivalent to chr(n % 256)
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  def chr(n: Column): Column = Column.fn("chr", n)
+
+  /**
+   * Returns a boolean. The value is True if right is found inside left. Returns NULL if either
+   * input expression is NULL. Otherwise, returns False. Both left or right must be of STRING or
+   * BINARY type.
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  def contains(left: Column, right: Column): Column = Column.fn("contains", left, right)
+
+  /**
+   * Returns the `n`-th input, e.g., returns `input2` when `n` is 2. The function returns NULL if
+   * the index exceeds the length of the array and `spark.sql.ansi.enabled` is set to false. If
+   * `spark.sql.ansi.enabled` is set to true, it throws ArrayIndexOutOfBoundsException for invalid
+   * indices.
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  @scala.annotation.varargs
+  def elt(inputs: Column*): Column = Column.fn("elt", inputs: _*)
+
+  /**
+   * Returns the index (1-based) of the given string (`str`) in the comma-delimited list
+   * (`strArray`). Returns 0, if the string was not found or if the given string (`str`) contains
+   * a comma.
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  def find_in_set(str: Column, strArray: Column): Column = Column.fn("find_in_set", str, strArray)
+
+  /**
+   * Returns true if str matches `pattern` with `escape`, null if any arguments are null, false
+   * otherwise.
+   *
+   * @group string_funcs
+   * @since 3.5.0
+   */
+  def like(str: Column, pattern: Column): Column = Column.fn("like", str, pattern)

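For context, here is a minimal usage sketch of the functions added in this hunk. It is a sketch only, not part of the PR; it assumes an existing `SparkSession` named `spark` on a build where these `@since 3.5.0` functions are available, and the DataFrame and column names are illustrative.

```scala
import org.apache.spark.sql.functions._
import spark.implicits._  // `spark` is an assumed, pre-existing SparkSession

// A toy DataFrame with one string column.
val df = Seq("  Spark  ", "xSparkx").toDF("s")

df.select(
  btrim($"s"),                          // strip leading/trailing spaces
  btrim($"s", lit("x")),                // strip leading/trailing 'x' characters
  char_length($"s"),                    // character length (trailing spaces count)
  character_length($"s"),               // same as char_length
  chr(lit(65)),                         // ASCII 65 => "A"
  contains($"s", lit("Spark")),         // true if "Spark" occurs in s
  elt(lit(2), lit("a"), lit("b")),      // selects the 2nd input, i.e. "b"
  find_in_set(lit("b"), lit("a,b,c")),  // 1-based index in the comma-delimited list => 2
  like($"s", lit("%Spark%"))            // SQL LIKE pattern match
).show()
```
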
Review Comment:
   @beliefer @zhengruifeng
   I don't think it's necessary to add this function (a `like(str, pattern, escapeChar)` variant), for the following reasons:
   - https://spark.apache.org/docs/latest/sql-ref-functions-builtin.html#string-functions
   <img width="238" alt="image" src="https://github.com/apache/spark/assets/15246973/ac39bdfc-45c1-46fa-98e9-a82bb6c47449">
   
   There, `like` only has the two-parameter usage form. If a third parameter (the escape character) needs to be used, its usage form is:
   <img width="569" alt="image" src="https://github.com/apache/spark/assets/15246973/66da2991-c62c-44b4-bc27-19406fbe4ee6">
   rather than `like(str, pattern, escapeChar)`.
   
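   For example (a sketch, assuming an existing `SparkSession` named `spark`; the literal values are made up), the escape character is supplied through SQL's `ESCAPE` clause:
   ```scala
   // Sketch: when an escape character is needed, the usage form is
   // `str LIKE pattern ESCAPE escapeChar` in SQL, not a third function argument.
   // Here '$' escapes the '%' so the pattern matches a literal percent sign.
   spark.sql("SELECT 'The cost is 100%' LIKE '%100$%%' ESCAPE '$'").show()  // true
   ```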
   - In my first local version, I actually implemented this method, and I got an error when executing `SPARK_GENERATE_GOLDEN_FILES=1 build/sbt "connect/testOnly org.apache.spark.sql.connect.ProtoToParsedPlanTestSuite"`:
   <img width="1412" alt="image" src="https://github.com/apache/spark/assets/15246973/35e04901-0ba3-4cac-a5b5-9d659315fb1b">
   
   It triggers the following detection logic:
   https://github.com/apache/spark/blob/f44d9238e40448689477b26dd2ddb4012e5bca5e/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala#L139-L145
   
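   In short (my paraphrase of that check): the registry only builds an expression whose constructor arity matches the number of arguments passed, so a call with an unexpected argument count fails analysis. A hypothetical sketch of that kind of validation, not the actual Spark code at the linked lines:
   ```scala
   // Hypothetical arity check; NOT the code at the linked FunctionRegistry lines.
   def validateArity(name: String, argCount: Int, validArities: Seq[Int]): Unit = {
     if (!validArities.contains(argCount)) {
       throw new IllegalArgumentException(
         s"Invalid number of arguments for function $name. " +
           s"Expected one of: ${validArities.mkString(", ")}; found: $argCount")
     }
   }

   validateArity("like", argCount = 3, validArities = Seq(2))  // would throw here
   ```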