[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-02-10 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a function commonly supported by other systems such as 
Postgres. The Spark equivalent is 
`element_at(split(arg, delim), part)`.



h5. Function Specification

h6. Syntax

{code:java}
split_part(str, delimiter, partNum)
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and returns the requested part 
of the split (1-based).
2. If any input parameter is NULL, returns NULL.
3. If the index is out of range of the split parts, returns an empty string.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of 
the string.
6. If `delimiter` is empty, `str` is considered not split, so there is just 
one split part: the whole string.
{code}

h6. Examples
{code:java}
> SELECT _FUNC_('11.12.13', '.', 3);
13
> SELECT _FUNC_(NULL, '.', 3);
NULL
> SELECT _FUNC_('11.12.13', '', 1);
'11.12.13'
{code}
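The notes above can be sketched as a small reference implementation. This is an illustrative Python sketch of the proposed semantics only, not Spark's actual implementation; the name `split_part` and the `ValueError` for `partNum = 0` are assumptions for the sketch.

```python
def split_part(s, delimiter, part_num):
    """Reference sketch of the proposed split_part semantics (1-based)."""
    if s is None or delimiter is None or part_num is None:
        return None  # rule 2: any NULL input yields NULL
    if part_num == 0:
        raise ValueError("partNum must not be 0")  # rule 4
    # rule 6: an empty delimiter means the string is a single, unsplit part
    parts = [s] if delimiter == "" else s.split(delimiter)
    # rule 5: a negative partNum counts backward from the end
    idx = part_num - 1 if part_num > 0 else len(parts) + part_num
    if idx < 0 or idx >= len(parts):
        return ""  # rule 3: out-of-range index yields an empty string
    return parts[idx]

print(split_part("11.12.13", ".", 3))   # -> 13
print(split_part("11.12.13", ".", -1))  # -> 13
print(split_part("11.12.13", "", 1))    # -> 11.12.13
```

This mirrors the examples above; the only behavior the SQL examples do not exercise is the out-of-range and `partNum = 0` handling, covered here by rules 3 and 4.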







> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-02-10 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton

h6. Syntax

{code:java}
split_part(str, delimiter, partNum)
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns empty stirng.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of the 
string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}

h6. Examples
{code:java}
> SELECT _FUNC_('11.12.13', '.', 3);
13
> SELECT _FUNC_(NULL, '.', 3);
NULL
> SELECT _FUNC_('11.12.13', '', 1);
'11.12.13'
{code}






  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton

h6. Syntax

{code:java}
split_part(str, delimiter, partNum)
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of the 
string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}

h6. Examples
{code:java}
> SELECT _FUNC_('11.12.13', '.', 3);
13
> SELECT _FUNC_(NULL, '.', 3);
NULL
> SELECT _FUNC_('11.12.13', '', 1);
'11.12.13'
{code}







> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a commonly supported function by other systems such as 
> Postgres and some other systems. The Spark equivalent  is 
> `element_at(split(arg, delim), part)`
> h5. Function Specificaiton
> h6. Syntax
> {code:java}
> split_part(str, delimiter, partNum)
> {code}
> h6. Arguments
> {code:java}
> str: string type
> delimiter: string type
> partNum: Integer type
> {code}
> h6. Note
> {code:java}
> 1. This function splits `str` by `delimiter` and return requested part of the 
> split (1-based). 
> 2. If any input parameter is NULL, return NULL.
> 3. If  the index is out of range of split parts, returns empty stirng.
> 4. If `partNum` is 0, throws an error.
> 5. If `partNum` is negative, the parts are counted backward from the end of 
> the string
> 6. when delimiter is empty, str is considered not split thus there is just 1 
> split part. 
> {code}
> h6. Examples
> {code:java}
> > SELECT _FUNC_('11.12.13', '.', 3);
> 13
> > SELECT _FUNC_(NULL, '.', 3);
> NULL
> > SELECT _FUNC_('11.12.13', '', 1);
> '11.12.13'
> {code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-02-09 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton

h6. Syntax

{code:java}
split_part(str, delimiter, partNum)
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of the 
string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}

h6. Examples
{code:java}
> SELECT _FUNC_('11.12.13', '.', 3);
13
> SELECT _FUNC_(NULL, '.', 3);
NULL
> SELECT _FUNC_('11.12.13', '', 1);
'11.12.13'
{code}






  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton

h6. Syntax

{code:java}
split_part(str, delimiter, partNum)
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of the 
string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}

> SELECT _FUNC_('11.12.13', '.', 3);
13
> SELECT _FUNC_(NULL, '.', 3);
NULL
> SELECT _FUNC_('11.12.13', '', 1);
'11.12.13'
{code}







> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a commonly supported function by other systems such as 
> Postgres and some other systems. The Spark equivalent  is 
> `element_at(split(arg, delim), part)`
> h5. Function Specificaiton
> h6. Syntax
> {code:java}
> split_part(str, delimiter, partNum)
> {code}
> h6. Arguments
> {code:java}
> str: string type
> delimiter: string type
> partNum: Integer type
> {code}
> h6. Note
> {code:java}
> 1. This function splits `str` by `delimiter` and return requested part of the 
> split (1-based). 
> 2. If any input parameter is NULL, return NULL.
> 3. If  the index is out of range of split parts, returns null.
> 4. If `partNum` is 0, throws an error.
> 5. If `partNum` is negative, the parts are counted backward from the end of 
> the string
> 6. when delimiter is empty, str is considered not split thus there is just 1 
> split part. 
> {code}
> h6. Examples
> {code:java}
> > SELECT _FUNC_('11.12.13', '.', 3);
> 13
> > SELECT _FUNC_(NULL, '.', 3);
> NULL
> > SELECT _FUNC_('11.12.13', '', 1);
> '11.12.13'
> {code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-02-09 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton

h6. Syntax

{code:java}
split_part(str, delimiter, partNum)
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of the 
string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}

h6.Examples:
{code:java}
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
{code}






  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton

h6. Syntax

{code:java}
`split_part(str, delimiter, partNum)`
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the
  end of the string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}

h6.Examples:
{code:java}
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
{code}







> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a commonly supported function by other systems such as 
> Postgres and some other systems. The Spark equivalent  is 
> `element_at(split(arg, delim), part)`
> h5. Function Specificaiton
> h6. Syntax
> {code:java}
> split_part(str, delimiter, partNum)
> {code}
> h6. Arguments
> {code:java}
> str: string type
> delimiter: string type
> partNum: Integer type
> {code}
> h6. Note
> {code:java}
> 1. This function splits `str` by `delimiter` and return requested part of the 
> split (1-based). 
> 2. If any input parameter is NULL, return NULL.
> 3. If  the index is out of range of split parts, returns null.
> 4. If `partNum` is 0, throws an error.
> 5. If `partNum` is negative, the parts are counted backward from the end of 
> the string
> 6. when delimiter is empty, str is considered not split thus there is just 1 
> split part. 
> {code}
> h6.Examples:
> {code:java}
>   > SELECT _FUNC_('11.12.13', '.', 3);
>13
>   > SELECT _FUNC_(NULL, '.', 3);
>   NULL
>   > SELECT _FUNC_('11.12.13', '', 1);
>   '11.12.13'
> {code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-02-09 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton

h6. Syntax

{code:java}
split_part(str, delimiter, partNum)
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of the 
string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}

> SELECT _FUNC_('11.12.13', '.', 3);
13
> SELECT _FUNC_(NULL, '.', 3);
NULL
> SELECT _FUNC_('11.12.13', '', 1);
'11.12.13'
{code}






  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton

h6. Syntax

{code:java}
split_part(str, delimiter, partNum)
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the end of the 
string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}

h6.Examples:
{code:java}
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
{code}







> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a commonly supported function by other systems such as 
> Postgres and some other systems. The Spark equivalent  is 
> `element_at(split(arg, delim), part)`
> h5. Function Specificaiton
> h6. Syntax
> {code:java}
> split_part(str, delimiter, partNum)
> {code}
> h6. Arguments
> {code:java}
> str: string type
> delimiter: string type
> partNum: Integer type
> {code}
> h6. Note
> {code:java}
> 1. This function splits `str` by `delimiter` and return requested part of the 
> split (1-based). 
> 2. If any input parameter is NULL, return NULL.
> 3. If  the index is out of range of split parts, returns null.
> 4. If `partNum` is 0, throws an error.
> 5. If `partNum` is negative, the parts are counted backward from the end of 
> the string
> 6. when delimiter is empty, str is considered not split thus there is just 1 
> split part. 
> {code}
> > SELECT _FUNC_('11.12.13', '.', 3);
> 13
> > SELECT _FUNC_(NULL, '.', 3);
> NULL
> > SELECT _FUNC_('11.12.13', '', 1);
> '11.12.13'
> {code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-02-09 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton

h6. Syntax

{code:java}
`split_part(str, delimiter, partNum)`
{code}

h6. Arguments
{code:java}
str: string type
delimiter: string type
partNum: Integer type
{code}

h6. Note
{code:java}
1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the
  end of the string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}

h6.Examples:
{code:java}
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
{code}






  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton


{code:java}

`split_part(str, delimiter, partNum)`

str: string type
delimiter: string type
partNum: Integer type

1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the
  end of the string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 

Examples:
```
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
```
{code}






> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a commonly supported function by other systems such as 
> Postgres and some other systems. The Spark equivalent  is 
> `element_at(split(arg, delim), part)`
> h5. Function Specificaiton
> h6. Syntax
> {code:java}
> `split_part(str, delimiter, partNum)`
> {code}
> h6. Arguments
> {code:java}
> str: string type
> delimiter: string type
> partNum: Integer type
> {code}
> h6. Note
> {code:java}
> 1. This function splits `str` by `delimiter` and return requested part of the 
> split (1-based). 
> 2. If any input parameter is NULL, return NULL.
> 3. If  the index is out of range of split parts, returns null.
> 4. If `partNum` is 0, throws an error.
> 5. If `partNum` is negative, the parts are counted backward from the
>   end of the string
> 6. when delimiter is empty, str is considered not split thus there is just 1 
> split part. 
> {code}
> h6.Examples:
> {code:java}
>   > SELECT _FUNC_('11.12.13', '.', 3);
>13
>   > SELECT _FUNC_(NULL, '.', 3);
>   NULL
>   > SELECT _FUNC_('11.12.13', '', 1);
>   '11.12.13'
> {code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-02-09 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton


{code:java}

`split_part(str, delimiter, partNum)`

str: string type
delimiter: string type
partNum: Integer type

1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the
  end of the string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 

Examples:
```
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
```
{code}





  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton


{code:java}

`split_part(str, delimiter, partNum)`

str: string type
delimiter: string type
partNum: Integer type

1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the
  end of the string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}


Examples:
```
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
```



> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a commonly supported function by other systems such as 
> Postgres and some other systems. The Spark equivalent  is 
> `element_at(split(arg, delim), part)`
> h5. Function Specificaiton
> {code:java}
> `split_part(str, delimiter, partNum)`
> str: string type
> delimiter: string type
> partNum: Integer type
> 1. This function splits `str` by `delimiter` and return requested part of the 
> split (1-based). 
> 2. If any input parameter is NULL, return NULL.
> 3. If  the index is out of range of split parts, returns null.
> 4. If `partNum` is 0, throws an error.
> 5. If `partNum` is negative, the parts are counted backward from the
>   end of the string
> 6. when delimiter is empty, str is considered not split thus there is just 1 
> split part. 
> Examples:
> ```
>   > SELECT _FUNC_('11.12.13', '.', 3);
>13
>   > SELECT _FUNC_(NULL, '.', 3);
>   NULL
>   > SELECT _FUNC_('11.12.13', '', 1);
>   '11.12.13'
> ```
> {code}



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-02-09 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



h5. Function Specificaiton


{code:java}

`split_part(str, delimiter, partNum)`

str: string type
delimiter: string type
partNum: Integer type

1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the
  end of the string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}


Examples:
```
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
```


  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



The following demonstrates more about the new function:


{code:java}

`split_part(str, delimiter, partNum)`

str: string type
delimiter: string type
partNum: Integer type

1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the
  end of the string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}


Examples:
```
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
```



> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a commonly supported function by other systems such as 
> Postgres and some other systems. The Spark equivalent  is 
> `element_at(split(arg, delim), part)`
> h5. Function Specificaiton
> {code:java}
> `split_part(str, delimiter, partNum)`
> str: string type
> delimiter: string type
> partNum: Integer type
> 1. This function splits `str` by `delimiter` and return requested part of the 
> split (1-based). 
> 2. If any input parameter is NULL, return NULL.
> 3. If  the index is out of range of split parts, returns null.
> 4. If `partNum` is 0, throws an error.
> 5. If `partNum` is negative, the parts are counted backward from the
>   end of the string
> 6. when delimiter is empty, str is considered not split thus there is just 1 
> split part. 
> {code}
> Examples:
> ```
>   > SELECT _FUNC_('11.12.13', '.', 3);
>13
>   > SELECT _FUNC_(NULL, '.', 3);
>   NULL
>   > SELECT _FUNC_('11.12.13', '', 1);
>   '11.12.13'
> ```



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-02-09 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



The following demonstrates more about the new function:


{code:java}

`split_part(str, delimiter, partNum)`

str: string type
delimiter: string type
partNum: Integer type

1. This function splits `str` by `delimiter` and return requested part of the 
split (1-based). 
2. If any input parameter is NULL, return NULL.
3. If  the index is out of range of split parts, returns null.
4. If `partNum` is 0, throws an error.
5. If `partNum` is negative, the parts are counted backward from the
  end of the string
6. when delimiter is empty, str is considered not split thus there is just 1 
split part. 
{code}


Examples:
```
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
  > SELECT _FUNC_('11.12.13', '', 1);
  '11.12.13'
```


  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems. The Spark equivalent  is 
`element_at(split(arg, delim), part)`



The following demonstrates more about the new function:

`split_part(str, delimiter, partNum)`

This function splits `str` by `delimiter` and return requested part of the 
split (1-based). If any input parameter is NULL, return NULL.

`str` and `delimiter` are the same type as `string`. `partNum` is `integer` type

Examples:
```
  > SELECT _FUNC_('11.12.13', '.', 3);
   13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
```



> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a commonly supported function by other systems such as 
> Postgres and some other systems. The Spark equivalent  is 
> `element_at(split(arg, delim), part)`
> The following demonstrates more about the new function:
> {code:java}
> `split_part(str, delimiter, partNum)`
> str: string type
> delimiter: string type
> partNum: Integer type
> 1. This function splits `str` by `delimiter` and returns the requested part of the split (1-based).
> 2. If any input parameter is NULL, returns NULL.
> 3. If the index is out of range of split parts, returns null.
> 4. If `partNum` is 0, throws an error.
> 5. If `partNum` is negative, the parts are counted backward from the end of the string.
> 6. When `delimiter` is empty, `str` is considered not split, so there is just one split part.
> {code}
> Examples:
> ```
>   > SELECT _FUNC_('11.12.13', '.', 3);
>   13
>   > SELECT _FUNC_(NULL, '.', 3);
>   NULL
>   > SELECT _FUNC_('11.12.13', '', 1);
>   '11.12.13'
> ```



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-01-31 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a function commonly supported by other systems such as 
Postgres. The Spark equivalent is `element_at(split(arg, delim), part)`.



The following demonstrates more about the new function:

`split_part(str, delimiter, partNum)`

This function splits `str` by `delimiter` and returns the requested part of the 
split (1-based). If any input parameter is NULL, returns NULL.

`str` and `delimiter` are of `string` type; `partNum` is of `integer` type.

Examples:
```
  > SELECT _FUNC_('11.12.13', '.', 3);
  13
  > SELECT _FUNC_(NULL, '.', 3);
  NULL
```


  was:
`split_part()` is a function commonly supported by other systems such as 
Postgres.

The Spark equivalent  is `element_at(split(arg, delim), part)`

If any input parameter is NULL, return NULL.


> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a function commonly supported by other systems such as 
> Postgres. The Spark equivalent is `element_at(split(arg, delim), part)`.
> The following demonstrates more about the new function:
> `split_part(str, delimiter, partNum)`
> This function splits `str` by `delimiter` and returns the requested part of the 
> split (1-based). If any input parameter is NULL, returns NULL.
> `str` and `delimiter` are of `string` type; `partNum` is of `integer` type.
> Examples:
> ```
>   > SELECT _FUNC_('11.12.13', '.', 3);
>   13
>   > SELECT _FUNC_(NULL, '.', 3);
>   NULL
> ```






[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-01-31 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a function commonly supported by other systems such as 
Postgres.

The Spark equivalent  is `element_at(split(arg, delim), part)`

If any input parameter is NULL, return NULL.

  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres and some other systems.

The Spark equivalent (please double check) is `element_at(split(arg, delim), 
part)`

If any input parameter is NULL, return NULL.


> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a function commonly supported by other systems such as 
> Postgres.
> The Spark equivalent  is `element_at(split(arg, delim), part)`
> If any input parameter is NULL, return NULL.






[jira] [Updated] (SPARK-38063) Support SQL split_part function

2022-01-31 Thread Rui Wang (Jira)


 [ 
https://issues.apache.org/jira/browse/SPARK-38063?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Rui Wang updated SPARK-38063:
-
Description: 
`split_part()` is a function commonly supported by other systems such as 
Postgres.

The Spark equivalent (please double check) is `element_at(split(arg, delim), 
part)`

If any input parameter is NULL, return NULL.

  was:
`split_part()` is a commonly supported function by other systems such as 
Postgres, Snowflake, Redshift.

The Spark equivalent (please double check) is `element_at(split(arg, delim), 
part)`

Note that Snowflake supports negative index which counts from the back. We 
should automatically support that by using `element_at()`.

If any input parameter is NULL, return NULL.


> Support SQL split_part function
> ---
>
> Key: SPARK-38063
> URL: https://issues.apache.org/jira/browse/SPARK-38063
> Project: Spark
>  Issue Type: Task
>  Components: SQL
>Affects Versions: 3.3.0
>Reporter: Rui Wang
>Priority: Major
>
> `split_part()` is a function commonly supported by other systems such as 
> Postgres.
> The Spark equivalent (please double check) is `element_at(split(arg, delim), 
> part)`
> If any input parameter is NULL, return NULL.


