bersprockets opened a new pull request, #37763:
URL: https://github.com/apache/spark/pull/37763

   ### What changes were proposed in this pull request?
   
   Remove the check for foldable delimiter arguments from `StringToMap#checkInputDataTypes`.
   
   Aside from the check in `checkInputDataTypes`, `StringToMap` already handles non-foldable delimiter arguments; no other changes are required.
   
   ### Why are the changes needed?
   
   Hive 2.3.9 allows non-foldable delimiter arguments, e.g.:
   ```
   drop table if exists maptbl;
   create table maptbl as select ',' as del1, ':' as del2, 'a:1,b:2,c:3' as str;
   insert into table maptbl select '%' as del1, '-' as del2, 'a-1%b-2%c-3' as str;
   select str, str_to_map(str, del1, del2) from maptbl;
   ```
   This returns
   ```
   +--------------+----------------------------+
   |     str      |            _c1             |
   +--------------+----------------------------+
   | a:1,b:2,c:3  | {"a":"1","b":"2","c":"3"}  |
   | a-1%b-2%c-3  | {"a":"1","b":"2","c":"3"}  |
   +--------------+----------------------------+
   2 rows selected (0.13 seconds)
   ```
   However, Spark returns an error:
   ```
   str_to_map's delimiters must be foldable.; line 1 pos 12;
   ```
   
   A more realistic use case looks something like this:
   ```
   select
     str,
     str_to_map(str, ',', if(region = 0, ':', '#')) as m
   from
     maptbl2;
   ```
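   To illustrate the per-row semantics this change enables, here is a minimal standalone sketch (independent of Spark's actual `StringToMap` implementation; `StrToMapSketch` and `strToMap` are hypothetical names used only for illustration). Like Spark's `str_to_map`, it interprets both delimiters as regular expressions:
   ```java
   import java.util.LinkedHashMap;
   import java.util.Map;

   public class StrToMapSketch {
       // Illustrative helper mirroring str_to_map semantics: split `str`
       // into pairs on pairDelim, then split each pair on keyValueDelim.
       // Both delimiters are interpreted as regular expressions.
       static Map<String, String> strToMap(String str, String pairDelim, String keyValueDelim) {
           Map<String, String> result = new LinkedHashMap<>();
           for (String pair : str.split(pairDelim, -1)) {
               String[] parts = pair.split(keyValueDelim, 2);
               result.put(parts[0], parts.length == 2 ? parts[1] : null);
           }
           return result;
       }

       public static void main(String[] args) {
           // The delimiters may differ per row, as in the Hive example above.
           System.out.println(strToMap("a:1,b:2,c:3", ",", ":"));
           System.out.println(strToMap("a-1%b-2%c-3", "%", "-"));
           // both print {a=1, b=2, c=3}
       }
   }
   ```
   Because nothing in the split logic depends on the delimiters being constants, allowing them to vary per row only requires lifting the foldability restriction in the analyzer-time check.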
   
   ### Does this PR introduce _any_ user-facing change?
   
   Yes, users can now specify non-foldable delimiter arguments to `str_to_map`.
   
   Literals are still accepted, so the change is backward compatible.
   
   ### How was this patch tested?
   
   New unit tests.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

