Daniel-Davies commented on PR #38867:
URL: https://github.com/apache/spark/pull/38867#issuecomment-1345642760

   @LuciferYang @zhengruifeng @HyukjinKwon I think this one is more or less ready 
for a full review; I've tried to follow the comments and guidance given on the 
[array_append](https://github.com/apache/spark/pull/38865) PR. The main 
question I'd appreciate guidance on is what should happen in the following two 
edge cases:
   - Input item parameter is null: this has already been discussed on 
array_append, where the decision was to return a null result when the input 
item parameter is null, but let me know if it should be different for this 
function
   - Index out of bounds: if the user tries to insert an item into the array at 
an index larger than the (new) array size, this PR returns a null result. 
Alternatives here are to (a) perform an array_append (or prepend, if the index 
is negative), (b) implement the Snowflake behaviour of filling the array with 
nulls up to the index (see the existing discussion on this PR), or (c) make it 
a no-op
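   To make the two edge cases concrete, here is a minimal Python model of the semantics as currently proposed in this PR (a hypothetical sketch for discussion, not Spark's actual implementation; it assumes a 1-based positive index and omits negative-index handling):

   ```python
   def array_insert(arr, pos, item):
       # Edge case 1: a null item yields a null result (matching array_append).
       if item is None:
           return None
       # Edge case 2: an index past the (new) array size yields a null result.
       # len(arr) + 1 is the last valid position, i.e. appending at the end.
       if pos < 1 or pos > len(arr) + 1:
           return None
       # Normal case: insert item before the 1-based position pos.
       return arr[:pos - 1] + [item] + arr[pos - 1:]
   ```

   Under alternative (a) the out-of-bounds branch would append instead of returning null, and under (b) it would pad the gap with nulls before inserting, as Snowflake does.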
   
   Thanks all for the input so far; I hope the code is sufficient in its 
current form.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

