kumarUjjawal commented on code in PR #19065:
URL: https://github.com/apache/datafusion/pull/19065#discussion_r2585322649


##########
datafusion/spark/src/function/math/width_bucket.rs:
##########
@@ -77,7 +123,16 @@ impl ScalarUDFImpl for SparkWidthBucket {
     }
 
     fn invoke_with_args(&self, args: ScalarFunctionArgs) -> Result<ColumnarValue> {
-        make_scalar_function(width_bucket_kern, vec![])(&args.args)
+        let [value, minv, maxv, buckets] = take_function_args(self.name(), &args.args)?;
+
+        let arrays = vec![
+            value.to_array(args.number_rows)?,
+            minv.to_array(args.number_rows)?,
+            maxv.to_array(args.number_rows)?,
+            buckets.to_array(args.number_rows)?,
+        ];
+
+        width_bucket_kern(&arrays).map(ColumnarValue::Array)

Review Comment:
   Correct me if my understanding is wrong: I only swapped out
   `make_scalar_function` so I could call `take_function_args` up front. Once the
   UDF no longer uses `Signature::user_defined`, we need the shared
   `take_function_args` guard to produce the standard "failed to match any
   signature" planner error. If we want to keep the helper, I can move the
   `take_function_args` call into the closure; the only intended behavior change
   is hooking into the common arity handling.
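
   For illustration, a minimal sketch of that alternative (keeping
   `make_scalar_function` and moving the arity check into the closure) could look
   roughly like the following. It assumes `width_bucket_kern` has the
   `fn(&[ArrayRef]) -> Result<ArrayRef>` shape implied by the removed
   `make_scalar_function(width_bucket_kern, vec![])` call, and that the same
   `take_function_args` / `make_scalar_function` helpers are in scope:

   ```rust
   // Sketch only: keeps make_scalar_function and performs the arity check
   // inside the closure via take_function_args.
   fn invoke_with_args(&self, args: ScalarFunctionArgs) -> Result<ColumnarValue> {
       let name = self.name().to_string();
       make_scalar_function(
           move |arrays: &[ArrayRef]| {
               // Destructuring through take_function_args still rejects a
               // wrong argument count with the shared error message.
               let [_value, _minv, _maxv, _buckets] = take_function_args(&name, arrays)?;
               width_bucket_kern(arrays)
           },
           vec![],
       )(&args.args)
   }
   ```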



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

