Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22063#discussion_r212800597
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/functions.scala ---
    @@ -3713,7 +3726,7 @@ object functions {
           | */
           |  val ScalaReflection.Schema(dataType, nullable) = 
ScalaReflection.schemaFor[RT]
    -      |  val inputTypes = Try($inputTypes).toOption
    +      |  val inputTypes = Try($argSchema).toOption
    --- End diff --
    
    @cloud-fan might be worth another look now. After making this change, I realize the whole reason it started failing may have been that I had moved the schema inference outside the `Try()`. Now it's back inside, which may make the whole problem silent again. Did you mean that you preferred tackling the problem directly rather than suppressing the failure to infer a schema? I added `udfInternal` above for that.
    
    But maybe this isn't the best approach, as user UDFs could fail for the same reason. Now that I understand what's happening from your comments, maybe I need to back this whole change out after all.
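    For context, the behavioral difference being discussed: wrapping schema inference in `Try(...).toOption` converts a thrown exception into `None`, so an unsupported type fails silently rather than at registration time. A minimal sketch, using a hypothetical `inferSchema` stand-in for `ScalaReflection.schemaFor` (which can throw for unsupported types):
    
    ```scala
    import scala.util.Try
    
    object TrySilencingSketch {
      // Hypothetical stand-in for ScalaReflection.schemaFor: throws when no
      // schema can be inferred for the type.
      def inferSchema(supported: Boolean): String =
        if (supported) "IntegerType"
        else throw new UnsupportedOperationException("Schema for type not supported")
    
      def main(args: Array[String]): Unit = {
        // Inside Try: the failure is swallowed and becomes None, so the
        // caller never sees why inference failed.
        val silent: Option[String] = Try(inferSchema(supported = false)).toOption
        println(silent) // None
    
        // Outside Try: the same failure surfaces immediately as an exception,
        // which is the "loud" behavior the earlier change introduced.
        // val loud = inferSchema(supported = false) // would throw here
      }
    }
    ```
    
    Whether the silent or loud behavior is preferable is exactly the design question raised above.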


---
