Github user viirya commented on the issue:

    https://github.com/apache/spark/pull/20217
  
    > Like other people mentioned before, it's really confusing to have so many 
ways to register a UDF in PySpark, while Java/Scala API is cleaner.
    
    I agree. When I looked at this change, I wondered why there are so many 
ways to do it. It also seems inconsistent that we have `registerFunction` in 
`Catalog` and `SQLContext`, but `register` in `UDFRegistration`.
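    For illustration, a minimal sketch of the overlapping entry points (the 
function names `plus_one_a`/`plus_one_b` are hypothetical; this assumes the 
PySpark 2.x API being discussed in this PR, where `Catalog.registerFunction` 
and `UDFRegistration.register` coexist):

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql.types import IntegerType

    spark = SparkSession.builder.master("local[1]").appName("udf-demo").getOrCreate()

    # Entry point 1: UDFRegistration.register, via spark.udf
    spark.udf.register("plus_one_a", lambda x: x + 1, IntegerType())

    # Entry point 2: Catalog.registerFunction, via spark.catalog
    # (mirrors the older SQLContext.registerFunction)
    spark.catalog.registerFunction("plus_one_b", lambda x: x + 1, IntegerType())

    # Both names resolve to the same kind of SQL-callable UDF
    row = spark.sql("SELECT plus_one_a(1), plus_one_b(1)").first()
    ```

    Both calls end up registering a SQL-callable UDF, which is why the naming 
difference reads as an inconsistency rather than a real distinction.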
