wmoustafa commented on pull request #24559:
URL: https://github.com/apache/spark/pull/24559#issuecomment-731427427


   > I don't think that we would want to build support for a generic framework 
into Spark itself. I think Spark's API should be specific to Spark, just like 
the data source APIs are specific to Spark. That avoids complications like 
converting to Row or another representation for Hive.
   
   I think there are two types of APIs: Function Catalog APIs and UDF expression 
APIs (e.g., Generic UDFs). I mentioned the Transport API as a way to implement the 
latter, and wanted to get your thoughts on how friendly the Function 
Catalog APIs would be to UDF expression APIs like Transport. To the user, Transport 
provides tools that make type validation and inference user-friendly (declared 
through type signatures), along with Java types that map to their SQL counterparts. To 
Spark, it is just an Expression API processing InternalRows.
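
   To illustrate the idea (this is a hypothetical sketch in the spirit of Transport's 
declarative signatures, not the actual Transport API; the interface and method names 
here are invented for illustration): the UDF author declares SQL type signatures and 
writes `eval` against plain Java types, while an engine-specific adapter handles the 
mapping from engine-native rows (e.g., Spark's InternalRow) to those Java types.

```java
import java.util.List;

// Hypothetical engine-agnostic UDF contract: SQL type signatures are
// declared declaratively, so the engine can do type validation and
// inference without the author writing engine-specific code.
interface SqlUdf {
    List<String> inputTypeSignatures();  // e.g., ["varchar", "varchar"]
    String outputTypeSignature();        // e.g., "varchar"
}

// The author works with plain Java types; an engine adapter (not shown)
// would convert from InternalRow (Spark), Writables (Hive), etc.
final class ConcatUdf implements SqlUdf {
    @Override
    public List<String> inputTypeSignatures() {
        return List.of("varchar", "varchar");
    }

    @Override
    public String outputTypeSignature() {
        return "varchar";
    }

    // Engine-independent evaluation logic on Java types.
    public String eval(String left, String right) {
        return left + right;
    }
}

public class Demo {
    public static void main(String[] args) {
        ConcatUdf udf = new ConcatUdf();
        System.out.println(udf.inputTypeSignatures()); // declared input types
        System.out.println(udf.eval("foo", "bar"));    // plain-Java evaluation
    }
}
```

   The point of the sketch is the split of responsibilities: the signature methods 
give the function catalog enough information for resolution and type checking, while 
to Spark the wrapped function remains an ordinary Expression over InternalRows.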

