rdblue commented on pull request #3367: URL: https://github.com/apache/iceberg/pull/3367#issuecomment-951337115
@snazy, @aokolnychyi, I'm not sure that adding the ability to plug in stored procedures is a good idea for Iceberg or for Spark in general. Stored procedures are part of Iceberg's Spark extensions because we wanted a way to call actions before we could get stored procedures upstream in Spark. Adding more complexity here reduces the chances that we will be able to get the interfaces added to Spark, so I would prefer not to add too much here that risks not being able to move it upstream.

In addition, I'm not sure that this fits with how code is plugged into Spark. I think that stored procedures should be added through a catalog that can define them. Catalogs are the primary way to plug things like this into Spark, so I'm not sure why Spark would support a second way to plug in specific procedures. How a catalog manages procedures is up to the implementation.

We could add this as an Iceberg catalog feature, but then the design may end up differing from what we eventually move upstream. So we should first clarify: is this an Iceberg-only feature, or is this something that would go upstream?
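(For reference, a minimal sketch of the catalog-centric shape I have in mind. The interface and method names below are illustrative assumptions for discussion, not the exact Iceberg or Spark definitions.)

```java
// Illustrative sketch only: ProcedureCatalog, Procedure, and loadProcedure are
// assumed names showing how a catalog could own and expose stored procedures,
// rather than Spark supporting a separate plug-in mechanism for them.
import org.apache.spark.sql.catalyst.InternalRow;
import org.apache.spark.sql.connector.catalog.Identifier;
import org.apache.spark.sql.connector.catalog.TableCatalog;
import org.apache.spark.sql.types.StructType;

/** A catalog that can define and expose stored procedures in addition to tables. */
interface ProcedureCatalog extends TableCatalog {
  /** Resolve a procedure by identifier, e.g. CALL catalog.system.some_procedure(...). */
  Procedure loadProcedure(Identifier ident);
}

/** A callable procedure; how it is created and managed is up to the catalog implementation. */
interface Procedure {
  /** Schema of the rows returned by the call. */
  StructType outputType();

  /** Execute the procedure with the bound argument row and return result rows. */
  InternalRow[] call(InternalRow args);
}
```

With something like this, a `CALL` statement only needs to ask the catalog for the procedure, so the catalog remains the single extension point.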

-- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.