RussellSpitzer commented on code in PR #38823:
URL: https://github.com/apache/spark/pull/38823#discussion_r1049247458
##########
sql/catalyst/src/main/java/org/apache/spark/sql/connector/catalog/TableProvider.java:
##########
@@ -93,4 +93,18 @@ default Transform[] inferPartitioning(CaseInsensitiveStringMap options) {
   default boolean supportsExternalMetadata() {
     return false;
   }
+
+  /**
+   * Returns true if the source supports defining generated columns upon table creation in SQL.
+   * When false: any create/replace table statements with a generated column defined in the table
+   * schema will throw an exception during analysis.
+   *
+   * A generated column is defined with syntax: {@code colName colType GENERATED ALWAYS AS (expr)}
+   * The generation expression is stored in the column metadata with key "generationExpression".
+   *
+   * Override this method to allow defining generated columns in create/replace table statements.
+   */
+  default boolean supportsGeneratedColumnsOnCreation() {
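
For context, a minimal sketch of how a source might opt in through the method proposed in this diff; `MyTableProvider` and the stubbed-out `inferSchema`/`getTable` bodies are hypothetical placeholders, not part of the PR.

```java
import java.util.Map;

import org.apache.spark.sql.connector.catalog.Table;
import org.apache.spark.sql.connector.catalog.TableProvider;
import org.apache.spark.sql.connector.expressions.Transform;
import org.apache.spark.sql.types.StructType;
import org.apache.spark.sql.util.CaseInsensitiveStringMap;

// Hypothetical data source opting in via the flag proposed in this diff; the class
// name and the stubbed-out methods below are placeholders.
public class MyTableProvider implements TableProvider {

  @Override
  public StructType inferSchema(CaseInsensitiveStringMap options) {
    // A real source would derive a schema here; elided for the sketch.
    throw new UnsupportedOperationException("not relevant to this sketch");
  }

  @Override
  public Table getTable(StructType schema, Transform[] partitioning,
      Map<String, String> properties) {
    // A real source would return its Table implementation here; elided for the sketch.
    throw new UnsupportedOperationException("not relevant to this sketch");
  }

  // With this returning true, a statement such as
  //   CREATE TABLE t (a INT, b INT GENERATED ALWAYS AS (a + 1)) USING mysource
  // would pass analysis, and the source could later read the expression text back
  // from the column metadata key "generationExpression".
  @Override
  public boolean supportsGeneratedColumnsOnCreation() {
    return true;
  }
}
```
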
Review Comment:
This really should be part of the greater catalog capabilities, since that's
where create table is usually going to be invoked. I'm very nervous about
saying that it is up to the Datasource to decide what is valid, because
different engines may decide the same SQL means different things, and this would
require that the Datasource somehow make sure non-Spark engines can access the
same table in the same way. We spent a lot of time making a public expression
class for the connectors, but it feels like that should probably be invoked here
as well?
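
To make the capability suggestion concrete, a rough sketch of what a catalog-level opt-in could look like; `TableCatalogCapability`, `CapabilityAwareCatalog`, and the constant name are illustrative names only, not API that exists in this PR.

```java
import java.util.EnumSet;
import java.util.Set;

// Sketch of a capability advertised by the catalog rather than by the data source.
public final class CatalogCapabilitySketch {

  // Hypothetical capability enum a TableCatalog could expose.
  enum TableCatalogCapability {
    SUPPORTS_CREATE_TABLE_WITH_GENERATED_COLUMNS
  }

  // CREATE/REPLACE TABLE analysis would consult the catalog's capabilities
  // instead of asking the TableProvider.
  interface CapabilityAwareCatalog {
    Set<TableCatalogCapability> capabilities();
  }

  public static void main(String[] args) {
    CapabilityAwareCatalog catalog = () -> EnumSet.of(
        TableCatalogCapability.SUPPORTS_CREATE_TABLE_WITH_GENERATED_COLUMNS);

    boolean allowed = catalog.capabilities()
        .contains(TableCatalogCapability.SUPPORTS_CREATE_TABLE_WITH_GENERATED_COLUMNS);
    System.out.println("generated columns allowed at create time: " + allowed);
  }
}
```

The "public expression class" presumably refers to the connector expression API in `org.apache.spark.sql.connector.expressions`; surfacing the generation expression through that API, rather than only as a raw SQL string in column metadata, would be the analogous move.
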