srielau commented on code in PR #38861:
URL: https://github.com/apache/spark/pull/38861#discussion_r1039709066


##########
core/src/main/resources/error/error-classes.json:
##########
@@ -876,6 +876,13 @@
     ],
     "sqlState" : "42000"
   },
+  "NOT_ENOUGH_DATA_COLUMNS" : {
+    "message" : [
+      "Cannot write to <tableName>, not enough data columns:",

Review Comment:
   Given that we have column defaults, at least in the case of INSERT it's not really about the target table, but the column list.
   I take it the same error is raised on UPDATE T SET (...) = (...)? And on UNION/EXCEPT/... and (...) IN (SELECT ...)?
   
   How about: \<operator> expects a matching number of columns, but the left side (target) has \<leftNumCols> while the right side (source) has \<rightNumCols>.
   
   Alternatively, we could talk about "assignments" using a different error class (?)
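   For illustration, a rough sketch of the statement shapes mentioned above (the table `t` and its columns are hypothetical, and not every form is necessarily valid Spark SQL today; this only shows where a left/right column-count mismatch could surface):
   
   ```sql
   CREATE TABLE t (a INT, b INT, c INT DEFAULT 0);
   
   -- INSERT with an explicit column list: with column defaults, the mismatch is
   -- against the listed columns, not the full target table.
   INSERT INTO t (a, b, c) SELECT 1, 2;              -- 3 target columns, 2 data columns
   
   -- Multi-column assignment in UPDATE: an analogous left/right arity mismatch.
   UPDATE t SET (a, b) = (SELECT 1);                 -- left side has 2 columns, right side has 1
   
   -- Set operations and row-valued IN subqueries raise similar mismatches.
   SELECT 1, 2 UNION SELECT 1;
   SELECT * FROM t WHERE (a, b) IN (SELECT a FROM t);
   ```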
    




