sujith71955 opened a new pull request #24075: [SPARK-26176][SQL] Invalid column names validation is been added when we create a table using the Hive serde "STORED AS"
URL: https://github.com/apache/spark/pull/24075

## What changes were proposed in this pull request?

Currently, users hit job abortions when creating a table using the Hive serde "STORED AS" with invalid column names. We had better prevent this by raising an **AnalysisException** with a guide to use aliases instead, like Parquet data source tables do, thus making the behavior consistent with the error message shown while creating a native Parquet/ORC table.

**BEFORE**
```
scala> sql("CREATE TABLE TAB1TEST STORED AS PARQUET AS SELECT COUNT(ID) FROM TAB1")
Caused by: java.lang.IllegalArgumentException: No enum constant parquet.schema.OriginalType.col1
  at java.lang.Enum.valueOf(Enum.java:238)
  at parquet.schema.OriginalType.valueOf(OriginalType.java:21)
  at parquet.schema.MessageTypeParser.addPrimitiveType(MessageTypeParser.java:160)
  at parquet.schema.MessageTypeParser.addType(MessageTypeParser.java:111)
  at parquet.schema.MessageTypeParser.addGroupTypeFields(MessageTypeParser.java:99)
  at parquet.schema.MessageTypeParser.parse(MessageTypeParser.java:92)
  at parquet.schema.MessageTypeParser.parseMessageType(MessageTypeParser.java:82)
```

**AFTER**
```
CREATE TABLE TAB1TEST STORED AS PARQUET AS SELECT COUNT(ID) FROM TAB1;
org.apache.spark.sql.AnalysisException: Attribute name "count(ID)" contains invalid character(s) among " ,;{}()\n\t=". Please use alias to rename it.;
  at org.apache.spark.sql.execution.datasources.parquet.ParquetSchemaConverter$.checkConversionRequirement(ParquetSchemaConverter.scala:58)
```

## How was this patch tested?

A new test case has been added.
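The validation described above mirrors the check Parquet's schema converter applies to attribute names. A minimal sketch in Python of such a check (this is an illustration, not Spark's actual implementation; the function name `check_column_name` is hypothetical, and the invalid character set is taken from the error message above):

```python
# Characters Parquet rejects in attribute names, per the error message:
# " ,;{}()\n\t="
INVALID_CHARS = ' ,;{}()\n\t='


def check_column_name(name: str) -> None:
    """Raise if `name` contains a character Parquet cannot accept.

    Illustrative only: Spark raises AnalysisException; we use ValueError here.
    """
    bad = sorted({c for c in name if c in INVALID_CHARS})
    if bad:
        raise ValueError(
            f'Attribute name "{name}" contains invalid character(s) '
            'among " ,;{}()\\n\\t=". Please use alias to rename it.')


# A generated name like count(ID) fails, while an aliased name passes:
# CREATE TABLE T STORED AS PARQUET AS SELECT COUNT(ID) AS cnt FROM TAB1
```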
