tedyu commented on a change in pull request #30984:
URL: https://github.com/apache/spark/pull/30984#discussion_r553712976



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
##########
@@ -737,6 +737,10 @@ abstract class PushableColumnBase {
         }
       case s: GetStructField if nestedPredicatePushdownEnabled =>
         helper(s.child).map(_ :+ s.childSchema(s.ordinal).name)
+      case GetJsonObject(col, field) =>
+        val colName = col.toString.split("#")(0)
+        val fieldName = field.toString.split("\\.").last
+        Some(Seq("GetJsonObject(" + col + "," + field + ")"))

Review comment:
       I tried creating a table with `GetJsonObject(phone,'$.code')` as the column name:
   ```
   scala> spark.sql("CREATE TABLE IF NOT EXISTS t1(`GetJsonObject(phone,'$.code')` int)")
   21/01/08 02:56:34 WARN ResolveSessionCatalog: A Hive serde table will be created as there is no table provider specified. You can set spark.sql.legacy.createHiveTableByDefault to false so that native data source table will be created instead.
   21/01/08 02:56:36 WARN HiveConf: HiveConf of name hive.stats.jdbc.timeout does not exist
   21/01/08 02:56:36 WARN HiveConf: HiveConf of name hive.stats.retries.wait does not exist
   21/01/08 02:56:40 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
   21/01/08 02:56:40 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore [email protected]
   org.apache.spark.sql.AnalysisException: Cannot create a table having a column whose name contains commas in Hive metastore. Table: `default`.`t1`; Column: GetJsonObject(phone,'$.code')
     at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$verifyDataSchema$4(HiveExternalCatalog.scala:175)
     at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$verifyDataSchema$4$adapted(HiveExternalCatalog.scala:171)
     at scala.collection.Iterator.foreach(Iterator.scala:941)
     at scala.collection.Iterator.foreach$(Iterator.scala:941)
     at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
     at scala.collection.IterableLike.foreach(IterableLike.scala:74)
     at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
     at org.apache.spark.sql.types.StructType.foreach(StructType.scala:102)
     at org.apache.spark.sql.hive.HiveExternalCatalog.verifyDataSchema(HiveExternalCatalog.scala:171)
     at org.apache.spark.sql.hive.HiveExternalCatalog.$anonfun$createTable$1(HiveExternalCatalog.scala:252)
     at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
     at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:102)
     at org.apache.spark.sql.hive.HiveExternalCatalog.createTable(HiveExternalCatalog.scala:245)
     at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.createTable(ExternalCatalogWithListener.scala:94)
     at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createTable(SessionCatalog.scala:351)
     at org.apache.spark.sql.execution.command.CreateTableCommand.run(tables.scala:167)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
     at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
   ```
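
   The comma that trips `verifyDataSchema` comes straight from the name this case fabricates: `"GetJsonObject(" + col + "," + field + ")"` stringifies the whole expression, comma included. Note also that `colName` and `fieldName` are computed but never used. Below is a minimal sketch (plain Scala with hypothetical example strings, not a patch against this file) of how those two pieces could build a comma-free name instead:
   ```
   object PushedNameSketch {
     def main(args: Array[String]): Unit = {
       // Hypothetical stand-ins for the toString output the new case splits on:
       // a resolved attribute renders as "name#exprId", and the path literal
       // renders as its value.
       val col = "phone#12"
       val field = "$.code"

       val colName = col.split("#")(0)          // "phone"
       val fieldName = field.split("\\.").last  // "code"

       println(s"GetJsonObject($col,$field)")   // what the PR pushes: contains a comma
       println(s"$colName.$fieldName")          // a comma-free alternative: phone.code
     }
   }
   ```
   Whatever naming scheme is picked, anything pushed through here has to survive `HiveExternalCatalog.verifyDataSchema`, which rejects commas in column names outright.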



