tedyu commented on a change in pull request #30984:
URL: https://github.com/apache/spark/pull/30984#discussion_r551661480



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/DataSourceStrategy.scala
##########
@@ -737,6 +737,10 @@ abstract class PushableColumnBase {
         }
       case s: GetStructField if nestedPredicatePushdownEnabled =>
         helper(s.child).map(_ :+ s.childSchema(s.ordinal).name)
+      case GetJsonObject(col, field) =>
+        val colName = col.toString.split("#")(0)
+        val fieldName = field.toString.split("\\.").last
+        Some(Seq("GetJsonObject(" + col + "," + field + ")"))

Review comment:
       I rebuilt Spark locally and was able to get the same output for the query you gave above.
   
   Please note that `get_json_object` (the SQL function) should be used here instead of `GetJsonObject()` (the underlying Catalyst expression class, which is not callable by that name from SQL).
   
   I tried the following:
   ```
   scala> spark.sql("CREATE TABLE IF NOT EXISTS t(id int , phone string)")
   ...
   scala> spark.sql("select get_json_object(phone,'$.code') from t")
   res10: org.apache.spark.sql.DataFrame = [get_json_object(phone, $.code): string]
   ```
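   For reference, one way to confirm what the function resolves to (output elided here; the `#18` expression id mentioned below is just illustrative):
   ```
   scala> spark.sql("select get_json_object(phone,'$.code') from t").explain(true)
   ```
   The analyzed plan should contain a projection like `get_json_object(phone#18, $.code)`, i.e. the `GetJsonObject` Catalyst expression that the new `case` in `helper` matches on.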
   If an invalid column is specified:
   ```
   scala> spark.sql("select get_json_object(phon,'$.code') from t")
   org.apache.spark.sql.AnalysisException: cannot resolve '`phon`' given input columns: [spark_catalog.default.t.id, spark_catalog.default.t.phone]; line 1 pos 23;
   'Project [unresolvedalias('get_json_object('phon, $.code), None)]
   +- SubqueryAlias spark_catalog.default.t
      +- HiveTableRelation [`default`.`t`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, Data Cols: [id#17, phone#18], Partition Cols: []]
   ```
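   Separately, here is a minimal standalone sketch of what the two `split` calls in the diff extract. The `AttributeReference`/`Literal` construction is my own approximation of the resolved children of `get_json_object(phone, '$.code')`, and the expression id that prints as e.g. `phone#18` is illustrative:
   ```scala
   import org.apache.spark.sql.catalyst.expressions.{AttributeReference, GetJsonObject, Literal}
   import org.apache.spark.sql.types.StringType

   // Approximate resolved children of get_json_object(phone, '$.code').
   val col = AttributeReference("phone", StringType)()  // toString prints as e.g. "phone#18"
   val field = Literal("$.code")

   GetJsonObject(col, field) match {
     case GetJsonObject(c, f) =>
       val colName = c.toString.split("#")(0)           // "phone" (drops the expr id suffix)
       val fieldName = f.toString.split("\\.").last     // "code"  (keeps the last path segment)
       println(s"colName=$colName fieldName=$fieldName")
   }
   ```
   Note that the quoted diff computes `colName` and `fieldName` the same way but does not use them in the returned `Some(Seq(...))`.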



