Github user MaxGekk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21472#discussion_r192309953
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/jsonExpressions.scala ---
    @@ -747,8 +748,13 @@ case class StructsToJson(
     
     object JsonExprUtils {
     
    -  def validateSchemaLiteral(exp: Expression): StructType = exp match {
    -    case Literal(s, StringType) => CatalystSqlParser.parseTableSchema(s.toString)
    +  def validateSchemaLiteral(exp: Expression): DataType = exp match {
    +    case Literal(s, StringType) =>
    +      try {
    +        DataType.fromJson(s.toString)
    --- End diff ---
    
    I believe we should support the JSON format because:
    - The SQL API should offer the same functionality as the Scala (and other language) DSLs; otherwise we push users toward the Scala DSL because SQL has fewer features.
    - It allows saving and restoring a schema in JSON format. A customer's use case is to keep data in JSON plus meta info, including the schema, in JSON as well. A JSON-formatted schema gives them more opportunities for programmatic processing.
    - For now, the JSON format gives us more flexibility and allows `MapType` (and `ArrayType`) as the root type for the result of `from_json` (see the sketch below).
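    
    As a rough illustration (not the exact code in this PR), here is a minimal, standalone sketch of the fallback idea: try `DataType.fromJson` first and, if that fails, fall back to the DDL parser that the removed line used. The object and method names (`SchemaStringSketch`, `parseSchema`) are made up for the example.
    
    ```scala
    import org.apache.spark.sql.catalyst.parser.CatalystSqlParser
    import org.apache.spark.sql.types._
    
    object SchemaStringSketch {
      // Parse a schema string: JSON representation first, DDL as a fallback.
      def parseSchema(schemaString: String): DataType = {
        try {
          DataType.fromJson(schemaString)
        } catch {
          case _: Exception => CatalystSqlParser.parseTableSchema(schemaString)
        }
      }
    
      def main(args: Array[String]): Unit = {
        // A MapType as the root type is expressible only via the JSON form.
        val mapSchema = MapType(StringType, IntegerType)
        assert(parseSchema(mapSchema.json) == mapSchema)
    
        // The DDL form still works through the fallback branch.
        val ddlSchema = parseSchema("a INT, b STRING")
        assert(ddlSchema == new StructType().add("a", IntegerType).add("b", StringType))
      }
    }
    ```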


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
