Github user MaxGekk commented on a diff in the pull request:
https://github.com/apache/spark/pull/21472#discussion_r192358871
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/jsonExpressions.scala ---
@@ -747,8 +748,13 @@ case class StructsToJson(
 object JsonExprUtils {
-  def validateSchemaLiteral(exp: Expression): StructType = exp match {
-    case Literal(s, StringType) => CatalystSqlParser.parseTableSchema(s.toString)
+  def validateSchemaLiteral(exp: Expression): DataType = exp match {
+    case Literal(s, StringType) =>
+      try {
+        DataType.fromJson(s.toString)
--- End diff ---
> Shall we add the support with type itself with CatalystSqlParser.parseDataType too?
I will do that, but it won't fully solve the customer's problem.
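Roughly what I have in mind for that (a sketch against the existing code in jsonExpressions.scala, not the final patch): try the JSON form first and fall back to `CatalystSqlParser.parseDataType`, so both `DataType.fromJson` output and plain type strings like `map<string,int>` would be accepted:

```scala
import scala.util.control.NonFatal

import org.apache.spark.sql.AnalysisException
import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
import org.apache.spark.sql.catalyst.parser.CatalystSqlParser
import org.apache.spark.sql.types.{DataType, StringType}

object JsonExprUtils {
  def validateSchemaLiteral(exp: Expression): DataType = exp match {
    case Literal(s, StringType) =>
      val text = s.toString
      try {
        DataType.fromJson(text)                  // JSON form, e.g. {"type":"map",...}
      } catch {
        case NonFatal(_) =>
          CatalystSqlParser.parseDataType(text)  // SQL form, e.g. map<string,int>
      }
    case e => throw new AnalysisException(s"Expected a string literal instead of $e")
  }
}
```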
> Also, are you able to use catalogString?
I just checked that:
```scala
import org.apache.spark.sql.types._

// catalogString yields "map<string,int>" here
val schema = MapType(StringType, IntegerType).catalogString
val ds = spark.sql(
  s"""
     |select from_json('{"a":1}', '$schema')
   """.stripMargin)
ds.show()
```
and got this:
```
extraneous input '<' expecting {'SELECT', 'FROM', ...}(line 1, pos 3)
== SQL ==
map<string,int>
---^^^
; line 2 pos 7
```
The same happens with `val schema = new StructType().add("a", IntegerType).catalogString`:
```
== SQL ==
struct<a:int>
------^^^
; line 2 pos 7
org.apache.spark.sql.AnalysisException
```
Am I doing something wrong?
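For what it's worth, the errors above seem to come from the parsing side rather than from `catalogString` itself: the current `validateSchemaLiteral` pushes the literal through `CatalystSqlParser.parseTableSchema`, which expects a column list like `a INT`, not a bare type, so `map<string,int>` fails right at the `<`. A quick illustration (my own check, not part of the patch):

```scala
import org.apache.spark.sql.catalyst.parser.CatalystSqlParser

// parseTableSchema wants "name type" pairs, so a bare type string is rejected:
CatalystSqlParser.parseTableSchema("a INT")           // ok: struct<a:int>
CatalystSqlParser.parseDataType("map<string,int>")    // ok: map<string,int>
CatalystSqlParser.parseTableSchema("map<string,int>")
// org.apache.spark.sql.catalyst.parser.ParseException:
// extraneous input '<' expecting ... (line 1, pos 3)
```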