[
https://issues.apache.org/jira/browse/SPARK-8093?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14575592#comment-14575592
]
Yin Huai commented on SPARK-8093:
---------------------------------
For now, to keep the same behavior as 1.3, the workaround is to set
"spark.sql.json.useJacksonStreamingAPI" to false (which uses the old code path).
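A minimal sketch of that workaround in a 1.4 spark-shell session (assuming the shell's pre-defined sqlContext, with the paths from the report):
{code}
// Revert to the 1.3 JSON code path before reading the file.
sqlContext.setConf("spark.sql.json.useJacksonStreamingAPI", "false")

val jsonDF = sqlContext.read.json("/tmp/t1.json")
jsonDF.write.parquet("/tmp/t1.parquet")
{code}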
> Spark 1.4 branch's new JSON schema inference has changed the behavior of
> handling inner empty JSON objects.
> -----------------------------------------------------------------------------------------------------------
>
> Key: SPARK-8093
> URL: https://issues.apache.org/jira/browse/SPARK-8093
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.4.0
> Reporter: Harish Butani
> Priority: Critical
> Attachments: t1.json
>
>
> This is similar to SPARK-3365. A sample JSON file is attached. Code to reproduce:
> {code}
> val jsonDF = sqlContext.read.json("/tmp/t1.json")
> jsonDF.write.parquet("/tmp/t1.parquet")
> {code}
> The 'integration' object is empty in the JSON, so the inferred schema contains an empty group, which Parquet rejects when writing.
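> For illustration, a record of the shape described (hypothetical field names; the attached t1.json is the actual reproducer) would look like:
> {code}
> {"id": 1, "integration": {}}
> {code}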
> StackTrace:
> {code}
> ....
> Caused by: java.io.IOException: Could not read footer: java.lang.IllegalStateException: Cannot build an empty group
> 	at parquet.hadoop.ParquetFileReader.readAllFootersInParallel(ParquetFileReader.java:238)
> 	at org.apache.spark.sql.parquet.ParquetRelation2$MetadataCache.refresh(newParquet.scala:369)
> 	at org.apache.spark.sql.parquet.ParquetRelation2.org$apache$spark$sql$parquet$ParquetRelation2$$metadataCache$lzycompute(newParquet.scala:154)
> 	at org.apache.spark.sql.parquet.ParquetRelation2.org$apache$spark$sql$parquet$ParquetRelation2$$metadataCache(newParquet.scala:152)
> 	at org.apache.spark.sql.parquet.ParquetRelation2.refresh(newParquet.scala:197)
> 	at org.apache.spark.sql.sources.InsertIntoHadoopFsRelation.insert(commands.scala:134)
> 	... 69 more
> Caused by: java.lang.IllegalStateException: Cannot build an empty group
> Caused by: java.lang.IllegalStateException: Cannot build an empty group
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]