Repository: spark
Updated Branches:
  refs/heads/branch-1.3 b83a93e08 -> 25a109e42


[SPARK-6037][SQL] Avoiding duplicate Parquet schema merging

`FilteringParquetRowInputFormat` manually merges Parquet schemas before 
computing splits. This work is redundant because the schemas have already 
been merged in `ParquetRelation2`, so there is no need to re-merge them in 
the `InputFormat`.
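The merge that the removed code performed folds the per-file schemas into one with `reduce(_ merge _)`. A minimal self-contained sketch of that pattern (using a hypothetical `Map`-based schema stand-in rather than Spark's actual `StructType`, so it runs without Spark on the classpath):

```scala
// Hypothetical sketch: each schema is a map of field name -> type name.
// Spark's real code merges StructType instances parsed from JSON footers;
// the fold shape (`schemas.reduce(_ merge _)`) is the same.
object SchemaMergeSketch {
  type Schema = Map[String, String]

  // Merge two schemas: union the fields; a field present in both
  // must agree on its type, otherwise merging fails.
  def merge(a: Schema, b: Schema): Schema =
    (a.keySet ++ b.keySet).map { name =>
      (a.get(name), b.get(name)) match {
        case (Some(t1), Some(t2)) if t1 == t2 => name -> t1
        case (Some(t1), Some(t2)) =>
          sys.error(s"Conflicting types for field '$name': $t1 vs $t2")
        case (Some(t), _) => name -> t
        case (_, Some(t)) => name -> t
        case _            => sys.error("unreachable: name came from a key set")
      }
    }.toMap

  def main(args: Array[String]): Unit = {
    val s1 = Map("id" -> "int", "name" -> "string")
    val s2 = Map("id" -> "int", "age" -> "int")
    val merged = Seq(s1, s2).reduce(merge)
    println(merged.toSeq.sortBy(_._1))
  }
}
```

Doing this once per relation, rather than again for every split computation, is the whole point of the patch: the merged result is identical, so the second pass is pure overhead.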

Author: Liang-Chi Hsieh <[email protected]>

Closes #4786 from viirya/dup_parquet_schemas_merge and squashes the following 
commits:

ef78a5a [Liang-Chi Hsieh] Avoiding duplicate Parquet schema merging.

(cherry picked from commit 4ad5153f5449319a7e82c9013ccff4494ab58ef1)
Signed-off-by: Cheng Lian <[email protected]>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/25a109e4
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/25a109e4
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/25a109e4

Branch: refs/heads/branch-1.3
Commit: 25a109e4228c125acf2ac25625e798f9d34947cf
Parents: b83a93e
Author: Liang-Chi Hsieh <[email protected]>
Authored: Fri Feb 27 11:06:47 2015 +0800
Committer: Cheng Lian <[email protected]>
Committed: Fri Feb 27 11:07:08 2015 +0800

----------------------------------------------------------------------
 .../sql/parquet/ParquetTableOperations.scala    | 23 ++++++--------------
 1 file changed, 7 insertions(+), 16 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/25a109e4/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableOperations.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableOperations.scala b/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableOperations.scala
index 9061d3f..4e4f647 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableOperations.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetTableOperations.scala
@@ -434,22 +434,13 @@ private[parquet] class FilteringParquetRowInputFormat
      return splits
     }
 
-    Option(globalMetaData.getKeyValueMetaData.get(RowReadSupport.SPARK_METADATA_KEY)).foreach {
-      schemas =>
-        val mergedSchema = schemas
-          .map(DataType.fromJson(_).asInstanceOf[StructType])
-          .reduce(_ merge _)
-          .json
-
-        val mergedMetadata = globalMetaData
-          .getKeyValueMetaData
-          .updated(RowReadSupport.SPARK_METADATA_KEY, setAsJavaSet(Set(mergedSchema)))
-
-        globalMetaData = new GlobalMetaData(
-          globalMetaData.getSchema,
-          mergedMetadata,
-          globalMetaData.getCreatedBy)
-    }
+    val metadata = configuration.get(RowWriteSupport.SPARK_ROW_SCHEMA)
+    val mergedMetadata = globalMetaData
+      .getKeyValueMetaData
+      .updated(RowReadSupport.SPARK_METADATA_KEY, setAsJavaSet(Set(metadata)))
+
+    globalMetaData = new GlobalMetaData(globalMetaData.getSchema,
+      mergedMetadata, globalMetaData.getCreatedBy)
 
     val readContext = getReadSupport(configuration).init(
       new InitContext(configuration,

