[ https://issues.apache.org/jira/browse/SPARK-32225?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17454622#comment-17454622 ]

Stijn De Haes commented on SPARK-32225:
---------------------------------------

Could this be the reason that, when you read a Parquet data source from S3 and 
write it back without doing anything to the data, Spark reports that it read 
twice the amount of data it wrote?

!image-2021-12-07-13-37-12-197.png!

> Parquet footer information is read twice
> ----------------------------------------
>
>                 Key: SPARK-32225
>                 URL: https://issues.apache.org/jira/browse/SPARK-32225
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Rajesh Balamohan
>            Priority: Minor
>         Attachments: image-2021-12-07-13-37-12-197.png, 
> spark_parquet_footer_reads.png
>
>
> When running queries, Spark reads Parquet footer information twice. In cloud 
> environments this can be expensive (depending on the job and the number of 
> splits). It would be nice to reuse the footer information already read via 
> "ParquetInputFormat::buildReaderWithPartitionValues".
>  
> !spark_parquet_footer_reads.png|width=640,height=644!
> Lines of interest:
> [https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala#L271]
> [https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetFileFormat.scala#L326]
>  
> [https://github.com/apache/spark/blob/master/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java#L105]
> [https://github.com/apache/spark/blob/master/sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/SpecificParquetRecordReaderBase.java#L111]
>  
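The fix suggested in the description amounts to parsing the footer once and reusing it, rather than having both `ParquetFileFormat.buildReaderWithPartitionValues` and `SpecificParquetRecordReaderBase` read it from storage. A minimal, self-contained Scala sketch of that idea, using a hypothetical per-task footer cache (the names and the `Footer` stand-in are illustrative, not Spark's actual types):

```scala
import scala.collection.mutable

// Hypothetical sketch: cache the parsed footer per file path so a second
// consumer (e.g. the record reader) reuses it instead of re-reading storage.
object FooterCache {
  // Stand-in for parquet-mr's ParquetMetadata; illustrative only.
  final case class Footer(schema: String)

  private val cache = mutable.Map.empty[String, Footer]
  var storageReads = 0 // counts how often we actually touch storage

  // Stand-in for the expensive footer read (a remote ranged GET in cloud envs).
  private def readFooterFromStorage(path: String): Footer = {
    storageReads += 1
    Footer(s"schema-of-$path")
  }

  // Returns the cached footer, reading from storage only on first access.
  def footerFor(path: String): Footer =
    cache.getOrElseUpdate(path, readFooterFromStorage(path))
}
```

With this pattern, two consumers asking for the same file's footer trigger only one storage read, which is the behavior the issue asks for.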



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
