[ https://issues.apache.org/jira/browse/SPARK-37450?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17448412#comment-17448412 ]

Jiri Humpolicek commented on SPARK-37450:
-----------------------------------------

Thanks for the answer. Maybe I don't know the internals of Parquet, but when I 
need to know the number of rows in a Parquet table:
{code:scala}
read.select(count(lit(1))).explain(true)
// ReadSchema: struct<>
{code}
the read schema is empty (I suppose no columns are accessed; Parquet can answer 
the row count from its footer metadata alone).

So, is there any way to get the size of an array in Parquet without reading the 
whole sub-structure? If not, you have at least shown an optimization: read only 
the "smallest" attribute of the sub-structure, as sketched below.

> Spark SQL reads unnecessary nested fields (another type of pruning case)
> ------------------------------------------------------------------------
>
>                 Key: SPARK-37450
>                 URL: https://issues.apache.org/jira/browse/SPARK-37450
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: Jiri Humpolicek
>            Priority: Major
>
> Based on [SPARK-34638|https://issues.apache.org/jira/browse/SPARK-34638], I 
> may have found another nested-field pruning case. In this case, the `count` 
> function triggers a full read of the nested structure.
> Example:
> 1) Loading data
> {code:scala}
> val jsonStr = """{
>  "items": [
>    {"itemId": 1, "itemData": "a"},
>    {"itemId": 2, "itemData": "b"}
>  ]
> }"""
> val df = spark.read.json(Seq(jsonStr).toDS)
> df.write.format("parquet").mode("overwrite").saveAsTable("persisted")
> {code}
> 2) Reading the table, with explain
> {code:scala}
> val read = spark.table("persisted")
> spark.conf.set("spark.sql.optimizer.nestedSchemaPruning.enabled", true)
> read.select(explode($"items").as('item)).select(count(lit(true))).explain(true)
> // ReadSchema: struct<items:array<struct<itemData:string,itemId:bigint>>>
> {code}
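> For comparison, a minimal sketch of the pruned case: selecting a concrete 
> nested field narrows the read schema as expected, which is what the `count` 
> query above should also achieve.
> {code:scala}
> read.select(explode($"items.itemId")).explain(true)
> // ReadSchema: struct<items:array<struct<itemId:bigint>>>
> {code}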


