[ https://issues.apache.org/jira/browse/SPARK-29542?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

feiwang updated SPARK-29542:
----------------------------
    Description: 
Hi, the description of `spark.sql.files.maxPartitionBytes` is shown below.

{code:java}
The maximum number of bytes to pack into a single partition when reading files.
{code}

This suggests that, for Spark SQL, each partition will process at most that 
many bytes.

But as shown in the attachment, it does not.

I checked the code: the setting is only effective for data source tables, so 
its description is confusing.
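
For illustration, here is a minimal sketch of where the setting does and does not apply (the paths and table names are hypothetical, assuming Spark 2.4.x with Hive support enabled):

{code:scala}
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("maxPartitionBytes-demo")
  // Cap each input partition at 32 MB when reading file-based
  // (data source) tables.
  .config("spark.sql.files.maxPartitionBytes", 32 * 1024 * 1024)
  .enableHiveSupport()
  .getOrCreate()

// Data source scan (e.g. Parquet): partition sizes respect the limit.
val dsDF = spark.read.parquet("/tmp/some_parquet_dir")  // hypothetical path
println(dsDF.rdd.getNumPartitions)

// Hive SerDe table scan: splits are computed by the Hadoop input format
// (e.g. mapreduce.input.fileinputformat.split.maxsize), so this config
// has no effect here, per the finding above.
val hiveDF = spark.table("some_hive_serde_table")  // hypothetical table
println(hiveDF.rdd.getNumPartitions)
{code}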

  was:
Hi, the description of `spark.sql.files.maxPartitionBytes` is shown below.

{code:java}
The maximum number of bytes to pack into a single partition when reading files.
{code}

This suggests that, for Spark SQL, each partition will process at most that 
many bytes.

But as shown in the attachment, it cannot.

I checked the code: the setting is only effective for data source tables, so 
its description is confusing.


> [DOC] The description of `spark.sql.files.maxPartitionBytes` is confusing.
> -------------------------------------------------------------------------
>
>                 Key: SPARK-29542
>                 URL: https://issues.apache.org/jira/browse/SPARK-29542
>             Project: Spark
>          Issue Type: Documentation
>          Components: Documentation
>    Affects Versions: 2.4.4
>            Reporter: feiwang
>            Priority: Minor
>         Attachments: screenshot-1.png
>
>
> Hi, the description of `spark.sql.files.maxPartitionBytes` is shown below.
> {code:java}
> The maximum number of bytes to pack into a single partition when reading 
> files.
> {code}
> This suggests that, for Spark SQL, each partition will process at most 
> that many bytes.
> But as shown in the attachment, it does not.
> I checked the code: the setting is only effective for data source tables, 
> so its description is confusing.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
