GitHub user BryanCutler opened a pull request:

    https://github.com/apache/spark/pull/20423

    [SPARK-22221][SQL][FOLLOWUP] Externalize spark.sql.execution.arrow.maxRecordsPerBatch

    ## What changes were proposed in this pull request?
    
    This is a follow-up to #19575, which added a documentation section on setting the maximum number of records per Arrow batch. This change externalizes the conf that was referenced in those docs.
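    
    As an illustration (not part of this patch), the externalized conf can be set in `spark-defaults.conf`; the property name is the one this PR documents, while the value 5000 is an arbitrary example:
    
        # Cap each Arrow record batch at 5000 rows when converting
        # between Spark DataFrames and pandas (0 or negative = no limit).
        spark.sql.execution.arrow.maxRecordsPerBatch  5000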
    
    ## How was this patch tested?
    NA

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/BryanCutler/spark arrow-user-doc-externalize-maxRecordsPerBatch-SPARK-22221

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/20423.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #20423
    
----
commit 8408b455f871aec275cbf03779f00b3dafa8644b
Author: Bryan Cutler <cutlerb@...>
Date:   2018-01-29T18:31:36Z

    externalize Arrow maxRecordsPerBatch

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
