Github user tejasapatil commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16909#discussion_r105094904
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
    @@ -674,6 +675,24 @@ object SQLConf {
           .stringConf
           .createWithDefault(TimeZone.getDefault().getID())
     
    +  val WINDOW_EXEC_BUFFER_SPILL_THRESHOLD =
    +    buildConf("spark.sql.windowExec.buffer.spill.threshold")
    +      .doc("Threshold for number of rows buffered in window operator")
    +      .intConf
    +      .createWithDefault(4096)
    +
    +  val SORT_MERGE_JOIN_EXEC_BUFFER_SPILL_THRESHOLD =
    +    buildConf("spark.sql.sortMergeJoinExec.buffer.spill.threshold")
    +      .doc("Threshold for number of rows buffered in sort merge join 
operator")
    +      .intConf
    +      .createWithDefault(Int.MaxValue)
    +
    +  val CARTESIAN_PRODUCT_EXEC_BUFFER_SPILL_THRESHOLD =
    +    buildConf("spark.sql.cartesianProductExec.buffer.spill.threshold")
    +      .doc("Threshold for number of rows buffered in cartesian product 
operator")
    +      .intConf
    +      .createWithDefault(UnsafeExternalSorter.DEFAULT_NUM_ELEMENTS_FOR_SPILL_THRESHOLD.toInt)
    --- End diff ---
    
    marked as internal
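    
    For reference, `ConfigBuilder` provides an `internal()` method that hides a
    conf from user-facing documentation; a minimal sketch of what one of these
    entries might look like once marked internal (assuming `.internal()` is the
    mechanism applied in this change):
    
        val WINDOW_EXEC_BUFFER_SPILL_THRESHOLD =
          buildConf("spark.sql.windowExec.buffer.spill.threshold")
            .internal()
            .doc("Threshold for number of rows buffered in window operator")
            .intConf
            .createWithDefault(4096)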

