Github user dongjoon-hyun commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23132#discussion_r237247452
  
    --- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
    @@ -1610,6 +1610,13 @@ object SQLConf {
           """ "... N more fields" placeholder.""")
         .intConf
         .createWithDefault(25)
    +
    +  val LEGACY_DECIMAL_PARSING_ENABLED = buildConf("spark.sql.legacy.decimalParsing.enabled")
    +    .doc("If it is set to false, it enables parsing decimals in locale specific formats. " +
    +      "To switch back to previous behaviour when parsing was performed by java.math.BigDecimal " +
    +      "and all commas were removed from the input, set the flag to true.")
    +    .booleanConf
    +    .createWithDefault(false)
    --- End diff --
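    To illustrate what the flag in the diff above toggles, here is a minimal sketch (not Spark's actual parsing code) contrasting the legacy approach — strip commas, then hand the string to `java.math.BigDecimal` — with locale-aware parsing via `java.text.DecimalFormat`. The class and method names are hypothetical, for illustration only.

    ```java
    import java.math.BigDecimal;
    import java.text.DecimalFormat;
    import java.text.NumberFormat;
    import java.text.ParseException;
    import java.util.Locale;

    public class DecimalParsingDemo {

        // Legacy-style parsing: remove all commas, then parse with BigDecimal.
        static BigDecimal legacyParse(String s) {
            return new BigDecimal(s.replace(",", ""));
        }

        // Locale-aware parsing: let the locale decide which characters are
        // grouping separators and which is the decimal separator.
        static BigDecimal localeParse(String s, Locale locale) throws ParseException {
            DecimalFormat df = (DecimalFormat) NumberFormat.getInstance(locale);
            df.setParseBigDecimal(true);
            return (BigDecimal) df.parse(s);
        }

        public static void main(String[] args) throws ParseException {
            // en-US input: ',' groups digits, '.' is the decimal point.
            System.out.println(legacyParse("1,000.5"));                 // 1000.5
            System.out.println(localeParse("1,000.5", Locale.US));      // 1000.5

            // de-DE input: '.' groups digits, ',' is the decimal separator.
            // Locale-aware parsing reads it correctly; comma-stripping turns
            // "1.000,5" into "1.0005" and silently yields the wrong value.
            System.out.println(localeParse("1.000,5", Locale.GERMANY)); // 1000.5
            System.out.println(legacyParse("1.000,5"));                 // 1.0005
        }
    }
    ```

    The last line is the motivating case: for locales where the comma is the decimal separator, the legacy behaviour does not fail loudly, it just produces a different number.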
    
    I want to distinguish adding a new implementation from enforcing it as the default. Your new implementation will already be in the 3.0 release, won't it? Technically, all users can already try it.
    
    BTW, we don't have a regular release plan for major releases. So if you are thinking of 4.0, I don't think that's a feasible option for this PR. Our next feature (minor) release is 3.1. Enabling this by default in 3.1 sounds natural to me. We already enabled many things in 2.4.0.

