[ 
https://issues.apache.org/jira/browse/SPARK-40032?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

jiaan.geng updated SPARK-40032:
-------------------------------
    Description: 
Spark SQL today supports the Decimal data type. The Decimal implementation can 
hold either a BigDecimal or a Long. Decimal provides operators such as +, -, 
*, and /.
Taking + as an example, the implementation is shown below.

{code:java}
  def + (that: Decimal): Decimal = {
    if (decimalVal.eq(null) && that.decimalVal.eq(null) && scale == that.scale) {
      Decimal(longVal + that.longVal, Math.max(precision, that.precision) + 1, scale)
    } else {
      Decimal(toBigDecimal.bigDecimal.add(that.toBigDecimal.bigDecimal))
    }
  }
{code}

We can see that both branches perform an addition and call Decimal.apply. The 
add operator of BigDecimal constructs a new BigDecimal instance, and 
Decimal.apply then calls new to construct a new Decimal instance wrapping that 
BigDecimal. So each resulting Decimal instance holds a newly created BigDecimal 
instance.
If a large table has a Decimal field called 'colA, executing SUM('colA) will 
therefore create a large number of Decimal instances and BigDecimal instances.
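
The following is an illustrative sketch only (not Spark's actual SUM implementation; naiveSum and its parameters are hypothetical names) of the per-row allocation pattern described above:

{code:java}
import org.apache.spark.sql.catalyst.InternalRow
import org.apache.spark.sql.types.Decimal

// Illustrative sketch: summing a Decimal column row by row. Every iteration
// that hits the BigDecimal branch of Decimal.+ allocates a new
// java.math.BigDecimal and a new Decimal wrapping it.
def naiveSum(rows: Iterator[InternalRow], ordinal: Int, precision: Int, scale: Int): Decimal = {
  var acc = Decimal(0L, precision, scale)
  while (rows.hasNext) {
    val row = rows.next()
    if (!row.isNullAt(ordinal)) {
      acc = acc + row.getDecimal(ordinal, precision, scale) // new objects per row
    }
  }
  acc
}
{code}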

Decimal128 is a high-performance decimal representation that is about 8x more 
efficient than Java BigDecimal for typical operations.
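
To illustrate why a 128-bit representation can avoid the per-operation allocation described above, here is a rough, hypothetical sketch (the class and method names below are not part of any Spark or proposed API) of a decimal held in two Long fields whose addition mutates the accumulator in place, assuming both operands already share the same scale:

{code:java}
// Hypothetical sketch only: a 128-bit decimal stored as a two's-complement
// (high, low) pair of longs, interpreted as unscaledValue * 10^(-scale).
final class Decimal128Sketch(var high: Long, var low: Long, val scale: Int) {

  // In-place addition: no BigDecimal and no new wrapper object is allocated.
  // Rescaling of mismatched scales and overflow of the high word are omitted
  // to keep the sketch short.
  def plusInPlace(that: Decimal128Sketch): Unit = {
    require(scale == that.scale, "rescaling is omitted in this sketch")
    val newLow = low + that.low
    // Carry out of the low 64 bits, detected with an unsigned comparison.
    val carry = if (java.lang.Long.compareUnsigned(newLow, low) < 0) 1L else 0L
    high = high + that.high + carry
    low = newLow
  }
}
{code}

With a layout like this, an aggregate such as SUM could reuse a single mutable accumulator per partition instead of allocating two objects per row.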

  was:Spark SQL today supports the TIMESTAMP data type. 


> Support Decimal128 type
> -----------------------
>
>                 Key: SPARK-40032
>                 URL: https://issues.apache.org/jira/browse/SPARK-40032
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: jiaan.geng
>            Priority: Major
>
> Spark SQL today supports the Decimal data type. The Decimal implementation 
> can hold either a BigDecimal or a Long. Decimal provides operators such as 
> +, -, *, and /.
> Taking + as an example, the implementation is shown below.
> {code:java}
>   def + (that: Decimal): Decimal = {
>     if (decimalVal.eq(null) && that.decimalVal.eq(null) && scale == that.scale) {
>       Decimal(longVal + that.longVal, Math.max(precision, that.precision) + 1, scale)
>     } else {
>       Decimal(toBigDecimal.bigDecimal.add(that.toBigDecimal.bigDecimal))
>     }
>   }
> {code}
> We can see that both branches perform an addition and call Decimal.apply. 
> The add operator of BigDecimal constructs a new BigDecimal instance, and 
> Decimal.apply then calls new to construct a new Decimal instance wrapping 
> that BigDecimal. So each resulting Decimal instance holds a newly created 
> BigDecimal instance.
> If a large table has a Decimal field called 'colA, executing SUM('colA) will 
> therefore create a large number of Decimal instances and BigDecimal 
> instances.
> Decimal128 is a high-performance decimal representation that is about 8x 
> more efficient than Java BigDecimal for typical operations.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
