On Tue, 12 Mar 2024 06:18:27 GMT, Shaojin Wen <[email protected]> wrote:
>> The current BigDecimal(String) constructor calls String#toCharArray, which
>> has a memory allocation.
>>
>>
>> public BigDecimal(String val) {
>>     this(val.toCharArray(), 0, val.length()); // allocate char[]
>> }
>>
>>
>> When the length is greater than 18, create a char[]
>>
>>
>> boolean isCompact = (len <= MAX_COMPACT_DIGITS); // 18
>> if (isCompact) {
>>     // ...
>> } else {
>>     char[] coeff = new char[len]; // allocate char[]
>>     // ...
>> }
>>
>>
>> This PR eliminates the two memory allocations mentioned above, resulting in
>> an approximately 60% increase in performance.
>
> Shaojin Wen has updated the pull request incrementally with one additional
> commit since the last revision:
>
> easier to compare
Sorry, when I got pinged in here the earlier comments didn't render and I
missed the conversation.
So first off, I think it's probably acceptable to regress the `char[]`-based
constructors and gently push people towards other options. Few applications
have `char[]` as the source, and in the example mentioned there's a transform
from `byte[]` to `char[]` which would probably end up being faster now as `new
BigDecimal(new String(bytes, start, end, ISO_8859_1))`.
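For illustration, a minimal sketch of that `byte[]` route (the class and helper name are hypothetical; note that the `String(byte[], int, int, Charset)` constructor takes an offset and a *length*, not an end index):

```java
import java.math.BigDecimal;
import java.nio.charset.StandardCharsets;

public class Example {
    // Hypothetical helper: decode a Latin-1 byte range and parse it.
    // With JDK 9+ compact strings, the ISO_8859_1 decode is essentially
    // a byte[] copy rather than an inflation to char[].
    static BigDecimal parse(byte[] bytes, int start, int end) {
        return new BigDecimal(
                new String(bytes, start, end - start, StandardCharsets.ISO_8859_1));
    }

    public static void main(String[] args) {
        byte[] buf = "xx123.456yy".getBytes(StandardCharsets.ISO_8859_1);
        System.out.println(parse(buf, 2, 9)); // prints 123.456
    }
}
```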
That said, perhaps there's room for improvement. Looking at `CharBuffer::charAt`,
it performs surprisingly complicated logic and bounds checks. And the `CharBuffer`
wrapper itself is a bit bloated due to quite a few state variables (that we don't
need for this use case). I did an experiment with a specialized implementation
here: https://github.com/wenshao/jdk/pull/7 which yields good results. Not sure
this is a great idea, but it's something to consider (note that it's not general
purpose, since it exploits a deliberately missing bounds check in `charAt`).
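As a rough illustration of the kind of slim wrapper meant here (a hypothetical sketch, not the code in the linked experiment, and it keeps ordinary bounds checks): a `CharSequence` over a `char[]` range needs only three fields, versus the mark/position/limit/capacity state a `CharBuffer` carries.

```java
// Hypothetical minimal CharSequence over a char[] range -- a sketch of
// the "specialized wrapper" idea, not the linked experiment.
final class CharArraySequence implements CharSequence {
    private final char[] chars;
    private final int offset;
    private final int length;

    CharArraySequence(char[] chars, int offset, int length) {
        this.chars = chars;
        this.offset = offset;
        this.length = length;
    }

    @Override public int length() { return length; }

    // Unlike CharBuffer.charAt, there is no position/limit arithmetic;
    // the array access itself supplies the bounds check.
    @Override public char charAt(int index) { return chars[offset + index]; }

    @Override public CharSequence subSequence(int start, int end) {
        return new CharArraySequence(chars, offset + start, end - start);
    }

    @Override public String toString() { return new String(chars, offset, length); }

    public static void main(String[] args) {
        CharSequence cs = new CharArraySequence("hello".toCharArray(), 1, 3);
        System.out.println(cs); // prints ell
    }
}
```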
-------------
PR Comment: https://git.openjdk.org/jdk/pull/18177#issuecomment-1991187517