On 04/11/2012 04:18 PM, Benedict Elliott Smith wrote:
Hi,

Looking at the source code, it doesn't appear as though there is any reason
to require the input be a String - only length() and charAt() are called,
which are both declared by CharSequence. Is there a reason I can't fathom,
or was it an oversight? It would be nice to widen the type for this (and
potentially other equivalent methods).

Integer.parseInt (1.0 or 1.1) pre-dates CharSequence (1.4);
that's why it uses a String and not a CharSequence.

If you don't want to break all already-compiled programs,
you can't just replace String with CharSequence, because the exact signature of the method
(with the parameter types) is encoded in the bytecode.
Joe Darcy wrote a cool blog post on that [1].
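
To make the descriptor point concrete, here is a minimal sketch (a hypothetical Parser/Caller pair, not JDK code): the parameter type is baked into the call site that javac emits, so changing it breaks linkage for classes that were compiled against the old signature.

// Hypothetical classes for illustration; compile both, then change
// Parser.parse to take a CharSequence, recompile only Parser, and rerun Caller.
class Parser {
    static int parse(String s) {
        return Integer.parseInt(s);
    }
}

class Caller {
    public static void main(String[] args) {
        // The compiled call site below records the full method descriptor:
        //   Parser.parse:(Ljava/lang/String;)I
        // If Parser.parse is changed to take a CharSequence, the descriptor
        // becomes (Ljava/lang/CharSequence;)I, and this already-compiled
        // class fails at link time with NoSuchMethodError.
        System.out.println(Parser.parse("42"));
    }
}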

The best option here is to add new methods that take a CharSequence, move the code that uses a String into them, and change the method that takes a String to delegate to the one that takes
a CharSequence.
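
A minimal sketch of that overload-and-delegate pattern, using a hypothetical NumberParser class rather than java.lang.Integer itself (and with overflow handling omitted):

// The parsing logic moves to the CharSequence overload and only needs
// length() and charAt(); the existing String method keeps its exact
// descriptor (Ljava/lang/String;)I and just delegates.
public final class NumberParser {

    public static int parseInt(CharSequence s) {
        int len = s.length();
        if (len == 0) throw new NumberFormatException("empty input");
        int i = 0, result = 0;
        boolean negative = s.charAt(0) == '-';
        if (negative || s.charAt(0) == '+') i++;
        if (i == len) throw new NumberFormatException(s.toString());
        for (; i < len; i++) {
            int digit = Character.digit(s.charAt(i), 10);
            if (digit < 0) throw new NumberFormatException(s.toString());
            result = result * 10 + digit;   // overflow checks omitted for brevity
        }
        return negative ? -result : result;
    }

    public static int parseInt(String s) {
        // Cast is needed so overload resolution picks the CharSequence
        // method instead of recursing into this one.
        return parseInt((CharSequence) s);
    }
}

With that in place, callers holding a StringBuilder or CharBuffer can parse directly, while existing binaries compiled against the String signature keep linking unchanged.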

cheers,
Rémi
[1] https://blogs.oracle.com/darcy/entry/kinds_of_compatibility
