The difference is that the CharSequence... overload searches for the first
occurrence of any of the provided CharSequences as whole substrings. The
single-CharSequence overload searches for the first occurrence of any
individual char in the provided sequence.
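For example (a quick sketch against commons-lang3; "zzabyycdxx" is just an
illustrative input in the style of the test data, and IndexOfAnyDemo is a
made-up class name):

    import org.apache.commons.lang3.StringUtils;

    public class IndexOfAnyDemo {
        public static void main(String[] args) {
            // Single-sequence overload: "za" is treated as a set of chars,
            // so 'z' matches at index 0.
            System.out.println(StringUtils.indexOfAny("zzabyycdxx", "za"));  // 0

            // Varargs overload: each argument is matched as a whole
            // substring, and "za" first occurs starting at index 1.
            System.out.println(StringUtils.indexOfAny("zzabyycdxx",
                    new CharSequence[] { "za" }));                           // 1
        }
    }

Note that a bare String second argument resolves to the single-sequence
overload; you have to pass an array (or more than one argument) to hit the
varargs version.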

Regarding the test assertion:
it's the index of *any* of the provided characters, not the entire string.
"z" appears at index 0 of the searched string.
I'm guessing your assumption of "2" being the expected output is due to
one of two expectations:
a. Searching for the entire provided String. This would be the behavior if
you'd used the traditional *indexOf* method (see the snippet below).
b. Assuming that the second character of the string is index 2. Java
indexes from 0, so the second character would be returned as 1.
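Concretely, covering both points (again assuming a "zzabyycdxx"-style
input; adjust for the actual test data):

    String s = "zzabyycdxx";
    s.indexOf("za");                 // (a) whole-substring search -> 1
    s.charAt(1);                     // (b) 'z': zero-based, second char is index 1
    StringUtils.indexOfAny(s, "za"); // char search -> 0, 'z' matches immediately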

Looking at the code has me wondering whether adding the search chars to a
Set and then testing each character of the input against it would be a
better approach.
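Something along these lines (purely a sketch: indexOfAnyViaSet is a
hypothetical name, and it ignores the supplementary-character handling the
real implementation has):

    import java.util.HashSet;
    import java.util.Set;

    static int indexOfAnyViaSet(final CharSequence cs, final char... searchChars) {
        if (cs == null || searchChars == null || searchChars.length == 0) {
            return -1;
        }
        // Build the set once, so each position needs only an O(1)
        // membership check instead of a scan over searchChars.
        final Set<Character> set = new HashSet<>();
        for (final char c : searchChars) {
            set.add(c);
        }
        for (int i = 0; i < cs.length(); i++) {
            if (set.contains(cs.charAt(i))) {
                return i;
            }
        }
        return -1;
    }

Whether that actually wins would need benchmarking; for the handful of
search chars typical callers pass, the existing nested loop may well beat
boxing chars into a HashSet.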

On Sun, Aug 16, 2020 at 9:52 PM Eric Peters <e...@peters.org> wrote:

> Hi ~
>
> I'm in the process of porting apache commons lang to scala, so it can
> transpile to javascript/scala native. (
> https://github.com/er1c/scala-apache-commons-lang3 FWIW)
>
> One test I'm currently investigating is this line:
>
>
> https://github.com/apache/commons-lang/blob/master/src/test/java/org/apache/commons/lang3/StringUtilsEqualsIndexOfTest.java#L401
>
> What exactly is the difference between the CharSequence... and CharSequence
> signatures in the usage of indexOfAny? I'm trying to figure out why the
> expected index is "0" instead of "2".
>
> Thanks!
>
> -Eric
>
