On Wednesday, May 9, 2018 at 10:10:26 PM UTC+8, eatmore wrote:
> When sorting arrays of primitive types, Arrays.sort uses Quicksort, which is 
> O(n²) in the worst case.

Actually, I recently looked at what Java does. It uses several different 
algorithms depending on the case, and in the cases where it does choose 
quicksort, the sort is:
1) expected to be efficient based on the characteristics of the array
AND
2) implemented in a better way: dual pivots, special handling of equal elements, etc.
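
For reference, the dual-pivot idea from point 2 looks roughly like this. This is a simplified sketch (class and method names are mine), without the insertion-sort cutoff for small ranges and without the equal-element handling that the JDK version adds:

```java
import java.util.Arrays;

public class DualPivotSketch {
    // Simplified dual-pivot quicksort sketch. The real JDK code adds an
    // insertion-sort cutoff, pivot selection from five sample elements,
    // and a separate path for runs of equal elements.
    static void sort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        if (a[lo] > a[hi]) swap(a, lo, hi);
        int p = a[lo], q = a[hi];        // two pivots, p <= q
        int lt = lo + 1, gt = hi - 1, i = lo + 1;
        while (i <= gt) {
            if (a[i] < p) {
                swap(a, i++, lt++);      // grow the "< p" region
            } else if (a[i] > q) {
                while (a[gt] > q && i < gt) gt--;
                swap(a, i, gt--);        // grow the "> q" region, recheck a[i]
            } else {
                i++;                     // p <= a[i] <= q, leave in the middle
            }
        }
        swap(a, lo, --lt);               // put pivot p in place
        swap(a, hi, ++gt);               // put pivot q in place
        sort(a, lo, lt - 1);             // elements < p
        sort(a, lt + 1, gt - 1);         // elements between the pivots
        sort(a, gt + 1, hi);             // elements > q
    }

    static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 8, 1, 9, 2, 7, 4, 6, 0};
        sort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));
        // prints [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
    }
}
```

Note that without the equal-element handling, this sketch itself degrades to O(n²) on an all-equal array (the middle partition shrinks by only two per level), which is exactly the kind of case the JDK version special-cases.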

While it may be theoretically possible to get Arrays.sort to run in O(n²), it 
does not happen on any simple cases, and you'd have to craft the array 
specifically to exploit fine details of the algorithm.

The javadoc actually says "This algorithm offers O(n log(n)) performance on 
many data sets that cause other quicksorts to degrade to quadratic performance, 
and is typically faster than traditional (one-pivot) Quicksort implementations."

While I suppose they could have used something like introsort to guarantee O(n 
log(n)) worst-case complexity, I'm pretty confident that their algorithm is 
close to the fastest in the overwhelming majority of cases.

> There are generators of tests which make Arrays.sort work in O(n²) time, e.g. 
> this: http://codeforces.com/contest/101/submission/3882807

That looks like what I mentioned above: crafting the array specifically to 
exploit fine details of the algorithm.

My point was not that Arrays.sort can never ever be O(n²), but that it doesn't 
fall into that with simple cases (like 50000 zeros), and in practice you can 
expect it to be always fast. Even if it degrades to O(n²), I expect the 
constants to be much better than a naive traditional quicksort.
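
A quick sanity check along those lines (a throwaway sketch; absolute timings obviously depend on the machine):

```java
import java.util.Arrays;

public class SimpleCases {
    public static void main(String[] args) {
        int[] zeros = new int[50_000];                 // 50000 zeros
        int[] desc = new int[50_000];                  // descending order
        for (int i = 0; i < desc.length; i++) desc[i] = desc.length - i;

        long t0 = System.nanoTime();
        Arrays.sort(zeros);
        Arrays.sort(desc);
        long ms = (System.nanoTime() - t0) / 1_000_000;

        // Both "simple" inputs sort essentially instantly.
        System.out.println("Sorted both arrays in " + ms + " ms");
    }
}
```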

Finally, that code is specifically written for Java 7, but Java 8 already has 
some improvements in the sorting code, and I don't think this "killer" works 
anymore. I'm not sure what it's supposed to do, but here it prints something 
like:
Generation time = 1129 ms.
Sorting time = 484 ms.

I haven't seen a Java 8 quicksort killer, and I'm not sure it's even possible 
to write one.
Anyway, if you're really paranoid, you can implement an O(n log(n)) worst-case 
algorithm and be done with it.

> Scanner is SLOW. Try to read a million numbers using Scanner and using 
> FastReader.

OK, I tried it:
1 million numbers on a single line - Scanner: 800ms, FastReader: 200ms
1 million numbers on separate lines - Scanner: 800ms, FastReader: 200ms
1 million numbers, 100 per line - Scanner: 800ms, FastReader: 200ms

So alright, FastReader is faster, but Scanner can still read a million numbers 
in under a second on my machine. It only makes a real difference when the 
input is huge and the time limit is very tight.

> In general, if you think that standard library functions do their job in the 
> best possible way, you're wrong.

Maybe not the "best possible way", but good enough in most cases.

-- 
You received this message because you are subscribed to the Google Groups 
"Google Code Jam" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To post to this group, send email to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/google-code/58c15fbe-fb88-4ca6-85fe-ae42e45e2441%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
