https://codejam.withgoogle.com/codejam/contest/8284486/dashboard#s=p2


What I thought: sort the stars by their coordinates (x, then y, then z). Loop over i
from 0 to n-1, split the stars into a prefix (0..i) and a suffix (i+1..n-1), and
compute the cube edge needed to cover each part. The max of those two edges is the
edge length required to cover that split of the stars.
The answer is the minimum such edge over all i.
Is there something wrong with this algorithm? If so, how should I begin solving it?
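To make the idea concrete, here is a rough Python sketch of what I mean (the sort
key and the brute-force prefix/suffix loop are just how I would first code it, not
necessarily the intended solution):

def cube_edge(stars):
    # Edge of the smallest axis-aligned cube covering the given stars.
    if not stars:
        return 0
    xs, ys, zs = zip(*stars)
    return max(max(xs) - min(xs),
               max(ys) - min(ys),
               max(zs) - min(zs))

def best_split(stars):
    # Try every prefix/suffix split of the sorted stars and return the
    # minimum, over all splits, of the larger covering-cube edge.
    stars = sorted(stars)          # sort by x, then y, then z
    n = len(stars)
    best = cube_edge(stars)        # baseline: everything in one cube
    for i in range(n - 1):
        cost = max(cube_edge(stars[:i + 1]), cube_edge(stars[i + 1:]))
        best = min(best, cost)
    return best

# e.g. best_split([(0, 0, 0), (1, 1, 1), (10, 10, 10)]) -> 1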

Thanks!
