Caching was not the problem; caching was a good idea - you have to go through 
that array twice, so it's better to save the results for future use. The problem 
was with the implementation.
I did the caching myself, but I allocated the cache with

auto M = vector<long long>(N);

where N is the number of stacks per node.
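
For concreteness, here is a minimal sketch of that pattern; compute() and 
build_cache() are hypothetical names standing in for the real per-stack 
computation:

#include <cstddef>
#include <vector>

// Hypothetical stand-in for the real per-stack computation.
static long long compute(std::size_t i) { return static_cast<long long>(i * i); }

std::vector<long long> build_cache(std::size_t N) {
    auto M = std::vector<long long>(N);  // one allocation, exactly N elements
    for (std::size_t i = 0; i < N; ++i)
        M[i] = compute(i);               // fill by index, no push_back
    return M;
}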
It is probably even faster to reserve the capacity up front instead, which 
allocates the memory without value-initializing the N elements:

vector<long long> M;
M.reserve(N);

The worst thing to do is to grow the vector with repeated push_back()s. First, 
it is much slower - the same O(n) total work on average, but with a much larger 
constant factor. More important in this case, you lose control over the exact 
amount of memory the vector consumes: its capacity grows in jumps. In your case 
the likely problem was that after pushing back the 8,388,609th item, the 
capacity of the vector doubled (as dynamic arrays do) and immediately jumped 
from 64 MB to 128 MB. That exceeds the memory limit, hence the RTE.
