Iain King wrote:

I think (2)'s poor performance is being amplified by how Python
handles lists and list deletions; the effect may be mitigated in
other languages.

Delete is O(n) (or "O(n/2) on average", if you prefer), while append is amortized O(1).
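To make that cost difference concrete, here's a quick timing sketch (the sizes and numbers are illustrative; absolute times will vary by machine, but the front-vs-back gap won't):

```python
import timeit

def drain_front(n):
    lst = list(range(n))
    while lst:
        del lst[0]          # O(n): every remaining element shifts left

def drain_back(n):
    lst = list(range(n))
    while lst:
        del lst[-1]         # O(1): nothing shifts

n = 20000
t_front = timeit.timeit(lambda: drain_front(n), number=1)
t_back = timeit.timeit(lambda: drain_back(n), number=1)
# draining from the front is quadratic overall, from the back linear,
# so t_front dwarfs t_back for any sizable n
```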

Unless I'm missing something, your example keeps going until it's flagged *all* nodes as "on", which, obviously, kills performance for the first version as the probability goes down. The OP's question was about a single pass (but he did mention "as the simulation progresses", so I guess it's fair to test a complete simulation.)
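I don't have the OP's actual code in front of me, but "keep picking random nodes until all are on" is the classic coupon-collector situation: once most nodes are on, nearly every pick is a miss, and you expect roughly n*ln(n) draws rather than n. A small sketch of the effect (names are mine, not the OP's):

```python
import math
import random

def flip_until_all_on(n, seed=0):
    """Pick a uniformly random node each draw until every node is on.

    Late in the run almost every draw hits an already-on node, so
    the tail is dominated by misses -- the coupon-collector effect.
    """
    rng = random.Random(seed)
    on = [False] * n
    off_count = n
    draws = 0
    while off_count:
        draws += 1
        i = rng.randrange(n)
        if not on[i]:
            on[i] = True
            off_count -= 1
    return draws

n = 1000
draws = flip_until_all_on(n)
expected = n * math.log(n)   # roughly 6900 draws expected for n = 1000
```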

Btw, if the nodes can be enumerated, I'd probably do something like:

    node_list = ... get list of nodes ...
    random.shuffle(node_list)

    start = 0
    end = len(node_list)
    step = max(end // MAX, 1)  # never let step hit 0, or the loop stalls

    while start < end:

        for i in xrange(start, start + step):
            ... switch on node_list[i] ...

        ... do whatever you want to do after a step ...

        # prepare for next simulation step
        start += step
        step = max((end - start) // MAX, 1)

which is O(n) overall, and whose cost is mostly independent of the per-pass probability (here 1:MAX).

Might need some tuning; tweak as necessary.
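For what it's worth, here's the same sketch fleshed out into something runnable (Python 3 syntax; `MAX`, the node list, and the `on` set are illustrative stand-ins for the OP's actual simulation state):

```python
import random

def run_simulation(nodes, MAX, seed=None):
    """Shuffle once, then switch on ~1/MAX of the *remaining*
    nodes per pass.  Each node is touched exactly once after the
    shuffle, so the whole run is O(n) regardless of MAX."""
    rng = random.Random(seed)
    node_list = list(nodes)
    rng.shuffle(node_list)

    on = set()
    passes = []                        # nodes switched on per pass
    start = 0
    end = len(node_list)
    step = max(end // MAX, 1)          # guard: step must never be 0

    while start < end:
        stop = min(start + step, end)
        for i in range(start, stop):
            on.add(node_list[i])       # stand-in for "switch on node_list[i]"
        passes.append(stop - start)
        # ... do whatever you want to do after a step ...
        start = stop
        step = max((end - start) // MAX, 1)
    return on, passes

on, passes = run_simulation(range(100), MAX=10)
# every node ends up on, spread over len(passes) simulation steps
```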

</F>

--
http://mail.python.org/mailman/listinfo/python-list
