Hi guys,

Has anyone encountered (what feels like) quite slow iteration when looping
over small containers? I'm in a situation where I'm iterating over 100,000+
3D vectors and it's taking much longer than I'd expect. The vector type is
bound via boost::python and has an __iter__ method which uses
range(begin, end) for the iteration.
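
For reference, here's a stripped-down sketch of the kind of binding I mean
(the attached source.cpp is the real code; Vec3, VecArray and the module
name below are just placeholder names):

    #include <cstddef>
    #include <vector>
    #include <boost/python.hpp>

    // Placeholder 3D vector and container types, standing in for the
    // real bound types in source.cpp.
    struct Vec3 { double x, y, z; };

    struct VecArray
    {
        std::vector<Vec3> data;
        std::vector<Vec3>::iterator begin() { return data.begin(); }
        std::vector<Vec3>::iterator end()   { return data.end(); }
        std::size_t size() const            { return data.size(); }
    };

    BOOST_PYTHON_MODULE(vecarray)
    {
        using namespace boost::python;

        class_<Vec3>("Vec3")
            .def_readwrite("x", &Vec3::x)
            .def_readwrite("y", &Vec3::y)
            .def_readwrite("z", &Vec3::z);

        class_<VecArray>("VecArray")
            .def("__len__", &VecArray::size)
            // __iter__ built from begin()/end(), as described above
            .def("__iter__", range(&VecArray::begin, &VecArray::end));
    }

The Python side then just does "for v in arr: ..." over the 100,000+
elements, which is roughly what the attached test.py does.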

I've attached a simple example binding and a Python test to illustrate what
I mean more clearly. My impression is that in a situation like this the
overhead of creating the iterator is greater than the cost of the actual
iteration, but that's just a guess. Alternatively, my sense of how fast the
iteration should be might be completely skewed, so any help clarifying
what's going on here would be greatly appreciated.

Thanks,
Babak

http://boost.2283326.n4.nabble.com/file/n4311109/source.cpp source.cpp 
http://boost.2283326.n4.nabble.com/file/n4311109/test.py test.py 
