Hello,

I'm using matplotlib for various tasks and it works beautifully... but on
some occasions I have to visualize large datasets (on the order of 10M data
points, using imshow or regular line plots), and the system starts to choke
at that point.
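
For concreteness, here's roughly the kind of thing that slows down for me
(a minimal sketch; the sizes and array contents are just stand-ins for my
real data):

import numpy as np
import matplotlib
matplotlib.use('Agg')  # render off-screen, so the cost is just the drawing
import matplotlib.pyplot as plt

# ~10M points, comparable to my real datasets
x = np.arange(10000000)
y = np.sin(x / 1e5) + 0.1 * np.random.randn(x.size)

fig = plt.figure()
fig.gca().plot(x, y)
fig.savefig('big.png')  # this is the step where things start to choke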

I'd like to stay consistent and not use different tools for basically
similar tasks, so I'm looking for some pointers on rendering performance.
I'd also be interested in getting involved in development if there is
something to be done.

To the active developers: what's the general feeling, does matplotlib have
room to spare in its rendering performance, or is it pretty much tied to
the speed of Agg right now?
Is there something to gain from using the multiprocessing module, included
in the standard library as of Python 2.6?
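
(To be concrete about the multiprocessing idea, something like the sketch
below is what I have in mind: rendering independent figures in worker
processes with the Agg backend. The filenames and data are placeholders,
and I don't know how well this plays with matplotlib's internals, hence
the question.)

import numpy as np
import matplotlib
matplotlib.use('Agg')  # no GUI needed in the worker processes
import matplotlib.pyplot as plt
from multiprocessing import Pool

def render(i):
    # each worker builds and saves its own, fully independent figure
    fig = plt.figure()
    fig.gca().plot(np.random.randn(1000000))
    fig.savefig('chunk_%d.png' % i)
    plt.close(fig)  # free the figure so workers don't accumulate memory

if __name__ == '__main__':
    pool = Pool()               # defaults to one worker per CPU
    pool.map(render, range(4))  # render four figures in parallel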
Alternatively, would it be worth going as far as using something like
PyGPU for fast vectorized computations?

I've seen previous discussions about OpenGL becoming a backend at some
point in the future. Would it really stand up against the current
backends? Are there any clues about that right now?

thanks for any input! :D
bye