Hi,
I have a dataset that's about 2 million points (floats), and I'm trying
to figure out why DX is taking up so much memory to process it. I'm
importing it with some custom modules, so maybe my mistake is there...
or maybe I'm missing something else.
When I first read in the (unconnected) dataset, DX uses about 30
megabytes of RAM. After I connect it (unstructured hexahedra), memory
usage for the dataset is over 380 megabytes, and when I try to
isosurface it, it usually fails with an out-of-memory error. I have a
gig of RAM, but I need it for other things as well, and I don't want
to give DX 600 megabytes just to handle 2 million data points.
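For scale, here's my back-of-the-envelope for what the connected field
itself ought to need, assuming roughly one hexahedron per point, 4-byte
floats and ints, and a neighbors component (which I believe Isosurface
builds if one isn't already there):

  positions:    2e6 points x 3 floats x 4 bytes  =  24 MB
  data:         2e6 values x 1 float  x 4 bytes  =   8 MB
  connections: ~2e6 hexes  x 8 ints   x 4 bytes  =  64 MB
  neighbors:   ~2e6 hexes  x 8 ints   x 4 bytes  =  64 MB
                                                   -------
                                                  ~160 MB

That's well under the 380 megabytes I'm actually seeing, which makes me
think something interim is staying alive.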
I suppose I'm not deleting/freeing some of the interim objects my
module creates. But I am freeing all the malloc'd arrays, and I can't
DXDelete anything that becomes a component of the final output. Am I
missing something?
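
In case it helps, here's a stripped-down sketch of the cleanup pattern
I think my importer follows (read_points() and read_hexes() are
stand-ins for my actual file reader; the data component is attached the
same way as positions):

  #include <stdlib.h>
  #include <dx/dx.h>

  /* hypothetical readers, stand-ins for the real parser */
  extern float *read_points(int *npoints); /* mallocs npoints x 3 floats */
  extern int   *read_hexes(int *nhexes);   /* mallocs nhexes x 8 ints    */

  Error m_MyImport(Object *in, Object *out)
  {
      Field  f = NULL;
      Array  a = NULL;
      float *pts  = NULL;
      int   *conn = NULL;
      int    npoints = 0, nhexes = 0;

      f = DXNewField();
      if (!f) goto error;

      /* positions: DXAddArrayData copies the data, so the malloc'd
         staging buffer can (and should) be freed right away */
      pts = read_points(&npoints);
      a = DXNewArray(TYPE_FLOAT, CATEGORY_REAL, 1, 3);
      if (!a || !DXAddArrayData(a, 0, npoints, (Pointer)pts))
          goto error;
      free(pts); pts = NULL;
      if (!DXSetComponentValue(f, "positions", (Object)a))
          goto error;
      a = NULL;   /* the field holds the reference now */

      /* connections: same pattern, 8 vertex indices per hexahedron */
      conn = read_hexes(&nhexes);
      a = DXNewArray(TYPE_INT, CATEGORY_REAL, 1, 8);
      if (!a || !DXAddArrayData(a, 0, nhexes, (Pointer)conn))
          goto error;
      free(conn); conn = NULL;
      if (!DXSetComponentValue(f, "connections", (Object)a))
          goto error;
      a = NULL;
      if (!DXSetComponentAttribute(f, "connections", "element type",
                                   (Object)DXNewString("cubes")))
          goto error;

      /* the "data" component is attached the same way as positions */

      if (!DXEndField(f))
          goto error;

      out[0] = (Object)f;
      return OK;

  error:
      if (pts)  free(pts);
      if (conn) free(conn);
      if (a) DXDelete((Object)a); /* scratch array not yet owned by f */
      if (f) DXDelete((Object)f); /* deletes f and attached components */
      return ERROR;
  }

If that pattern is wrong somewhere, I'd be glad to hear where.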

-- 
Mike Miller
[EMAIL PROTECTED] ->
[EMAIL PROTECTED]
