These modules create a triangulated surface for each position in the object. Depending on the number of positions and the choice of glyph, you can generate enough polygons to exhaust memory and overload your graphics card. For a few thousand particles, that is probably more polygons than you can resolve on the screen anyway. There are a few strategies to consider:
1. Don't show all of the points. Consider subsetting the data by resolution (i.e., Reduce if the connections are regular), by value (e.g., Include, keeping cull off if the connections are regular), or by position (e.g., Slab, or Mark(,"positions")->Include->Unmark) so that you are only visualizing the portion you wish to see. (There is a rough script sketch at the bottom of this message.)

2. If the density of points is high enough and the field is a scalar, try using ShowPositions instead of glyphs.

3. With Glyph or AutoGlyph, choose a simpler glyph type that produces fewer polygons per glyph; multiplied by a few thousand glyphs, that savings adds up.

4. You can have dxexec expand its memory usage to include swap by running dx -memory n, where n is the number of MB of real+swap to use, assuming you have allocated more swap than physical memory. You should be able to use 2x-3x the physical memory on your machine. The default is 7/8 of physical memory. (See the command example at the bottom of this message.)


Ville Mustonen <[EMAIL PROTECTED]>@opendx.watson.ibm.com on 06/08/2001 04:47:38 AM
Please respond to [email protected]
Sent by: [EMAIL PROTECTED]

To:      [email protected]
cc:
Subject: [opendx-users] Memory problems

Hello all. I have problems with the memory usage of the OpenDX dxexec. I want to visualize a few thousand particles using the Glyph or AutoGlyph modules. The visualization starts without problems, but after a while dxexec uses all the memory and I get the following message:

ERROR: Glyph: Out of memory: reached limit of 432013312 in large arena.

I am running OpenDX v. 4.1.1 on Red Hat 7.0 with a Fire GL graphics card. The computer is an AMD Athlon 1.2GHz with 512 MB of DDR memory. I have tried to solve the problem by switching the cache off, but it did not help. The increasing memory usage seems to be related to hardware rendering; when it is off, there is no memory leak.

Ville Mustonen
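For what it's worth, here is a rough script-mode sketch of strategies 1-3. The module names (Import, Reduce, Include, AutoGlyph, ShowPositions, AutoCamera, Render, Display) are standard, but the file name, the value range, the reduction factor, and the glyph type string below are placeholders, and the parameter order is from memory -- check them against the module reference before relying on this.

    // untested sketch -- "particles.dx", the [0.5, 1.0] range, the factor 4,
    // and the "speedy" type string are placeholders, not values from your data
    data = Import("particles.dx");

    // strategy 1: subset by value (Include) or, on regular grids, by
    // resolution (Reduce); normally you would use only one of these
    subset = Include(data, 0.5, 1.0);    // keep data values in [0.5, 1.0]
    // subset = Reduce(data, 4);         // cut resolution by a factor of 4

    // strategy 3: a cheaper glyph type means fewer triangles per particle
    glyphs = AutoGlyph(subset, "speedy");

    // strategy 2 (scalar data only): plain points instead of glyphs
    // glyphs = ShowPositions(subset);

    camera = AutoCamera(glyphs);
    image  = Render(glyphs, camera);
    Display(image);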

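And the memory switch from item 4. The number below is only an example for a 512 MB machine with enough swap configured to cover it; pick a value that matches the real+swap you actually have.

    # let dxexec use ~1200 MB of real+swap instead of the default 7/8 of RAM
    dx -memory 1200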