I seem to have stumbled across a degenerate case somewhere in the combination of Java3D, Linux, and X11. I wondered if others are seeing this as well, and if the causes are known.
I have a very simple app, exploring notions of "pseudo volume rendering"; it displays roughly 1.6 million points from a PointArray, all having various alpha values. Right off the bat, the memory requirements are at least about 44MB (1.6 million vertices, each with x,y,z,r,g,b,a as 4-byte floats).
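For what it's worth, that 44MB figure falls directly out of the per-vertex storage. A minimal arithmetic sketch (class and variable names are illustrative, not from my app):

```java
// Back-of-the-envelope estimate for the raw per-vertex data alone.
public class VertexMemoryEstimate {
    public static void main(String[] args) {
        long vertices = 1_600_000;
        // x, y, z coordinates plus r, g, b, a color components,
        // each stored as a 4-byte float
        long bytesPerVertex = (3 + 4) * 4L;
        long totalBytes = vertices * bytesPerVertex;
        System.out.printf("%.1f MB%n", totalBytes / 1e6); // ~44.8 MB
    }
}
```

Note this counts only the vertex data itself as I hand it to the PointArray; any internal copies Java3D or the rendering pipeline keeps would be on top of that.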
The initial image takes about 5 minutes to appear (1500+ AMD, 512MB); after that, view changes are jerky to be sure, but take several seconds, not minutes, to take effect. The JVM at that point has an RSS of around 150MB. What's disturbing is that the memory requirements for the X server go through the roof: I see a SIZE of 1.2GB and an RSS of nearly 1GB! Furthermore, these values persist even after my app terminates, suggestive of a big ol' memory leak. If I rerun the app, the memory requirements climb even higher, although much more slowly. The machine quickly becomes unusable until the server is restarted.
Curiously, I see system memory requirements of this order on a Windows 2K machine as well, but there they return to "normal" once my app exits.
What the heck is going on behind the scenes with J3D that it incurs all that memory demand outside of the JVM?
Thanks folks.
Rick Brownrigg
===========================================================================
To unsubscribe, send email to [EMAIL PROTECTED] and include in the body of the message "signoff JAVA3D-INTEREST". For general help, send email to [EMAIL PROTECTED] and include in the body of the message "help".