Hello Robert,

I have two machines running independently of one another, one with 16GB of 
RAM and one with 32GB.  I'm currently attempting to process the ASTER dataset 
on the 16GB machine (quad-core i5).

Given the dataset statistics I provided above, how much RAM do you expect it 
to consume?  Is there a way to dynamically add or remove cores based on 
predicted RAM usage for a dataset?  Or do you just have to monitor system 
usage, kill VPB when you think you can add a second core, and restart using 
the --tasks option?
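
In case it clarifies what I mean, here's a rough sketch of that manual 
workflow (file names are hypothetical, and it assumes the build was started 
with vpbmaster, which writes out a tasks file it can resume from):

  # Watch free memory while the single-core build runs
  watch -n 60 free -m

  # Once enough RAM is free, stop the current run...
  killall vpbmaster

  # ...and resume the remaining tasks with an extra core
  vpbmaster --tasks build_master.tasks

That feels clunky, which is why I'm hoping there's a built-in way to do it.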

Does the native clustering virtualize RAM?  Or should I delve back into SGE 
or another resource-management system?
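
If SGE is the better route, I'm picturing per-task memory limits along these 
lines (hypothetical; resource names like h_vmem and the smp parallel 
environment depend on how the cluster is configured):

  # Submit one VPB task capped at 8GB so the scheduler can't
  # oversubscribe the 16GB machine; -b y runs the binary directly
  qsub -l h_vmem=8G -pe smp 1 -b y vpbmaster --tasks build_master.tasks

...or is that overkill for a two-machine setup?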
