I have been trying to track down a memory leak in our application. In the
process, I found some very interesting behavior: if you compile and run the
attached test application, it seems to continually consume memory. Does this
point to an issue with OpenSG, or am I doing something wrong?

Thanks,
Aron
#include <iostream>
#include <string>
#include <vector>

#include <OpenSG/OSGConfig.h>
#include <OpenSG/OSGNode.h>


void run()
{
   std::cout << "Starting test" << std::endl;

   std::vector<OSG::NodeRecPtr> nodes;

   for (int idx = 0; idx < 50000; ++idx)
   {
      OSG::NodeRecPtr node(OSG::Node::create());
      nodes.push_back(node);
      
      if (0 == idx % 500)
      {
         std::cout << "Adding node: " << idx << std::endl;
      }
   }

   // Clear all nodes and release the vector's capacity.  Note that swap()
   // must be called on the temporary; it cannot take a temporary as its
   // argument.
   nodes.clear();
   std::vector<OSG::NodeRecPtr>().swap(nodes);
}

int main(int argc, char** argv)
{
   OSG::osgInit(argc, argv);

   // Run the test repeatedly so memory growth can be observed over time.
   // Note that this loop never exits, so osgExit() is never reached.
   while (true)
   {
      run();
   }

   OSG::osgExit();
   return 0;
}
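
One way to watch the growth from within the test itself would be to print the
process's resident set size after each call to run().  The sketch below is
only an illustration and is Linux-specific (it reads /proc/self/statm); the
helper name is made up.

#include <fstream>
#include <iostream>
#include <unistd.h>

// Rough, Linux-only check: report the resident set size in KiB by reading
// /proc/self/statm (the second field is the number of resident pages).
void printResidentMemory()
{
   std::ifstream statm("/proc/self/statm");
   long sizePages     = 0;
   long residentPages = 0;
   statm >> sizePages >> residentPages;

   const long pageKiB = sysconf(_SC_PAGESIZE) / 1024;
   std::cout << "Resident memory: " << (residentPages * pageKiB) << " KiB"
             << std::endl;
}

Calling this after each run() in the loop in main() should make it clear
whether the nodes from the previous iteration are actually being released.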