I am running Greg's test on an SGI for Jeremy's network, and I do not see
memory usage stabilize at all.  I even put the Option module in with the
cache attribute set to 0, as David suggested.  As Jeremy noted previously,
the network is rather large, with a couple of macros, including a loop with
Get/SetLocal.  That is why I have not tried to narrow the problem down to
one module.

Well, I finally got the "Out of Memory" error running in script mode.
Can we now say that there is a memory leak?

I originally tested on Windows 98, but after Suhaib's warning that memory
leaks are a known problem there, I ran the network on Windows NT, and so
did Jeremy.  The memory leak occurred there, too.
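
For reference, the session I have been running looks roughly like this
(the .net file name below is just a stand-in for Jeremy's actual network
file):

    dx -script -readahead off -cache off

    include "jeremy.net"
    Executive("flush cache");
    Usage("memory", 0);

    include "jeremy.net"
    Executive("flush cache");
    Usage("memory", 0);

...and so on, comparing the reported memory totals after each pass.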

Jeff



On Fri, 5 Nov 1999 [EMAIL PROTECTED] wrote:

> The standard way to test for memory leaks in DX is this.
> 
> 1.  Prepare a network that you believe demonstrates a memory leak - the
> smaller the better.
> 
> 2.  Run DX with the flags "-script -readahead off -cache off".  That'll
> bring up dx at a prompt.
> 
> 3.  Type:
> 
> include "foo.net"
> Executive("flush cache");
> Usage("memory", 0);
> 
> where foo.net is the name of your test script.  It'll run the network,
> flush out everything that's in the cache and dictionary, and tell you how
> much memory remains allocated.
> 
> 4.  Go back to step 3 - i.e., repeat it a few times.  Memory usage
> *should* stabilize.  If it rises after the second or third time, there's
> a leak.
> 
> Greg
> 
> From: "Suhaib Siddiqi" <[EMAIL PROTECTED]>   11/05/99 11:24 AM
> (embedded image moved to file pic09223.pcx)
> 
> 
> 
> 
> Please respond to [email protected]
> 
> 
> To:   [email protected]
> cc:
> Subject:  RE: [opendx-dev] Memory Leak clarifications
> 
> 
> 
> > In response to Suhaib's post:
> > > The OpenDX/Cygwin version on Windows 98?  That is a known problem.
> > > You have every right to yell at M$ for selling a gloriously shitty
> > > product.  Win98 itself has a memory leak, which gets horrible when an
> > > X application (X server) is running, because Win98 tries to be
> > > selfish and has to share resources with any other server.  An X
> > > server trying to share the display causes Win98/95 to start eating
> > > CPU resources, and once in a while it will force you to reboot the
> > > PC because it gets horribly slow.
> > Actually, I was able to reproduce the problem under NT as well.  The
> > problem is not with CPU resources, but memory usage.  It seems like
> > OpenDX isn't releasing memory somewhere, so eventually it fills up its
> > memory limit (-memory 200) and won't release any of it.  Nothing
> > affects the system, except that the DX network won't run until you
> > disconnect/reconnect the dxexec to force the release of memory.
> >
> > Thanks for the responses!  Let's keep at it so we can figure out
> > what's going on here...
> 
> 
> 
> Get a CPU resource monitor and watch resource consumption; you will see
> Windows 98/95 not releasing resources to X applications.  You will notice
> this problem with any X app, not just OpenDX.  I have tested this over
> and over again during development of several X clients.  It is Win98/95.
> The M$ engineers do not know how to write an OS that is stable and shares
> resources properly.
> 
> >
> >   Jeremy Zoss
> >   Southwest Research Institute
> >   (210)522-3089
> >   [EMAIL PROTECTED]
> >
> 

----------------------------------------------------------------------
Jeff Braun                        Geophysics Dept. 
mailto: [EMAIL PROTECTED]          Montana Tech 
(406) 496-4206                    1300 W. Park St.
                                  Butte, MT 59701
