Chad Kellerman wrote:
> Here's my $.02 on this subject. Correct me if I am wrong.
> Once perl uses memory it does not want to let it go back to the
> system. I believe I have read that the developers are working on
> this. Since you have your script running as a daemon, it will not
> release a lot of memory back to the system, if any at all.
Currently, the memory will not be released back to the OS; your OS most
likely does not support that. Many languages that handle memory
management internally have the same problem. In C/C++, memory management
is the programmer's job, but if you put your data on the stack, it won't
be released back to the OS until your program exits. If, however, you
request memory from the heap, you have a chance to release it back to
the OS. That's nice because you actually return what you no longer need
to the OS, not just to your process's own pool.

> I had a similar problem. The way I worked around it is:
> I knew where my script was eating up memory. So at these points I
> fork() children. Once the child completes and dies, the memory is
> released back into the system.

I don't know if what you describe really works. When you fork, you are
making an exact copy of the running process: the child process includes
the parent process's code, data, stack, etc. If the fork succeeds, you
have two pretty much identical processes, related only by the
parent-child relationship the kernel keeps track of. So when your child
process exits, it releases all of its own memory, but it shouldn't take
anything with it from its parent. This means your child process's exit
should not cause your parent process's memory pool to be returned to the
OS. You said you really see a dramatic decrease in memory consumption,
but if you check your process's memory footprint (say, simply by looking
at it in top), does its size actually shrink?

david