We have a couple of Thrift servers running, one in Python which in turn calls another in Java. We're seeing that over time their memory footprints grow rather large, enough to require a restart. However, we aren't able to see exactly *why* they're growing.
On request the Python server generates some data, which is passed to the Java server for further processing. Once complete, the Java server passes the result back to the Python server as a byte stream, and the Python server returns it to the client. Since these are both garbage-collected languages, we'd expect the data to be released from memory after the generation/processing is done, but we're not seeing any decrease in memory usage. Is there something specific we should be doing to release the memory when the handler returns to the client?
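For reference, here is a minimal sketch of the Python-side flow described above. The service, method, and module names (DataService, ProcessingService, get_data, process, gen_py.processing) and the port are placeholders standing in for our real generated Thrift code, not the actual implementation:

# Minimal sketch of the Python-side pass-through handler. All names here
# (DataService, ProcessingService, gen_py.processing, port 9091) are
# hypothetical placeholders for the real generated Thrift code.
from thrift.transport import TSocket, TTransport
from thrift.protocol import TBinaryProtocol

from gen_py.processing import ProcessingService  # hypothetical generated module


class DataServiceHandler(object):
    """Handler for the Python server: generate data, hand it to the Java
    server over Thrift, and return the processed bytes to the client."""

    def get_data(self, request_id):
        payload = self._generate(request_id)  # large, short-lived data

        # Per-request connection to the Java server (placeholder host/port).
        transport = TTransport.TBufferedTransport(
            TSocket.TSocket('localhost', 9091))
        protocol = TBinaryProtocol.TBinaryProtocol(transport)
        client = ProcessingService.Client(protocol)

        transport.open()
        try:
            result = client.process(payload)  # byte stream back from Java
        finally:
            transport.close()

        # Nothing explicit is done to free `payload` or `result`; the
        # expectation is that once this method returns and no other reference
        # (instance attribute, module-level cache, etc.) holds them, the
        # garbage collector reclaims them.
        return result

    def _generate(self, request_id):
        # Placeholder for the data-generation step.
        raise NotImplementedError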
-- Phillip B Oldham