On Mon, Sep 28, 2009 at 1:24 PM, Manuel C. <[email protected]> wrote:
> I tried also version 2.4.2.3; it's the same. It took me a few days to find
> out that a missing Close() on a GZipStream was the root of all evil. Here is
> an example:
>
> static void Decompress()
> {
>     byte[] file = File.ReadAllBytes("in.dat.gz");
>     Stream stream = new GZipStream(new MemoryStream(file),
>         CompressionMode.Decompress);
>     byte[] buffer = new byte[1024];
>     while (stream.Read(buffer, 0, buffer.Length) > 0);
>     // stream.Close();
> }
>
> A function similar to this is called very often in my application. Under
> Windows with .NET there is no issue, but with Mono the memory usage
> increases with every call of Decompress(). I thought the destructor of the
> stream would call the Close or Dispose method?
The finalizer (which is *not* a destructor) might not call Close/Dispose in
all cases. Consider the case where you wrap a Stream in a GZipStream and then
throw the GZipStream away, but still want to operate on the inner stream: the
finalizer will not (and should not) assume that you are done with the stream
you gave it.

In this particular case, it seems to just be an instance of the GC not
kicking in often enough. Since the backing stream is a MemoryStream, there
are no unmanaged resources to release here. It's much more likely that the
byte[] is simply taking longer to be freed without calling Close, since there
is still a reference to it. (Presumably the MemoryStream.Close method sets
its byte[] reference to null.)

As an aside, you should really be using a FileStream in this scenario. It is
horribly inefficient to read an entire file into memory and decompress it
there; you're wasting memory and cycles you could be spending on performing
the actual decompression.

-- 
Chris Howie
http://www.chrishowie.com
http://en.wikipedia.org/wiki/User:Crazycomputers

_______________________________________________
Mono-list maillist  -  [email protected]
http://lists.ximian.com/mailman/listinfo/mono-list
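The advice above — dispose the streams deterministically and read straight from a FileStream instead of buffering the whole file — can be sketched as follows. This is a minimal illustration, not code from the thread; the file name "in.dat.gz" is carried over from the original post, and the loop body is a placeholder for whatever the application does with the decompressed bytes.

```csharp
using System.IO;
using System.IO.Compression;

static class Decompressor
{
    public static void Decompress()
    {
        // using blocks guarantee Close/Dispose runs even if an exception
        // is thrown, so no cleanup is left to the nondeterministic finalizer.
        using (FileStream file = File.OpenRead("in.dat.gz"))
        using (GZipStream stream = new GZipStream(file,
            CompressionMode.Decompress))
        {
            byte[] buffer = new byte[1024];
            int count;
            while ((count = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Process 'count' decompressed bytes from 'buffer' here.
            }
        } // Both streams are disposed here, releasing their buffers promptly.
    }
}
```

Because the GZipStream wraps the FileStream directly, at most one buffer's worth of decompressed data is held in memory at a time, rather than the entire input file.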
