Hi Trevor,

What kind of memory increase are we talking about?  Also, how big are the
documents that you are indexing, the ones returned from getFileInfoDoc()?
Is it putting an entire file into the index?  Pre-2.9.3 versions had issues
with holding onto allocated byte arrays long after they were needed; that
memory could only be freed by closing the IndexWriter.
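
If it does turn out to be that byte-array issue, the usual workaround on the
2.9.2 line is to periodically close and recreate the writer rather than relying
on Commit() alone, along these lines (just a sketch, assuming it runs where
SaveFileToFileInfo is visible; indexPath, dataPaths and docsPerBatch are
placeholders, not from your code):

  Lucene.Net.Store.Directory dir = Lucene.Net.Store.FSDirectory.Open(
      new System.IO.DirectoryInfo(indexPath));
  Analyzer analyzer = clsLuceneFunctions.getAnalyzer();
  Lucene.Net.Index.IndexWriter iw = new Lucene.Net.Index.IndexWriter(dir,
      analyzer, false, Lucene.Net.Index.IndexWriter.MaxFieldLength.UNLIMITED);
  int count = 0;
  foreach (string sDataPath in dataPaths)
  {
      SaveFileToFileInfo(iw, true, sDataPath);   // delayCommit = true
      if (++count % docsPerBatch == 0)
      {
          // Close() releases the pooled byte arrays that Commit() leaves allocated.
          iw.Close();
          iw = new Lucene.Net.Index.IndexWriter(dir, analyzer, false,
              Lucene.Net.Index.IndexWriter.MaxFieldLength.UNLIMITED);
      }
  }
  iw.Close();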

I'm a little unclear on exactly what's happening.  Does the memory spike and
then stay constant at that level, or is it a gradual increase?  Is it causing
your application to fail (e.g. with an OutOfMemoryException)?
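
If you haven't measured it yet, it helps to sample the managed heap before and
after a batch of updates; if the number drops back after a forced collection,
you're looking at uncollected garbage rather than a true leak.  A minimal
sketch, using only .NET 2.0 APIs (the helper name is just for illustration):

  private static long MeasureManagedHeap()
  {
      // Force a full collection so the reading reflects live objects only.
      GC.Collect();
      GC.WaitForPendingFinalizers();
      GC.Collect();
      return GC.GetTotalMemory(true);
  }

  // long before = MeasureManagedHeap();
  // ... index a batch of documents ...
  // long after = MeasureManagedHeap();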


Thanks,
Christopher

On Mon, Nov 28, 2011 at 5:59 PM, Trevor Watson <
powersearchsoftw...@gmail.com> wrote:

> I'm attempting to use Lucene.Net v2.9.2.2 in a Visual Studio 2005 (.NET
> 2.0) environment.  We had a piece of software that WAS working.  I'm not
> sure what has changed; however, the following code results in a memory leak
> in the Lucene.Net component (or a failure to clean up used memory).
>
> The code in issue is here:
>
>  private void SaveFileToFileInfo(Lucene.Net.Index.IndexWriter iw,
>                                  bool delayCommit, string sDataPath)
>  {
>    Document doc = getFileInfoDoc(sDataPath);
>    Analyzer analyzer = clsLuceneFunctions.getAnalyzer();
>    if (this.FileID == 0)
>    {
>      string s = "";
>    }
>    iw.UpdateDocument(
>      new Lucene.Net.Index.Term("FileId", this.fileID.ToString("000000000")),
>      doc, analyzer);
>
>    analyzer = null;
>    doc = null;
>    if (!delayCommit)
>      iw.Commit();
>  }
>
> Commenting out the iw.UpdateDocument line resulted in no memory increase.
> I also tried replacing it with a DeleteDocuments/AddDocument pair, and the
> memory increased just the same as with the UpdateDocument call.
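>
> The delete-and-add variant was roughly this (same Term and analyzer as in
> the code above):
>
>    iw.DeleteDocuments(new Lucene.Net.Index.Term("FileId",
>       this.fileID.ToString("000000000")));
>    iw.AddDocument(doc, analyzer);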
>
> The getAnalyzer() function returns an ExtendedStandardAnalyzer, but it's the
> UpdateDocument line specifically that gives me the issue.
>
> Any assistance would be greatly appreciated.
>
> Trevor Watson
>
