Hi Simon,
Simon Gelfand wrote:
Hi Alexander,
I will increase the MaxMergeDocs to 15000
The problem is that every time I run an optimize (even with the PHP
memory limit set to 100MB) it exhausts the available memory and fails.
Does the problem appear even for a newly created index (with full
optimization after each 15000 or fewer documents), or only for your
already created index with more than 90 segments?
I didn't understand what you meant here
PS Remove the /path to the file/ab_index/index.lock file. That may
return the index to a consistent state.
You mean delete the file?
Yes.
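(If it helps, here is a minimal sketch of removing a stale lock file from
PHP; the path below is a placeholder for your real index directory:)

```php
<?php
// Minimal sketch: remove a stale Zend_Search_Lucene lock file so the
// index can be reopened. The path is a placeholder for your index dir.
$lockFile = '/path/to/ab_index/index.lock';

if (file_exists($lockFile) && !unlink($lockFile)) {
    echo "Could not remove $lockFile -- check file permissions\n";
}
```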
With best regards,
Alexander Veremyev.
Simon
On 5/10/07, Alexander Veremyev <[EMAIL PROTECTED]> wrote:
It looks like you have too many segments for your memory limits.
Each segment needs memory to store a preloaded term dictionary index.
Increase the MaxMergeDocs option or perform a full optimization after
each 15000 documents.

PS Remove the /path to the file/ab_index/index.lock file. That may
return the index to a consistent state.
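As a sketch of that suggestion (this assumes the ZF1 Zend_Search_Lucene
API; the index path is a placeholder):

```php
<?php
// Sketch: raise the merge threshold and force a full optimization.
// Assumes Zend Framework 1 is on the include path.
require_once 'Zend/Search/Lucene.php';

$index = Zend_Search_Lucene::open('/path/to/ab_index');

// A larger MaxMergeDocs lets more documents accumulate per segment,
// so fewer .cfs segments (and less per-segment memory) build up.
$index->setMaxMergeDocs(15000);

// ... add documents here ...

// Full optimization merges all segments into a single one.
$index->optimize();
```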
PPS Which permission options should be used for a newly created index
is a subject for discussion. Should the index be readable and writable
by other users by default?
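One possible answer, purely as an illustration (the 0775/0664 modes and
the path are my assumptions, not Zend_Search_Lucene defaults):

```php
<?php
// Illustrative sketch: loosen permissions on index files so another
// user (e.g. the web server) can read and write them. The 0775/0664
// modes are assumptions, not Zend_Search_Lucene defaults.
$indexDir = '/path/to/ab_index';   // placeholder path

chmod($indexDir, 0775);            // directory: rwx for owner + group
foreach (glob($indexDir . '/*') as $file) {
    chmod($file, 0664);            // files: rw for owner + group
}
```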
With best regards,
Alexander Veremyev.
webshark27 wrote:
> Hi,
>
> I am running with MaxMergeDocs set to 1500 and the rest at their
> defaults (10 and 10);
>
> The script only runs out of memory when I try to force an optimize,
> but it still indexes the articles.
>
> I currently have around 90 .cfs files
>
> But now if I try to search I always get (a var dump of the exception
> error):
>
> object(Zend_Search_Lucene_Exception)#4 (6) {
>   ["message:protected"]=>
>   string(156) "fopen(/path to the file/ab_index/index.lock)
> [function.fopen]: failed to open stream: Permission denied"
>   ["string:private"]=>
>   string(0) ""
>   ["code:protected"]=>
>   int(0)
>   ["file:protected"]=>
>   string(100) "/path to the
> file/public_html/Zend/Search/Lucene/Storage/File/Filesystem.php"
>   ["line:protected"]=>
>
> Thanks,
>
> Simon
>
>
> Alexander Veremyev wrote:
>> Hi,
>>
>> Zend_Search_Lucene uses memory for:
>>
>> 1. The preloaded term dictionary index for each index segment.
>> A large number of segments therefore increases memory usage.
>> Segments may be merged into one with the
>> Zend_Search_Lucene::optimize() method.
>> Segments are also partially auto-merged by the auto-optimization
>> process. Auto-optimization behavior depends on the MergeFactor and
>> MaxMergeDocs parameters.
>>
>> 2. Buffered docs (documents which have been indexed but not yet
>> dumped into a new segment).
>> When the number of buffered docs reaches the MaxBufferedDocs
>> parameter, a new segment is dumped to disk, which frees the memory
>> used for the buffered docs.
>>
>> Did you change the MergeFactor, MaxMergeDocs or MaxBufferedDocs
>> parameters, or did you use the default settings?
>>
>> How many segments (.cfs files in the index directory) do you have
>> when the script crashes?
>>
>> With best regards,
>> Alexander Veremyev.
>>
>
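The buffering and merging behavior described above can be tuned through
the parameters Alexander mentions. A sketch assuming the ZF1
Zend_Search_Lucene API, with illustrative values and a placeholder path:

```php
<?php
// Sketch: tune the parameters that control memory use during indexing.
// The values below are illustrative, not recommendations.
require_once 'Zend/Search/Lucene.php';

$index = Zend_Search_Lucene::open('/path/to/ab_index');

$index->setMaxBufferedDocs(100); // dump a new segment every 100 docs,
                                 // freeing the buffered-doc memory
$index->setMergeFactor(10);      // how aggressively segments auto-merge
$index->setMaxMergeDocs(15000);  // cap on docs per merged segment
```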