Hi Nick, 

Thanks for your reply! Another constraint with the single-index approach is that 
our index locations are dynamic, and searches happen through APIs that 
construct the index location based on certain input parameters. 
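Just to illustrate the shape of it (the real API is internal; the function name 
and parameters below are made up for this example), the lookup is roughly: 

use File::Spec;

# Hypothetical sketch only: the real API derives the index location from
# request parameters; these names are placeholders.
sub resolve_index_dir {
    my (%args) = @_;
    return File::Spec->catdir( $args{base_dir}, $args{project}, $args{run_id} );
}

my $index_dir = resolve_index_dir(
    base_dir => '/data/indexes',
    project  => 'projA',
    run_id   => '20160916',
);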

However, I modified the code to fork a process to complete the actions. In that 
process the Searcher object gets created and destroyed, and I collect all 
search hits in an array that I process later. I did not use PolySearcher. This 
approach increases the speed. 
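Roughly, the pattern looks like the sketch below. This is only a sketch: 
Parallel::ForkManager stands in for the fork/collect plumbing, and the 
directory list, query string, field names, and num_wanted value are 
placeholders for what our APIs actually pass in. 

use strict;
use warnings;

use Lucy::Search::IndexSearcher;
use Parallel::ForkManager;

# Placeholders -- the real directory list and query come from our APIs.
my @index_dirs   = @ARGV;
my $query_string = 'foo';

my @all_hits;                              # collected in the parent
my $pm = Parallel::ForkManager->new(4);    # up to 4 concurrent children

# Gather each child's results as it exits.
$pm->run_on_finish( sub {
    my ( $pid, $exit_code, $ident, $signal, $core, $hits_ref ) = @_;
    push @all_hits, @{ $hits_ref || [] };
} );

for my $dir (@index_dirs) {
    $pm->start and next;    # parent moves on; child runs the body below

    # The searcher is created and destroyed entirely inside the child.
    my $searcher = Lucy::Search::IndexSearcher->new( index => $dir );
    my $hits     = $searcher->hits(
        query      => $query_string,
        num_wanted => 100,
    );

    my @results;
    while ( my $hit = $hits->next ) {
        # 'title' is a hypothetical field name; copy out whatever plain
        # fields the schema actually defines so they can be serialized
        # back to the parent.
        push @results, { score => $hit->get_score, title => $hit->{title} };
    }

    $pm->finish( 0, \@results );    # ship the array back to the parent
}
$pm->wait_all_children;

# @all_hits now holds the hits from every index, ready for later processing.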

Thanks,
Rajiv Gupta

-----Original Message-----
From: Nick Wellnhofer [mailto:wellnho...@aevum.de] 
Sent: Friday, September 16, 2016 3:51 PM
To: user@lucy.apache.org
Subject: Re: [lucy-user] Speed up Search with Lucy::Search::IndexSearcher and 
Lucy::Search::PolySearcher from multiple index folders

On 14/09/2016 09:05, Gupta, Rajiv wrote:
> I'm creating indexes on multiple subfolders under one parent folder.
>
> Indexes are created on multiple folders since files are getting created in 
> parallel and I want to avoid segment locking between multiple indexers.

> I did profiling using Devel::NYTProf (https://metacpan.org/pod/Devel::NYTProf) 
> and found two places where the maximum time was taken:
> 1.    While scanning the directory. (This I will try to solve by generating a 
> list of directories while the application is generating the indexes.)
> 2.    When creating the searchers using Lucy::Search::IndexSearcher. This 
> takes the most time when running in a loop over all indexed directories.

It sounds like you're working with an excessively large number of indices. 
Maybe you should simply rethink your approach and use a single index? If you're 
concerned about locking, maybe a separate indexing process with some kind of 
notification mechanism would help?

Nick
