Problem with locks when updating the data of a previously stored document

2004-09-16 Thread Paul Williams
Hi, Using lucene-1.4.1.jar on WinXP, I am having trouble with locking when updating an existing Lucene document. I delete the old document from the index and then add the new document via the index writer. I am using minMergeDocs set to 100 (much quicker!!) and close the writer once the…
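
For context, a minimal sketch of the delete-then-add update pattern against the Lucene 1.4 API. The index path, the "id" unique-key field, and the docId/newDoc arguments are illustrative assumptions; the key point is ordering, since holding an IndexReader and an IndexWriter open on the same index at once is what produces the lock error.

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.Term;

    public class UpdateDoc {
        public static void update(String indexPath, String docId, Document newDoc)
                throws java.io.IOException {
            // Delete through an IndexReader and close it BEFORE opening the
            // IndexWriter; otherwise the writer cannot obtain the lock.
            IndexReader reader = IndexReader.open(indexPath);
            reader.delete(new Term("id", docId)); // "id" = hypothetical unique-key field
            reader.close();

            IndexWriter writer = new IndexWriter(indexPath, new StandardAnalyzer(), false);
            writer.minMergeDocs = 100; // buffer more docs in RAM before merging (public field in 1.x)
            writer.addDocument(newDoc);
            writer.close(); // releases the write lock
        }
    }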

Getting a field value from a large indexed document is slow.

2004-05-14 Thread Paul Williams
Hi, I hope someone can help! I am using Lucene to make a searchable repository of electronic documents (MS Office, PDFs, etc.). Some of these documents can contain a large amount of text (about 500K in some cases), which is indexed to make it searchable. Doing the search and getting the…
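
Lucene 1.4 has no lazy field loading (Hits.doc(i) reads every stored field of the hit), so the usual workaround is to index the large body without storing it and to fetch the original text from disk when it is actually needed. A sketch under that assumption; the field names and the idea of keeping the file path as a stored field are illustrative:

    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;

    public class DocBuilder {
        public static Document build(String id, String path, String bigBody) {
            Document doc = new Document();
            doc.add(Field.Keyword("id", id));     // small, stored, searchable as-is
            doc.add(Field.Keyword("path", path)); // where the original file lives
            // Index the large body for searching but do NOT store it, so
            // retrieving a hit never drags ~500K of text off disk; re-read
            // the original text via "path" when you need to display it.
            doc.add(Field.UnStored("contents", bigBody));
            return doc;
        }
    }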

Not deleting temp files after updating/optimising.

2004-04-15 Thread Paul Williams
Hi, I am currently using Java 1.4.2_03 with Lucene 1.3 Final. I am using the option setUseCompoundFile(true), as I have a lot of fields in the database schema and it can cause the dreaded 'too many open files' error on Windows-based systems. (Optimising after 1000 documents with a merge factor of…
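
For reference, a sketch of the writer settings being described, per the Lucene 1.3 API (the index path is a placeholder). One likely cause of the lingering files: on Windows the old segment files cannot be deleted while an open IndexReader or Searcher still holds them, so they are normally cleaned up on a later open once all readers are closed.

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.index.IndexWriter;

    public class OptimiseIndex {
        public static void optimise(String indexPath) throws java.io.IOException {
            IndexWriter writer = new IndexWriter(indexPath, new StandardAnalyzer(), false);
            writer.setUseCompoundFile(true); // pack the many per-field files into one .cfs
            writer.mergeFactor = 10;         // public field in 1.3; lower = fewer open files
            writer.optimize();               // merges segments; obsolete files are removed
                                             // once no reader still references them
            writer.close();
        }
    }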

RE: status of AND as default logical operator for QueryParser

2004-02-18 Thread Paul Williams
Mats, I believe that when you call parser.parse with the Analyzer again, you reset the default-operator state. The following code should do what you want: QueryParser parser = new QueryParser(FIELD, service.getAnalyzer()); parser.setOperator(QueryParser.DEFAULT_OPERATOR_AND); Query…
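
The preview cuts off, but the underlying point is that the static QueryParser.parse(query, field, analyzer) constructs a fresh parser internally, so any earlier setOperator() call is lost; the instance parse(String) method keeps the setting. A self-contained sketch against the Lucene 1.4 API (field name and query text are placeholders):

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.queryParser.ParseException;
    import org.apache.lucene.queryParser.QueryParser;
    import org.apache.lucene.search.Query;

    public class AndParser {
        public static Query parseWithAnd(String field, String queryText)
                throws ParseException {
            // Use the INSTANCE parse(String); the static three-argument
            // parse would build a new parser and default back to OR.
            QueryParser parser = new QueryParser(field, new StandardAnalyzer());
            parser.setOperator(QueryParser.DEFAULT_OPERATOR_AND);
            return parser.parse(queryText);
        }
    }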

Error deleting a document when using the compound file index

2003-12-30 Thread Paul Williams
Hi, I am just testing Lucene 1.3 RC with the compound index option on. When I come to delete an existing document in the index in order to re-add an updated version, I get an 'unable to obtain lock' error: java.io.IOException: Lock obtain timed out at org.apache.lucene.store.Lock.obtain(Lock.java:97)…
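
This error usually means either that an IndexWriter is still open on the same index when the IndexReader tries to take the lock for the delete, or that a crash left a stale lock file behind. If, and only if, no other process is using the index, the stale lock can be cleared; a sketch against the Lucene 1.3 API, with the index path as a placeholder:

    import org.apache.lucene.index.IndexReader;
    import org.apache.lucene.store.Directory;
    import org.apache.lucene.store.FSDirectory;

    public class UnlockIndex {
        public static void unlockIfStale(String indexPath) throws java.io.IOException {
            Directory dir = FSDirectory.getDirectory(indexPath, false); // false = don't create
            // Only safe when you KNOW no other writer/reader is active,
            // e.g. after a crash left a lock file behind.
            if (IndexReader.isLocked(dir)) {
                IndexReader.unlock(dir);
            }
            dir.close();
        }
    }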