True, but the javadocs for the standard lock's implementation classes
also say they don't work:

http://java.sun.com/j2se/1.4.2/docs/api/java/io/File.html

Further, the SimpleFSLockFactory javadoc also clearly states that
locking does not work over NFS:

http://lucene.zones.apache.org:8080/hudson/job/Lucene-Nightly/javadoc/org/apache/lucene/store/SimpleFSLockFactory.html

So it appears we're in between a lock and a hard place...  (oh the 80's
sitcom humor)

Adding a config parameter sounds good too, but the new patch is no worse
than what exists in terms of javadoc warnings, and it has been shown to
actually fix what I would imagine is a rather standard configuration
(local disk, XP/RH).
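
For illustration only (no such option exists today, and the element name
and values below are made up), the config knob might look something like
this in solrconfig.xml:

  <mainIndex>
    <!-- hypothetical: pick which Lucene LockFactory implementation to use -->
    <lockType>simple</lockType>  <!-- or "native", "none", ... -->
  </mainIndex>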

- will

 

-----Original Message-----
From: Hoss Man (JIRA) [mailto:[EMAIL PROTECTED] 
Sent: Tuesday, May 15, 2007 4:27 PM
To: solr-dev@lucene.apache.org
Subject: [jira] Commented: (SOLR-240) java.io.IOException: Lock obtain
timed out: SimpleFSLock


    [ https://issues.apache.org/jira/browse/SOLR-240?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12496115 ]

Hoss Man commented on SOLR-240:
-------------------------------

the idea of using different lock implementations has come up in the
past, 

http://www.nabble.com/switch-to-native-locks-by-default--tf2967095.html

one reason not to hardcode native locks was that not all file systems
support them -- so we left in the usage of SimpleFSLock because it's the
most generally reusable.

rather than change from one hardcoded lock type to another hardcoded
lock type, we should support a config option for making the choice ...
perhaps even adding a SolrLockFactory that defines an init(NamedList)
method and creating simple Solr subclasses of all the core Lucene
LockFactory impls so it's easy for people to write their own if they
want (and we don't just have "if (lockType.equals("simple"))..." type
config parsing).
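
for concreteness, a rough sketch of the kind of thing described above --
none of these Solr classes exist yet, the NamedList package has moved
between Solr versions, and the "lockDir" argument is purely made up, so
treat this as a starting point rather than a patch:

import java.io.File;
import java.io.IOException;

import org.apache.lucene.store.Lock;
import org.apache.lucene.store.LockFactory;
import org.apache.lucene.store.SimpleFSLockFactory;
import org.apache.solr.util.NamedList;  // org.apache.solr.common.util in later versions

/** A LockFactory that can configure itself from solrconfig.xml arguments. */
public abstract class SolrLockFactory extends LockFactory {
  /** Called once, after construction, with the factory's config args. */
  public abstract void init(NamedList args) throws IOException;
}

/** Thin Solr wrapper that delegates to Lucene's SimpleFSLockFactory. */
class SolrSimpleFSLockFactory extends SolrLockFactory {
  private LockFactory delegate;

  public void init(NamedList args) throws IOException {
    // hypothetical "lockDir" arg; a real version would default to the index dir
    String lockDir = (String) args.get("lockDir");
    delegate = new SimpleFSLockFactory(new File(lockDir));
  }

  public Lock makeLock(String lockName) {
    return delegate.makeLock(lockName);
  }

  public void clearLock(String lockName) throws IOException {
    delegate.clearLock(lockName);
  }
}

the config could then name the factory class directly, instead of the
"if (lockType.equals(...))" style of parsing.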

> java.io.IOException: Lock obtain timed out: SimpleFSLock
> --------------------------------------------------------
>
>                 Key: SOLR-240
>                 URL: https://issues.apache.org/jira/browse/SOLR-240
>             Project: Solr
>          Issue Type: Bug
>          Components: update
>    Affects Versions: 1.2
>         Environment: windows xp
>            Reporter: Will Johnson
>         Attachments: IndexWriter.patch, stacktrace.txt, ThrashIndex.java
>
>
> when running the soon to be attached sample application against solr
> it will eventually die.  this same error has happened on both windows
> and rh4 linux.  the app is just submitting docs with an id in batches
> of 10, performing a commit, then repeating over and over again.

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
