[ https://issues.apache.org/jira/browse/SOLR-6065?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13996549#comment-13996549 ]

Jack Krupansky commented on SOLR-6065:
--------------------------------------

As a historical note, I had filed LUCENE-4104 and LUCENE-4105, as well as 
SOLR-3504 and SOLR-3505, to both document the per-index document limit and 
check against it in both Lucene and Solr.

I think Lucene should check against the limit, and then Solr should respond to 
that condition.
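
To sketch what I mean by "check against the limit": something along these 
lines could fail the add before the index is ever mutated. The class, the 
pendingNumDocs counter, and the reserveDocs name below are all illustrative, 
not the actual IndexWriter internals.

{code:java}
import java.util.concurrent.atomic.AtomicLong;

class DocLimitGuard {
  // 2147483647 is the hard cap reported by BaseCompositeReader
  // in the stack trace below.
  static final long MAX_DOCS = Integer.MAX_VALUE;

  private final AtomicLong pendingNumDocs = new AtomicLong();

  // Reserve room for numDocs new documents, failing *before*
  // anything is written to the index.
  void reserveDocs(int numDocs) {
    long newCount = pendingNumDocs.addAndGet(numDocs);
    if (newCount > MAX_DOCS) {
      // Roll back the reservation so a failed add leaves the index usable.
      pendingNumDocs.addAndGet(-numDocs);
      throw new IllegalArgumentException(
          "number of documents in the index cannot exceed " + MAX_DOCS);
    }
  }
}
{code}

The point of reserving up front, rather than checking after the fact, is that 
rejecting the add is cheap and reversible, while an over-full index is not.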

Two interesting use cases:

1. Deleted documents exist, so Solr should tell the user that "optimize" can 
resolve the problem.
2. No deleted documents exist, so Solr can only report that the document 
limit has been reached.
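
To make the distinction concrete, here's a minimal sketch of how Solr could 
pick between the two messages, assuming it has a handle on the underlying 
DirectoryReader (the helper class and method name are illustrative):

{code:java}
import org.apache.lucene.index.DirectoryReader;

class DocLimitMessages {
  static String limitReachedMessage(DirectoryReader reader) {
    if (reader.numDeletedDocs() > 0) {
      // Case 1: deleted documents still occupy doc IDs;
      // merging them away frees room.
      return "Document limit reached; the index contains "
          + reader.numDeletedDocs()
          + " deleted documents, so optimize may resolve the problem.";
    }
    // Case 2: every doc ID is live, so there is nothing to reclaim.
    return "Document limit reached; no deleted documents exist to reclaim.";
  }
}
{code}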

As an afterthought, maybe we should have a configurable Solr parameter for 
"maximum documents per shard", since anybody adding 2 billion documents to a 
shard is very likely to run into performance issues long before they approach 
the absolute maximum limit. I'd suggest a configurable Solr limit of around 
250 million. Alternatively, this configurable limit could simply trigger a 
(noisy) warning, or maybe it could be configurable as either a hard error or 
a soft warning.
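
Roughly what I have in mind, where the maxDocsPerShard parameter and the 
hard/soft switch are both invented names for illustration, not existing Solr 
configuration:

{code:java}
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

class ShardDocLimit {
  private static final Logger log = LoggerFactory.getLogger(ShardDocLimit.class);

  private final long maxDocsPerShard;  // e.g. 250_000_000 as suggested above
  private final boolean hardLimit;     // true = reject adds, false = noisy warning

  ShardDocLimit(long maxDocsPerShard, boolean hardLimit) {
    this.maxDocsPerShard = maxDocsPerShard;
    this.hardLimit = hardLimit;
  }

  // Called with the shard's current document count before accepting an add.
  void check(long currentMaxDoc) {
    if (currentMaxDoc < maxDocsPerShard) {
      return;
    }
    String msg = "Shard holds " + currentMaxDoc
        + " documents, exceeding the configured limit of " + maxDocsPerShard;
    if (hardLimit) {
      throw new IllegalStateException(msg);
    }
    log.warn(msg);
  }
}
{code}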


> Solr / IndexWriter should prevent you from adding docs if it creates an index 
> too big to open
> --------------------------------------------------------------------------------------------
>
>                 Key: SOLR-6065
>                 URL: https://issues.apache.org/jira/browse/SOLR-6065
>             Project: Solr
>          Issue Type: Bug
>            Reporter: Hoss Man
>
> yamazaki reported an error on solr-user where, on opening a new searcher, he 
> got an IAE from BaseCompositeReader because the numDocs was greater than 
> Integer.MAX_VALUE.
> I'm surprised that in a straightforward setup (ie: no "AddIndex" merging) 
> IndexWriter will even let you add more docs than max int.  We should 
> investigate if this makes sense and either add logic in IndexWriter to 
> prevent this from happening, or add logic to Solr's UpdateHandler to prevent 
> things from getting that far.
> ie: we should be failing to "add" too many documents, and leaving the index 
> usable -- not accepting the add and leaving the index in an unusable state.
> stack trace reported by user...
> {noformat}
> ERROR org.apache.solr.core.CoreContainer  – Unable to create core: collection1
> org.apache.solr.common.SolrException: Error opening new searcher
>     at org.apache.solr.core.SolrCore.<init>(SolrCore.java:821)
>     at org.apache.solr.core.SolrCore.<init>(SolrCore.java:618)
>     at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:949)
>     at org.apache.solr.core.CoreContainer.create(CoreContainer.java:984)
>     at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:597)
>     at org.apache.solr.core.CoreContainer$2.call(CoreContainer.java:592)
>     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>     at java.lang.Thread.run(Thread.java:662)
> Caused by: org.apache.solr.common.SolrException: Error opening new searcher
>     at org.apache.solr.core.SolrCore.openNewSearcher(SolrCore.java:1438)
>     at org.apache.solr.core.SolrCore.getSearcher(SolrCore.java:1550)
>     at org.apache.solr.core.SolrCore.<init>(SolrCore.java:796)
>     ... 13 more
> Caused by: org.apache.solr.common.SolrException: Error opening Reader
>     at org.apache.solr.search.SolrIndexSearcher.getReader(SolrIndexSearcher.java:172)
>     at org.apache.solr.search.SolrIndexSearcher.<init>(SolrIndexSearcher.java:183)
>     at org.apache.solr.search.SolrIndexSearcher.<init>(SolrIndexSearcher.java:179)
>     at org.apache.solr.core.SolrCore.openNewSearcher(SolrCore.java:1414)
>     ... 15 more
> Caused by: java.lang.IllegalArgumentException: Too many documents, composite IndexReaders cannot exceed 2147483647
>     at org.apache.lucene.index.BaseCompositeReader.<init>(BaseCompositeReader.java:77)
>     at org.apache.lucene.index.DirectoryReader.<init>(DirectoryReader.java:368)
>     at org.apache.lucene.index.StandardDirectoryReader.<init>(StandardDirectoryReader.java:42)
>     at org.apache.lucene.index.StandardDirectoryReader$1.doBody(StandardDirectoryReader.java:71)
>     at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:783)
>     at org.apache.lucene.index.StandardDirectoryReader.open(StandardDirectoryReader.java:52)
>     at org.apache.lucene.index.DirectoryReader.open(DirectoryReader.java:88)
>     at org.apache.solr.core.StandardIndexReaderFactory.newReader(StandardIndexReaderFactory.java:34)
>     at org.apache.solr.search.SolrIndexSearcher.getReader(SolrIndexSearcher.java:169)
>     ... 18 more
> {noformat}



