Re: Why does Solr sort on _docid_ with rows=0 ?

2020-03-05 Thread S G
Thanks Hoss. Yes, that Jira seems like a good one to fix.
And the variable name definitely does not explain why the query will not cause
any sort operation.

-SG

On Mon, Mar 2, 2020 at 10:06 AM Chris Hostetter wrote:

> : docid is the natural order of the posting lists, so there is no sorting
> : effort.
> : I expect that means “don’t sort”.
>
> basically yes, as documented in the comment right above the lines of code
> linked to.
>
> : > So no one knows this then?
> : > It seems like a good opportunity to get some performance!
>
> The variable name is really stupid, but the 'solrQuery' variable you see
> in the code is *only* ever used for 'checkAZombieServer()' ... which
> should only be called when a server hasn't been responding to other
> (user-initiated) requests.
>
> : >> I see a lot of such queries in my Solr 7.6.0 logs:
>
> If you are seeing a lot of those queries, then there are other problems in
> your cluster you should investigate -- that's when/why LBSolrClient does
> this query -- to see if the server is responding.
>
> : >> *path=/select
> : >> params={q=*:*&distrib=false&sort=_docid_+asc&rows=0&wt=javabin&version=2}
> : >> hits=287128180 status=0 QTime=7173*
>
> that is an abnormally large number of documents to have in a single shard.
>
> : >> If you want to check a zombie server, shouldn't there be a much less
> : >> expensive way to do a health-check instead?
>
> Probably yes -- I've opened SOLR-14298...
>
> https://issues.apache.org/jira/browse/SOLR-14298
>
>
>
> -Hoss
> http://www.lucidworks.com/
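
For anyone curious about the relative cost, here is a minimal SolrJ sketch
contrasting the two approaches. The first query reproduces the logged
zombie-check parameters; the ping alternative assumes the implicitly
registered /admin/ping handler (whose health-check query is configurable, so
its cost can be kept trivial). The core URL and class name are placeholders,
and this is only an illustration of the idea behind SOLR-14298, not the
actual patch:

    import org.apache.solr.client.solrj.SolrClient;
    import org.apache.solr.client.solrj.SolrQuery;
    import org.apache.solr.client.solrj.impl.HttpSolrClient;
    import org.apache.solr.client.solrj.response.SolrPingResponse;

    public class HealthCheckSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder URL; point this at your own core.
            try (SolrClient client =
                     new HttpSolrClient.Builder("http://localhost:8983/solr/mycore").build()) {

                // Roughly what LBSolrClient's zombie check sends: a match-all
                // query that returns no rows and skips sorting (_docid_ is
                // index order), but still counts every matching document
                // (hits=287128180 in the log above).
                SolrQuery zombieCheck = new SolrQuery("*:*");
                zombieCheck.setRows(0);
                zombieCheck.setSort("_docid_", SolrQuery.ORDER.asc);
                zombieCheck.set("distrib", "false"); // don't fan out to other shards
                System.out.println("zombie check QTime: "
                        + client.query(zombieCheck).getQTime());

                // A lighter liveness probe: /admin/ping only runs whatever
                // health-check query is configured for the ping handler.
                SolrPingResponse ping = client.ping();
                System.out.println("ping status: " + ping.getStatus());
            }
        }
    }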


SolrException in Solr 6.1.0

2020-03-05 Thread vishal patel
I got the below ERROR in the Solr 6.1.0 log:

2020-03-05 16:54:09.508 ERROR (qtp1239731077-468949) [c:workflows s:shard1 r:core_node1 x:workflows] o.a.s.h.RequestHandlerBase org.apache.solr.common.SolrException: Exception writing document id WF204878828_42970103 to the index; possible analysis error.
    at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:181)
    at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:68)
    at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:48)
    at org.apache.solr.update.processor.DistributedUpdateProcessor.doLocalAdd(DistributedUpdateProcessor.java:939)
    at org.apache.solr.update.processor.DistributedUpdateProcessor.versionAdd(DistributedUpdateProcessor.java:1094)
    at org.apache.solr.update.processor.DistributedUpdateProcessor.processAdd(DistributedUpdateProcessor.java:720)
    at org.apache.solr.update.processor.LogUpdateProcessorFactory$LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:103)
    at org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:97)
    at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:179)
    at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readIterator(JavaBinUpdateRequestCodec.java:135)
    at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:274)
    at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readNamedList(JavaBinUpdateRequestCodec.java:121)
    at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:239)
    at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:157)
    at org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:186)
    at org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:107)
    at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:54)
    at org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:97)
    at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:69)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:156)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:2036)
    at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:657)
    at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:464)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:257)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:208)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:581)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1160)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1092)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
    at org.eclipse.jetty.server.Server.handle(Server.java:518)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:308)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:244)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
    at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
    at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:246)
    at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:156)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.lucene.store.AlreadyClosedException: this IndexWriter is closed
    at org.apache.lucene.index.IndexWriter.ensureOpen(IndexWriter.java:724)
    at org.apache.lucene.index.IndexWriter.ensureOpen(IndexWriter.java:738)
    at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1488)
    at

Slow queries until core is reindexed

2020-03-05 Thread dbourassa
Hi all,

We have a Solr 8.4.1 server running on Windows. (Very simple setup.)
16 GB RAM / JVM heap set to 4 GB
Solr hosts 4 cores. (2 GB + 1 GB + 75 MB + 75 MB)
Full data import every night. No delta import.
This server is used for tests by 2 people. (Very low request rate.)

We have an issue we don't understand:
The average response time for search queries is < 10ms.
Sometimes the response time slows down considerably (>1000ms) for all queries,
but just for 1 core.
All queries continue to be slow until we reindex the core with a full data
import.
After that, response times go back under 10ms for this core, but another core
begins to slow down.
We cannot operate all 4 cores with the expected response time of <10ms at
the same time.

What could be the cause of this issue?

Thanks,
Dany




--
Sent from: https://lucene.472066.n3.nabble.com/Solr-User-f472068.html


Re: "SolrCore Initialization Failures" in the Solr's current UI, but not in the original UI

2020-03-05 Thread Jan Høydahl
v6.6.0 is from 2017 and not supported anymore. You are really encouraged to 
upgrade!

Are you by any chance using Internet Explorer? (See 
https://stackoverflow.com/questions/56262704/solr-solrcore-initialization-failures-core-error)

Jan

> On 5 Mar 2020, at 22:03, Ryan W wrote:
> 
> Hi all,
> 
> On my dev server, in the Solr admin UI, I see...
> 
>SolrCore Initialization Failures
>{{core}}: {{error}}
>Please check your logs for more information
>{{exception.msg}}
> 
> These appear to be template variables, but they are never populated.  They
> just dump to the screen.  There is nothing useful in the log.  If I click
> over to the "original UI" via the link in the upper right corner of the
> screen, there is no error displayed and all seems to be well.  Meanwhile,
> Apache Solr seems to be functioning OK.  So this does not appear to be a
> significant problem, but still a little annoying.
> 
> I am trying to get the dev server to perform the same way as the staging
> server, but at least in the "current UI," I get these errors in my dev
> server.  I have tried to compare the two installs very carefully, but
> can't find any discrepancy between staging and dev.
> 
> Both servers run...
> Solr 6.6.6
> RHEL 7
> 
> Thank you,
> Ryan



"SolrCore Initialization Failures" in the Solr's current UI, but not in the original UI

2020-03-05 Thread Ryan W
Hi all,

On my dev server, in the Solr admin UI, I see...

SolrCore Initialization Failures
{{core}}: {{error}}
Please check your logs for more information
{{exception.msg}}

These appear to be template variables, but they are never populated.  They
just dump to the screen.  There is nothing useful in the log.  If I click
over to the "original UI" via the link in the upper right corner of the
screen, there is no error displayed and all seems to be well.  Meanwhile,
Apache Solr seems to be functioning OK.  So this does not appear to be a
significant problem, but still a little annoying.

I am trying to get the dev server to perform the same way as the staging
server, but at least in the "current UI," I get these errors in my dev
server.  I have tried to compare the two installs very carefully, but
can't find any discrepancy between staging and dev.

Both servers run...
Solr 6.6.6
RHEL 7

Thank you,
Ryan


Re: Problem with Solr 7.7.2 after OOM

2020-03-05 Thread Jörn Franke
Just keep in mind that total memory should be much more than the heap, so that
the OS can cache Solr's index files. If you have an 8 GB heap, it probably makes
sense to have at least 16 GB of total memory available on the machine: the
remaining ~8 GB is then free to serve as page cache for the index.

> On 05.03.2020 at 16:58, Walter Underwood wrote:
> 
> 
>> 
>> On Mar 5, 2020, at 4:29 AM, Bunde Torsten  wrote:
>> 
>> -Xms512m -Xmx512m 
> 
> Your heap is too small. Set this to -Xms8g -Xmx8g
> 
> In solr.in.sh, that looks like this:
> 
> SOLR_HEAP=8g
> 
> wunder
> Walter Underwood
> wun...@wunderwood.org
> http://observer.wunderwood.org/  (my blog)
> 


Re: Problem with Solr 7.7.2 after OOM

2020-03-05 Thread Walter Underwood
> On Mar 5, 2020, at 4:29 AM, Bunde Torsten  wrote:
> 
>  -Xms512m -Xmx512m 

Your heap is too small. Set this to -Xms8g -Xmx8g

In solr.in.sh, that looks like this:

SOLR_HEAP=8g

wunder
Walter Underwood
wun...@wunderwood.org
http://observer.wunderwood.org/  (my blog)



Problem with Solr 7.7.2 after OOM

2020-03-05 Thread Bunde Torsten
Hello @all,

I've had a problem with Solr (version 7.7.2) for a few days now.

On March 4th I got an OOM, and the log file just says the following:

Running OOM killer script for process 12351 for Solr on port 8983
Killed process 12351

So I started the service again, and it is running fine. But if I try to search
for something, I only get an error that the string I searched for wasn't found.
Next, I tried to check the status of the process, which only gives me the
following error message:

./solr status

Found 1 Solr nodes:

Solr process 1519 running on port 8983

ERROR: Failed to get system information from http://localhost:8983/solr due to:
org.apache.http.client.ClientProtocolException: Expected JSON response from server but received:

Error 404 Not Found

HTTP ERROR 404
Problem accessing /solr/admin/info/system. Reason:

    Not Found

Caused by: javax.servlet.ServletException: javax.servlet.UnavailableException: Error processing the request. CoreContainer is either not initialized or shutting down.
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
    at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
    at org.eclipse.jetty.server.Server.handle(Server.java:502)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:260)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
    at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
    at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:126)
    at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:366)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: javax.servlet.UnavailableException: Error processing the request. CoreContainer is either not initialized or shutting down.
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:359)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:341)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1588)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1345)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1557)
    at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
    at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)
    ... 17 more

Caused by: javax.servlet.UnavailableException: Error processing the request. CoreContainer is either not initialized or shutting down.
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:359)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:341)
    at

Re: Solr Search cause high CPU with **

2020-03-05 Thread Jan Høydahl
Hi

The Solr query parser has special handling for a single asterisk *, which is
rewritten as a «MatchAllDocs» query, the same as *:*. So when you send ** you
probably explicitly invoke the wildcard feature, which enumerates all terms in
the dictionary.
I don't see what you are trying to achieve with the ** term in your query.
If I had to guess, it is some automatic query-generator code that somehow
generates this by mistake?

Jan
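
If the asterisks in user input are meant literally rather than as wildcard
syntax, escaping them before building the query sidesteps the term enumeration
entirely. Here is a minimal SolrJ sketch along the lines of the escaping advice
quoted below; the class name and sample input are placeholders, and
ClientUtils.escapeQueryChars is SolrJ's helper for backslash-escaping query
metacharacters:

    import org.apache.solr.client.solrj.util.ClientUtils;

    public class EscapeSketch {
        public static void main(String[] args) {
            // Placeholder input, modeled on the query from this thread.
            String userInput = "N*W* 154 ** underpass";

            // Escape each whitespace-separated token so that * (and other
            // special characters) are treated as literal text instead of
            // wildcard syntax.
            StringBuilder q = new StringBuilder();
            for (String token : userInput.split("\\s+")) {
                if (q.length() > 0) {
                    q.append(' ');
                }
                q.append(ClientUtils.escapeQueryChars(token));
            }

            // Prints: N\*W\* 154 \*\* underpass
            // The asterisks are now literal characters, so the wildcard code
            // path (and its term enumeration) is never triggered.
            System.out.println(q);
        }
    }

Whether a literal ** then matches anything depends on the field's analysis
chain, but the expensive wildcard expansion is avoided either way.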

> On 4 Mar 2020, at 16:59, Shreyas Kothiya wrote:
> 
> Thank you for the quick response.
> 
> We do use wildcard searches. I am sorry if I have not asked my question
> correctly.
> 
> In Solr, if you search with consecutive asterisks [**], it does not like it:
> it causes high CPU and does not return any results.
> 
> q= N*W* 154 ** underpass was converted to 
> key=((text:N*W*+OR+text:154+OR+text:**+OR+text:underpass))
> 
> After testing in our lower environment, it appears it was caused by the
> consecutive asterisks in the search term above.
> I was able to reproduce it by just searching q=**
> 
>> On Mar 3, 2020, at 10:50 PM, em...@yeikel.com wrote:
>> 
> >> According to the documentation, the standard query parser uses asterisks to
> >> do wildcard searches[1]. If you do not need to do wildcard queries, and what
> >> you are trying to do is to use the asterisks as a search term, you should
> >> escape them[2].
>> 
>> [1]
>> https://lucene.apache.org/solr/guide/6_6/the-standard-query-parser.html#TheS
>> tandardQueryParser-WildcardSearches
>> [2]
>> https://lucene.apache.org/solr/guide/6_6/the-standard-query-parser.html#TheS
>> tandardQueryParser-EscapingSpecialCharacters 
>> 
>> -Original Message-
>> From: Shreyas Kothiya  
>> Sent: Tuesday, March 3, 2020 11:43 AM
>> To: solr-user@lucene.apache.org
>> Subject: Solr Search cause high CPU with **
>> 
>> Hello
>> 
> >> We are using Solr 6.6. We recently noticed an unusual CPU spike while one of
> >> our customers did a search with ** in the query string.
>> 
> >> We had a customer who searched for the following term:
>> 
>> q = N*W* 154 ** underpass
>> 
> >> It caused CPU to go above 80%; our normal CPU range is around 20%.
> >> I wanted to know a few things.
>> 
> >> 1. What does ** mean in a Solr search?
> >> 2. Is there a bug already filed for this issue?
>> 
>> 
>> Please let me know if you need more information.
>> 
>> Thanks
>> Shreyas Kothiya
>> 
>