Re: Unable to create core in Solr 8.6.0

2020-08-09 Thread Chris Larsson
Just to provide a little closure, it appears that this issue is fixed in
Java 14.0.2.

Chris
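
For anyone verifying this on their own JDK, the failing round-trip can be checked with a few lines of plain java.time. This is a sketch, not Solr's actual code path: the pattern and Locale.ROOT below are illustrative assumptions, chosen only to mirror in spirit what validateFormatter does.

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class DateRoundTrip {
    public static void main(String[] args) {
        // Which locale-data providers are active; the workaround sets
        // -Djava.locale.providers=JRE,SPI (prints null when unset).
        System.out.println("java.locale.providers = "
                + System.getProperty("java.locale.providers"));

        // Round-trip check similar in spirit to Solr's validateFormatter:
        // format "now" with the formatter, then parse the text back.
        DateTimeFormatter f = DateTimeFormatter
                .ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX", Locale.ROOT)
                .withZone(ZoneOffset.UTC);
        String text = f.format(Instant.now());
        Instant parsed = f.parse(text, Instant::from);
        System.out.println(text + " -> " + parsed);
    }
}
```

On an affected JDK the parse step is where the NullPointerException surfaced; on a fixed JDK the round-trip completes.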

On Mon, Jul 27, 2020 at 5:38 PM Chris Larsson  wrote:

> Nice... That's the code I was just looking at.
>
>
> https://github.com/apache/lucene-solr/blob/branch_8_6/solr/core/src/java/org/apache/solr/update/processor/ParseDateFieldUpdateProcessorFactory.java
>
> From the stack trace:
> Caused by: java.time.format.DateTimeParseException: Text
> '2020-07-27T18:02:42.069Z' could not be parsed: null
> at
> java.base/java.time.format.DateTimeFormatter.createError(DateTimeFormatter.java:2021)
> at
> java.base/java.time.format.DateTimeFormatter.parse(DateTimeFormatter.java:1924)
> at
> org.apache.solr.update.processor.ParseDateFieldUpdateProcessorFactory.parseInstant(ParseDateFieldUpdateProcessorFactory.java:233)
> at
> org.apache.solr.update.processor.ParseDateFieldUpdateProcessorFactory.validateFormatter(ParseDateFieldUpdateProcessorFactory.java:217)
> ... 56 more
> Caused by: java.lang.NullPointerException
> at
> java.base/java.time.format.DateTimeFormatterBuilder$PrefixTree.prefixLength(DateTimeFormatterBuilder.java:4538)
> at
> java.base/java.time.format.DateTimeFormatterBuilder$PrefixTree.add0(DateTimeFormatterBuilder.java:4407)
>
>
> It seems like the DateTimeFormatter.parse call fails when invoked from
> line 233 of ParseDateFieldUpdateProcessorFactory.java.  But if
> I understand correctly, which may be a stretch, the formatter itself is
> created earlier in the code, at line 189.  I was concerned that the
> formatter creation was the problem, since the fix seems to be to impose a
> locale using java.locale.providers=JRE,SPI.  If I read
> https://www.oracle.com/java/technologies/javase/jdk14-suported-locales.html#providers
> correctly, JRE = COMPAT, and COMPAT "Represents the locale sensitive services
> that are compatible with the prior JDK releases up to JDK 8."  Is it
> possible that the formatter being created is not compatible with Java 14,
> and that's why setting the locale data back to JDK 8 behavior makes it work?
>
> Again I thank you for your responses.
>
> C
>
> On Mon, Jul 27, 2020 at 5:13 PM Erick Erickson 
> wrote:
>
>> I just remembered the issue and relayed the JIRA ;).
>>
>> As to why I suspect the Java issue, if this was a core Solr issue I’d
>> expect it to be a show stopper,
>> Solr isn’t much good if you can’t create cores. Of course weirder things
>> have happened. Plus,
>> the date is fine: "2020-07-27T18:02:42.069Z"
>>
>> If you have the bandwidth, I’d log all the inputs in
>> ParseDateFieldUpdateProcessor.validateFormatter,
>> which looks at a glance to be where the exception is caught. The code
>> looks like this:
>>
>> public static void validateFormatter(DateTimeFormatter formatter) {
>>   // check it's valid via round-trip
>>   try {
>>     parseInstant(formatter, formatter.format(Instant.now()),
>>         new ParsePosition(0));
>>   } catch (Exception e) {
>>     throw new SolrException(SolrException.ErrorCode.SERVER_ERROR,
>>         "Bad or unsupported pattern: " + formatter.toFormat().toString(), e);
>>   }
>> }
>>
>> I suspect the message is a bit misleading and the formatter in that call
>> is null or some such. Why I
>> haven’t a clue.
>>
>> Best,
>> Erick
>>
>>
>> > On Jul 27, 2020, at 3:54 PM, Chris Larsson 
>> wrote:
>> >
>> > Thank you Erick.  I appreciate you taking the time to respond.  I have
>> seen
>> > the post you shared and I know the work around does solve the issue.  I
>> am
>> > more concerned about why Solr doesn't work with the default settings.
>> I am
>> > also not convinced it is solely a Java issue, but I would like to
>> > investigate that further so I ask what makes you say it is a Java issue
>> as
>> > opposed to being a Solr issue?  I see David's comment, but he seems to
>> be
>> > approaching the solution as an end user as opposed to a Solr developer.
>> >
>> >
>> > On Mon, Jul 27, 2020 at 3:43 PM Erick Erickson > >
>> > wrote:
>> >
>> >> Take a look at:
>> >>
>> >> https://issues.apache.org/jira/browse/SOLR-13606
>> >>
>> >> It’s actually a weirdness with Java. In that JIRA David Smiley
>> >> suggests a way to deal with it, but I confess I haven’t a clue
>> >> about the nuances there.
>> >>
>> >> Best,
>> >> Erick
>> >>
>> >>
>> >>
>> >>> On Jul 27, 2020, at 3:12 PM, Chris Larsson 
>> wrote:
>> >>>
>> >>> Ran into an issue attempting to create a core on a new install of Solr
>> >>> 8.6.  The system is CentOS 8 fully updated and using Java OpenJDK
>> version
>> >>> 14.0.1.
>> >>>
>> >>> # java -version
>> >>> openjdk version "14.0.1" 2020-04-14
>> >>> OpenJDK Runtime Environment 20.3 (build 14.0.1+7)
>> >>> OpenJDK 64-Bit Server VM 20.3 (build 14.0.1+7, mixed mode, sharing)
>> >>>
>> >>> Attempting to create a core from the command line results in the
>> >> following
>> >>> messages:
>> >>>
>> >>> # cd /opt/solr;  sudo -u solr bin/solr create_core -c foocore
>> >>> WARNING: Using _default configset with data driven schema
>> functionality.
>> 

Re: Can't create collections with Drupal 8 configset

2020-08-09 Thread Shawn Heisey

On 8/9/2020 8:11 AM, Shane Brooks wrote:

Thanks Shawn. The way we have it configured presently is as follows:
the icu4j jar is located at /opt/solr/contrib/analysis-extras/lib/icu4j-62.1.jar

solrconfig.xml contains:



Which should load the jar at startup, correct?


I do not know if that path spec is right or not.  It might be.

The class that doesn't load (in your error message) is not located in 
the icu4j jar.  It is located in the lucene-analyzers-icu-X.Y.Z.jar 
file, which is found in the contrib/analysis-extras/lucene-libs 
subdirectory.  That jar also needs the icu4j jar.


If the same class is loaded more than once, it probably won't work.  I 
know for sure from experience that this is the case for the Lucene ICU 
classes.  That's the biggest reason I use the ${solr.home}/lib directory 
-- so I am sure that each extra jar is only loaded once.  That directory 
does not exist until you create it.
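
A sketch of the ${solr.home}/lib approach Shawn describes. The install and home paths are assumptions (commonly /opt/solr and /var/solr/data); throwaway directories stand in for them here so the commands run anywhere.

```shell
# Stand-ins for the real locations (e.g. /opt/solr and the solr home).
SOLR_DIST=$(mktemp -d)
SOLR_HOME=$(mktemp -d)

# Simulate the contrib jars shipped in the Solr download.
mkdir -p "$SOLR_DIST/contrib/analysis-extras/lucene-libs" \
         "$SOLR_DIST/contrib/analysis-extras/lib"
touch "$SOLR_DIST/contrib/analysis-extras/lucene-libs/lucene-analyzers-icu-8.6.0.jar" \
      "$SOLR_DIST/contrib/analysis-extras/lib/icu4j-62.1.jar"

# Create the lib directory under the solr home; it does not exist by default.
mkdir -p "$SOLR_HOME/lib"

# Copy both the Lucene ICU jar and the icu4j jar it depends on,
# so each is loaded exactly once, for all cores.
cp "$SOLR_DIST"/contrib/analysis-extras/lucene-libs/*.jar \
   "$SOLR_DIST"/contrib/analysis-extras/lib/*.jar "$SOLR_HOME/lib/"

ls "$SOLR_HOME/lib"
```

Restart Solr after copying so the new jars are picked up.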


Thanks,
Shawn


Production Issue: TIMED_WAITING - Will net.ipv4.tcp_tw_reuse=1 help?

2020-08-09 Thread Doss
Hi,

We are running a 3-node SOLR (8.3.1, NRT) cluster + a 3-node Zookeeper ensemble; now and
then we are facing "Max requests queued per destination 3000 exceeded for
HttpDestination"

After a restart everything starts working fine until the next problem. Once
a problem occurs we are seeing so many TIMED_WAITING threads

Server 1:
   *7722* threads are in TIMED_WAITING
("lock":"java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@151d5f2f")
Server 2:
   *4046* threads are in TIMED_WAITING
("lock":"java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@1e0205c3")
Server 3:
   *4210* threads are in TIMED_WAITING
("lock":"java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@5ee792c0")

Please suggest whether net.ipv4.tcp_tw_reuse=1 will help, or how we can
increase the 3000 limit.
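
As general Linux background (not Solr-specific advice): tcp_tw_reuse only affects outgoing connections and is usually considered safe, whereas tcp_tw_recycle is known to break clients behind NAT and was removed entirely in Linux 4.12. If you do try it, a sysctl fragment would look like this (file name is illustrative):

```
# /etc/sysctl.d/99-tcp-tw.conf
net.ipv4.tcp_tw_reuse = 1
```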

Sorry, since I haven't received any response to my previous query, I am
creating this as a new thread.

Thanks,
Mohandoss.


Re: HttpSolrClient Connection Evictor

2020-08-09 Thread Shawn Heisey

On 8/9/2020 2:46 AM, Srinivas Kashyap wrote:

We are using HttpSolrClient(solr-solrj-8.4.1.jar) in our app along with 
required jar(httpClient-4.5.6.jar). Before that we upgraded these jars from 
(solr-solrj-5.2.1.jar) and (httpClient-4.4.1.jar).

After we upgraded, we are seeing lot of below connection evictor statements in 
log file.

DEBUG USER_ID - STEP 2020-08-09 13:59:33,085 [Connection evictor] - Closing 
expired connections
DEBUG USER_ID - STEP 2020-08-09 13:59:33,085 [Connection evictor] - Closing 
connections idle longer than 5 MILLISECONDS


These logs are coming from the HttpClient library, not from SolrJ. 
Those actions appear to be part of HttpClient's normal operation.


It is entirely possible that the older version of the HttpClient library 
doesn't create these debug-level log entries.  That library is managed 
by a separate project under the Apache umbrella -- we here at the Solr 
project are not involved with it.


The solution here is to change your logging level.  You can either 
change the level of the main logger to something like INFO or WARN, or 
you can reduce the logging level of the HttpClient classes without 
touching the rest.  I do not know what logging system you are using.  If 
you need help with how to configure it, the people who made the logging 
system are much better equipped to configure it than we are.


I personally would change the default logging level for the whole 
program.  Those messages are logged at the DEBUG level.  Running an 
application with all loggers set that low should only be done when 
debugging a problem ... that level is usually far too verbose for a 
production system.  I do not recommend it at all.


If you choose to only change the level of the HttpClient classes, those 
loggers all start with "org.apache.http" which you will need for your 
logging configuration.
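
Assuming Log4j2 (adjust for whatever logging framework the app actually uses), the per-package override Shawn describes might look like this in log4j2.xml; the "console" appender name is an assumption:

```xml
<Loggers>
  <!-- Quiet the HttpClient connection-evictor chatter without touching the rest. -->
  <Logger name="org.apache.http" level="warn"/>
  <Root level="info">
    <AppenderRef ref="console"/>
  </Root>
</Loggers>
```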


An additional note:  Your code should *not* create and close the 
HttpSolrClient for every query as you have done.  The HttpSolrClient 
object should be created once and re-used for the life of the program.
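
The create-once pattern, sketched here with the JDK's built-in java.net.http.HttpClient as a stand-in for HttpSolrClient so the example is stdlib-only and self-contained; with SolrJ you would hold a single HttpSolrClient the same way and close it once at shutdown:

```java
import java.net.http.HttpClient;

public class SearchClientHolder {
    // One client for the life of the program,
    // instead of build-and-close per query.
    private static final HttpClient CLIENT = HttpClient.newHttpClient();

    public static HttpClient client() {
        return CLIENT;
    }
}
```

Every caller then uses SearchClientHolder.client() rather than constructing its own instance.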


Thanks,
Shawn


Re: java.util.concurrent.RejectedExecutionException: Max requests queued per destination 3000 exceeded for HttpDestination

2020-08-09 Thread Doss
Hi,

Sorry, actually the issue was not fixed; on that day the service
restart cleared the TIMED_WAITING threads. The problem re-occurred recently:
everything works fine until a disturbance happens to the cluster due to
high load or a node responding slowly.

We have 3 servers, with the following numbers of threads in TIMED_WAITING:

Server 1:
   *7722* threads are in TIMED_WAITING
("lock":"java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@151d5f2f")
Server 2:
   *4046* threads are in TIMED_WAITING
("lock":"java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@1e0205c3")
Server 3:
   *4210* threads are in TIMED_WAITING
("lock":"java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject@5ee792c0")

Please help in resolving this issue.

Thanks,
Doss

On Wed, Aug 5, 2020 at 11:30 PM Doss  wrote:

> Hi All,
>
> We have tried the system variable recommend here
>
> https://www.eclipse.org/jetty/documentation/current/high-load.html
>
> The errors stopped now, and system is running stable.
>
> Is there any disadvantage using the following?
>
> sysctl -w net.ipv4.tcp_tw_recycle=1
>
>
> Somewhere I read that tw_reuse will also help. Can we use both? Please suggest.
>
> Thanks,
> Doss.
>
> On Sunday, August 2, 2020, Doss  wrote:
>
>> Hi All,
>>
>> We are running SOLR (8.3.1) Cloud (NRT) with a Zookeeper ensemble, 3 nodes
>> each, on CentOS VMs.
>>
>> Each SOLR node has 66 GB RAM, a 15 GB heap, and 4 CPUs.
>> Record count: 33L (3.3 million). Average doc size is 350 KB.
>>
>> Recently, while doing a full import (a few bulk data fields are computed
>> on a daily basis), we are getting this error. What could be the problem? How
>> do we increase the queue size? Please help.
>>
>> 2020-08-02 08:10:30.847 ERROR
>> (updateExecutor-5-thread-190288-processing-x:userinfoindex_6jul20_shard4_replica_n13
>> r:core_node24 null n:172.29.3.23:8983_solr c:userinfoindex_6jul20
>> s:shard4) [c:userinfoindex_6jul20 s:shard4 r:core_node24
>> x:userinfoindex_6jul20_shard4_replica_n13]
>> o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling
>> SolrCmdDistributor$Req: cmd=add{,id=1081904963}; node=ForwardNode:
>> http://172.29.3.23:8983/solr/userinfoindex_6jul20_shard3_replica_n10/ to
>> http://172.29.3.23:8983/solr/userinfoindex_6jul20_shard3_replica_n10/ =>
>> java.io.IOException: java.util.concurrent.RejectedExecutionException: Max
>> requests queued per destination 3000 exceeded for HttpDestination[
>> http://172.29
>> .3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448
>> [c=4/4,b=4,m=0,i=0]
>> java.io.IOException: java.util.concurrent.RejectedExecutionException: Max
>> requests queued per destination 3000 exceeded for HttpDestination[
>> http://172.29
>> .3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448
>> [c=4/4,b=4,m=0,i=0]
>> Suppressed: java.io.IOException:
>> java.util.concurrent.RejectedExecutionException: Max requests queued per
>> destination 3000 exceeded for HttpDestination[http://172.29
>> .3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448
>> [c=4/4,b=4,m=0,i=0]
>> Caused by: java.util.concurrent.RejectedExecutionException: Max requests
>> queued per destination 3000 exceeded for HttpDestination[http://172.29
>> .3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448
>> [c=4/4,b=4,m=0,i=0]
>> Caused by: java.util.concurrent.RejectedExecutionException: Max requests
>> queued per destination 3000 exceeded for HttpDestination[http://172.29
>> .3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448
>> [c=4/4,b=4,m=0,i=0]
>>
>> 2020-08-02 08:10:30.911 ERROR
>> (updateExecutor-5-thread-190288-processing-x:userinfoindex_6jul20_shard4_replica_n13
>> r:core_node24 null n:172.29.3.23:8983_solr c:userinfoindex_6jul20
>> s:shard4) [c:userinfoindex_6jul20 s:shard4 r:core_node24
>> x:userinfoindex_6jul20_shard4_replica_n13]
>> o.a.s.u.ErrorReportingConcurrentUpdateSolrClient Error when calling
>> SolrCmdDistributor$Req: cmd=add{,id=1034918151}; node=ForwardNode:
>> http://172.29.3.23:8983/solr/userinfoindex_6jul20_shard1_replica_n3/ to
>> http://172.29.3.23:8983/solr/userinfoindex_6jul20_shard1_replica_n3/ =>
>> java.io.IOException: java.util.concurrent.RejectedExecutionException: Max
>> requests queued per destination 3000 exceeded for HttpDestination[
>> http://172.29
>> .3.23:8983]@5a9a216b,queue=3000,pool=MultiplexConnectionPool@545dc448
>> [c=4/4,b=4,m=0,i=0]
>>
>>
>> Thanks,
>> Doss.
>>
>


Re: Solr Down Issue

2020-08-09 Thread sanjay dutt
It could be the OOM killer. Try to monitor its heap; there is also a script in
bin which basically kills Solr when an OOM occurs.

Sent from Yahoo Mail on Android

On Sun, Aug 9, 2020 at 8:14 PM, Ben wrote:

Can you send solr logs?

Best,
Ben

On Sun, Aug 9, 2020, 9:55 AM Rashmi Jain  wrote:

> Hello Team,
>
>                I am Rashmi Jain; I implemented Solr on one of our sites,
> bookswagon.com. For the last 2-3 months we have been facing a
> strange issue: Solr goes down suddenly without any interruption. We checked
> the Solr logs and also the application logs, but found no clue there
> regarding this.
>                We have implemented Solr 7.4 on Java SE 10 and have indexed
> data for around 28 million books.
>                We are also running Solr on Windows Server 2012 Standard
> with 32 GB RAM.
>                Please help us with this.
>
> Regards,
> Rashmi
>
>
>


Re: Solr Down Issue

2020-08-09 Thread Ben
Can you send solr logs?

Best,
Ben

On Sun, Aug 9, 2020, 9:55 AM Rashmi Jain  wrote:

> Hello Team,
>
> I am Rashmi Jain; I implemented Solr on one of our sites,
> bookswagon.com. For the last 2-3 months we have been facing a
> strange issue: Solr goes down suddenly without any interruption. We checked
> the Solr logs and also the application logs, but found no clue there
> regarding this.
> We have implemented Solr 7.4 on Java SE 10 and have indexed
> data for around 28 million books.
> We are also running Solr on Windows Server 2012 Standard
> with 32 GB RAM.
> Please help us with this.
>
> Regards,
> Rashmi
>
>
>


RE: Can't create collections with Drupal 8 configset

2020-08-09 Thread Shane Brooks
Thanks Shawn. The way we have it configured presently is as follows:
the icu4j jar is located at /opt/solr/contrib/analysis-extras/lib/icu4j-62.1.jar

solrconfig.xml contains:



Which should load the jar at startup, correct?
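
(The angle-bracket markup above was stripped somewhere in transit. A typical `<lib>` directive of this kind looks like the following; the dir and regex here are illustrative only, since the actual values from the original mail were lost:)

```xml
<lib dir="${solr.install.dir:../../../..}/contrib/analysis-extras/lib"
     regex="icu4j-.*\.jar"/>
```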


Is there anything significant about the mime-type error?

Expected mime type application/octet-stream but got text/html

Thanks,
Shane



-Original Message-
From: Shawn Heisey  
Sent: Sunday, August 9, 2020 3:00 AM
To: solr-user@lucene.apache.org
Subject: Re: Can create collections with Drupal 8 configset

On 8/8/2020 10:31 PM, Shane Brooks wrote:
> org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException:
> Error from server at http://192.168.xx.xx:8983/solr: Expected mime type
> application/octet-stream but got text/html.
> HTTP ERROR 500: Problem accessing /solr/admin/cores. Reason: Server Error
> Caused by: java.lang.NoClassDefFoundError:
> org/apache/lucene/collation/ICUCollationKeyAnalyzer
>
>
> I haven't found anything on Google except that ICUCollationKeyAnalyzer 
> depends on the icu4j library, which I verified is part of the SOLR 
> package.

The icu4j library and the related Lucene jars for ICU capability are not 
part of Solr by default.  They can be found in the "contrib" in the Solr 
download ... but you must add the jars to Solr if you intend to use any 
contrib capability.

The best way I have found to add custom jars to Solr is to create a 
"lib" directory under the location designated as the solr home and place 
the jars there.  All jars found in that directory will be automatically 
loaded and will be available to all cores.

An alternate way to load jars is with the <lib> directive in 
solrconfig.xml ... but I don't recommend that approach.

Thanks,
Shawn



Solr Down Issue

2020-08-09 Thread Rashmi Jain
Hello Team,

I am Rashmi Jain; I implemented Solr on one of our sites,
bookswagon.com. For the last 2-3 months we have been facing a
strange issue: Solr goes down suddenly without any interruption. We checked the
Solr logs and also the application logs, but found no clue there regarding this.
We have implemented Solr 7.4 on Java SE 10 and have indexed data
for around 28 million books.
We are also running Solr on Windows Server 2012 Standard with
32 GB RAM.
Please help us with this.

Regards,
Rashmi




HttpSolrClient Connection Evictor

2020-08-09 Thread Srinivas Kashyap
Hello,

We are using HttpSolrClient(solr-solrj-8.4.1.jar) in our app along with 
required jar(httpClient-4.5.6.jar). Before that we upgraded these jars from 
(solr-solrj-5.2.1.jar) and (httpClient-4.4.1.jar).

After we upgraded, we are seeing lot of below connection evictor statements in 
log file.

DEBUG USER_ID - STEP 2020-08-09 13:59:33,085 [Connection evictor] - Closing 
expired connections
DEBUG USER_ID - STEP 2020-08-09 13:59:33,085 [Connection evictor] - Closing 
connections idle longer than 5 MILLISECONDS
DEBUG USER_ID - STEP 2020-08-09 13:59:33,154 [Connection evictor] - Closing 
expired connections
DEBUG USER_ID - STEP 2020-08-09 13:59:33,154 [Connection evictor] - Closing 
connections idle longer than 5 MILLISECONDS
DEBUG USER_ID - STEP 2020-08-09 13:59:33,168 [Connection evictor] - Closing 
expired connections
DEBUG USER_ID - STEP 2020-08-09 13:59:33,168 [Connection evictor] - Closing 
connections idle longer than 5 MILLISECONDS
DEBUG USER_ID - STEP 2020-08-09 13:59:33,172 [Connection evictor] - Closing 
expired connections
DEBUG USER_ID - STEP 2020-08-09 13:59:33,172 [Connection evictor] - Closing 
connections idle longer than 5 MILLISECONDS
DEBUG USER_ID - STEP 2020-08-09 13:59:33,214 [Connection evictor] - Closing 
expired connections
DEBUG USER_ID - STEP 2020-08-09 13:59:33,214 [Connection evictor] - Closing 
connections idle longer than 5 MILLISECONDS
DEBUG USER_ID - STEP 2020-08-09 13:59:34,061 [Connection evictor] - Closing 
expired connections
DEBUG USER_ID - STEP 2020-08-09 13:59:34,061 [Connection evictor] - Closing 
connections idle longer than 5 MILLISECONDS

These statements appear when we try to access a module which is bound with solr 
as shown below:

HttpSolrClient client = null;
try
{
  client = new HttpSolrClient.Builder(solrURL).build();

  QueryResponse response = client.query(query);
}
finally
{
  if (client != null)
  {
    try {
      client.close();
    } catch (IOException e) {
      logger.debug("Error in closing HttpSolrClient: " + e.getMessage());
    }
  }
}

Is there a way we can turn off the logging or set something which doesn't cause 
log statements to appear?

Thanks,
Srinivas



Re: Can't create collections with Drupal 8 configset

2020-08-09 Thread Shawn Heisey

On 8/8/2020 10:31 PM, Shane Brooks wrote:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException:
Error from server at http://192.168.xx.xx:8983/solr: Expected mime type
application/octet-stream but got text/html.
HTTP ERROR 500: Problem accessing /solr/admin/cores. Reason: Server Error
Caused by: java.lang.NoClassDefFoundError:
org/apache/lucene/collation/ICUCollationKeyAnalyzer



I haven’t found anything on Google except that ICUCollationKeyAnalyzer 
depends on the icu4j library, which I verified is part of the SOLR 
package.


The icu4j library and the related Lucene jars for ICU capability are not 
part of Solr by default.  They can be found in the "contrib" in the Solr 
download ... but you must add the jars to Solr if you intend to use any 
contrib capability.


The best way I have found to add custom jars to Solr is to create a 
"lib" directory under the location designated as the solr home and place 
the jars there.  All jars found in that directory will be automatically 
loaded and will be available to all cores.


An alternate way to load jars is with the <lib> directive in 
solrconfig.xml ... but I don't recommend that approach.


Thanks,
Shawn