Re: partial update in solr

2018-10-30 Thread Zahra Aminolroaya
Alex, I use Solr 7.





Re: partial update in solr

2018-10-29 Thread Zahra Aminolroaya
Thanks Alex. I want to issue an atomic update query with SolrJ, like the one below:

http://localhost:8983/solr/test4/update?processor=atomic&atomic.text=set&atomic.text2=set&atomic.text3=set&commit=true&stream.body=%3Cadd%3E%3Cdoc%3E%3Cfield%20name=%22id%22%3E11%3C/field%3E%3Cfield%20name=%22text3%22%20update=%22set%22%3Ehi%3C/field%3E%3C/doc%3E%3C/add%3E


First, in SolrJ, I used "setField" instead of "addField", like:
doc.setField("text3", "hi");


Then, I added the ModifiableSolrParams:


ModifiableSolrParams add = new ModifiableSolrParams()
    .add("processor", "atomic")
    .add("atomic.text", "set")
    .add("atomic.text2", "set")
    .add("atomic.text3", "set")
    .add(UpdateParams.COMMIT, "true")
    .add("commit", "true");

And then I updated my document:

req.setParams(add);
req.setAction(UpdateRequest.ACTION.COMMIT, false, false);
req.add(docs);
UpdateResponse rsp = req.process(server);
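Putting the pieces together, the request I am sending is essentially the following (a self-contained sketch; the UpdateRequest construction is not shown in my snippet above, so that part, the client URL and the example document fields are assumptions):

import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.UpdateRequest;
import org.apache.solr.client.solrj.response.UpdateResponse;
import org.apache.solr.common.SolrInputDocument;
import org.apache.solr.common.params.ModifiableSolrParams;
import org.apache.solr.common.params.UpdateParams;

public class AtomicUpdateSketch {
    public static void main(String[] args) throws Exception {
        SolrClient server = new HttpSolrClient.Builder("http://localhost:8983/solr/test4").build();

        SolrInputDocument doc = new SolrInputDocument();
        doc.setField("id", "11");
        doc.setField("text3", "hi");

        ModifiableSolrParams add = new ModifiableSolrParams()
                .add("processor", "atomic")   // ask for the atomic update processor
                .add("atomic.text3", "set")   // treat text3 as an atomic "set"
                .add(UpdateParams.COMMIT, "true");

        UpdateRequest req = new UpdateRequest();
        req.setParams(add);
        req.add(doc);

        UpdateResponse rsp = req.process(server);
        System.out.println("status: " + rsp.getStatus());
        server.close();
    }
}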



However, I get "No such processor atomic".


As you can see, I set commit to true. What is the problem?






Re: partial update in solr

2018-10-29 Thread Zahra Aminolroaya
Thanks Alex. I tried the following to set the atomic processor:

http://localhost:8983/solr/test4/update?processor=atomic=add

However, I get the following error:



<response>
  <lst name="responseHeader">
    <int name="status">400</int>
    <int name="QTime">4</int>
  </lst>
  <lst name="error">
    <lst name="metadata">
      <str name="error-class">org.apache.solr.common.SolrException</str>
      <str name="root-error-class">org.apache.solr.common.SolrException</str>
    </lst>
    <str name="msg">No such processor atomic</str>
    <int name="code">400</int>
  </lst>
</response>




I read in
https://medium.com/@sarkaramrit2/atomicupdateprocessorfactory-in-apache-solr-c9be62a29117
that "You need not to declare / define AUPF in solrconfig.xml".

What is the problem?

Best,
Zahra





partial update in solr

2018-10-24 Thread Zahra Aminolroaya
Does Solr have a partial update like Elasticsearch?

Elasticsearch will automatically merge a new document with the existing one having the same id. For example, if the new document has a value for a field that was previously null, it will add the value for that field.


However, based on what I found, partial update in Solr can be applied only by explicitly marking the updated field, something like below:

curl 'localhost:8983/solr/update?commit=true' -H
'Content-type:application/json' -d '[{"id":"1","price":{"set":100}}]'
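For comparison, the same kind of "set" update through SolrJ would look roughly like the sketch below (the collection name, client URL and field names are assumptions, not my real schema):

import java.util.Collections;
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.common.SolrInputDocument;

public class PartialUpdateSketch {
    public static void main(String[] args) throws Exception {
        SolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build();

        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "1");
        // Wrapping the value in a map with the "set" key marks this as an atomic update
        // instead of a full replacement of the document.
        doc.addField("price", Collections.singletonMap("set", 100));

        client.add(doc);
        client.commit();
        client.close();
    }
}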

I do not want to mark the updated field for Solr with something like "set". I want Solr to automatically merge documents with the same id instead of deleting the previous document and inserting the new one.

Can Solr do that?





Re: Casting from schemaless to classic schema

2018-10-24 Thread Zahra Aminolroaya
Thanks Alexandre and Shawn. 





Casting from schemaless to classic schema

2018-10-17 Thread Zahra Aminolroaya
I want to change my Solr from schemaless to classic schema.

I read
https://stackoverflow.com/questions/29819854/how-does-solrs-schema-less-feature-work-how-to-revert-it-to-classic-schema.


What challenges will I confront, given that my schemaless collection already has some indexed documents in it?

In Solr 7, is commenting out add-unknown-fields-to-the-schema the only difference between schemaless and the classic schema?


Best,

Zahra





Re: using uuid for documents

2018-09-22 Thread Zahra Aminolroaya
Hello Alfonso,

I expected that we could use the *uuid* updateRequestProcessorChain, like the one below, to generate a unique value for the unique key:

<updateRequestProcessorChain name="uuid">
  <processor class="solr.UUIDUpdateProcessorFactory">
    <str name="fieldName">uniqueKey</str>
  </processor>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>
However, I saw that you used the *dedupe* updateRequestProcessorChain, as below:

<updateRequestProcessorChain name="dedupe">
  <processor class="solr.processor.SignatureUpdateProcessorFactory">
    <bool name="enabled">true</bool>
    ………

I wonder if we can use the *UUIDUpdateProcessorFactory* class instead of the *SignatureUpdateProcessorFactory* class.
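For context, this is roughly how I would expect to pick such a chain from SolrJ (the chain name "uuid", the collection URL and the field name here are assumptions):

import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.request.UpdateRequest;
import org.apache.solr.common.SolrInputDocument;

public class UuidChainSketch {
    public static void main(String[] args) throws Exception {
        HttpSolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/mycollection").build();

        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("title", "a document without an explicit unique key");

        UpdateRequest req = new UpdateRequest();
        // Route the update through the chain containing UUIDUpdateProcessorFactory,
        // so the unique key gets filled in automatically.
        req.setParam("update.chain", "uuid");
        req.add(doc);
        req.process(client);
        client.commit();
        client.close();
    }
}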





Re: using uuid for documents

2018-09-17 Thread Zahra Aminolroaya
Hello Alfonso,


Thanks. You used the dedupe updateRequestProcessorChain, so for this application can we not use the uuid updateRequestProcessorChain on its own?


Best,
Zahra





using uuid for documents

2018-09-16 Thread Zahra Aminolroaya
I have two questions about using uuid:

In the Solr reference guide, it is stated that the unique id can be of uuid type. I found that a uuid can be generated automatically when the doc id (of uuid field type) is null, or it can be generated by using
doc.addField("id", UUID.randomUUID().toString()) in SolrJ.

 1- Suppose my unique id type is string. Can I generate a uuid value from my
unique id string and insert it as a distinct uuid field in Solr?
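To make question 1 concrete, this is the kind of thing I have in mind on the SolrJ side (the uuid_s field name is hypothetical; java.util.UUID.nameUUIDFromBytes produces a deterministic, name-based UUID for a given string):

import java.nio.charset.StandardCharsets;
import java.util.UUID;
import org.apache.solr.common.SolrInputDocument;

public class DerivedUuidSketch {
    public static void main(String[] args) {
        String stringId = "my-existing-string-id";

        // Derive a stable (type 3, name-based) UUID from the existing string unique id.
        UUID derived = UUID.nameUUIDFromBytes(stringId.getBytes(StandardCharsets.UTF_8));

        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", stringId);                // the existing string unique key
        doc.addField("uuid_s", derived.toString());  // a distinct uuid-valued field (hypothetical name)

        System.out.println(derived);
    }
}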


 2- Suppose my unique id type is uuid. If I try to insert a random string that is not in uuid format, I get the "invalid uuid" error. Is there any way to generate a correct uuid value in the Solr admin UI, similar to what SolrJ's UUID.randomUUID().toString() generates?

Best,
Zahra





Re: Error casting to PointField

2018-09-16 Thread Zahra Aminolroaya
Thanks Shawn and Erick.





Re: Error casting to PointField

2018-09-11 Thread Zahra Aminolroaya
Thanks Erick. We used to use TrieLongField for our unique id, and the documentation says that all Trie* field types should be migrated to *PointField types. What would be the alternative solution?



Best,

Zahra





Error casting to PointField

2018-09-11 Thread Zahra Aminolroaya
We read that in Solr 7 the Trie* fields are deprecated, so we decided to change all of our Trie* fields to *PointField types.

Our unique key field type is long, and we changed our long field type to something like below:

[fieldType definition stripped by the list archive; the long type was switched to a Points-based field]
We get the error "uniqueKey field can not be configured to use a Points based FieldType".


I think it is a bug. If Lucene decides to deprecate the Trie* field types, it should also account for these kinds of errors.


What is the solution?

Best,
Zahra






Re: Error with plugin in Solrcloud

2018-09-02 Thread Zahra Aminolroaya
Thanks Shawn. I wrote my own filter. I have attached my jar here.

I found your answer here:


http://lucene.472066.n3.nabble.com/How-to-load-plugins-with-Solr-4-9-and-SolrCloud-td4312113.html

Based on your answer, is it possible that the Blob API does not work for my own filter jar?

(attachment: norm.jar)





Rectangle with rotation in Solr

2018-08-29 Thread Zahra Aminolroaya
I have locations given as 4-tuples of (longitude, latitude) corners, which are like rectangles, and I want to index them. Solr's BBoxField, with minX, maxX, maxY and minY, only covers rectangles that have no rotation. Suppose my rectangle is rotated 45 degrees clockwise relative to the axes; how can I define that rotation in bbox? Is using RPT (polygon) the only way?
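To make the 45-degree case concrete, here is a small Java sketch that rotates a rectangle's corners around its center and formats them as a WKT polygon, which is the shape an RPT field would index (the coordinates are only illustrative, and the naive lon/lat rotation ignores geodesic effects):

public class RotatedRectangleWkt {
    public static void main(String[] args) {
        // Axis-aligned rectangle (minX, minY, maxX, maxY) -- illustrative values only.
        double minX = 10, minY = 20, maxX = 30, maxY = 40;
        double cx = (minX + maxX) / 2, cy = (minY + maxY) / 2;
        double theta = Math.toRadians(-45); // 45 degrees clockwise

        double[][] corners = {
            {minX, minY}, {maxX, minY}, {maxX, maxY}, {minX, maxY}, {minX, minY} // closed ring
        };

        StringBuilder wkt = new StringBuilder("POLYGON((");
        for (int i = 0; i < corners.length; i++) {
            double dx = corners[i][0] - cx, dy = corners[i][1] - cy;
            double rx = cx + dx * Math.cos(theta) - dy * Math.sin(theta);
            double ry = cy + dx * Math.sin(theta) + dy * Math.cos(theta);
            wkt.append(rx).append(' ').append(ry);
            if (i < corners.length - 1) wkt.append(", ");
        }
        wkt.append("))");

        // The resulting WKT string is what would be indexed into an RPT (polygon-capable) field.
        System.out.println(wkt);
    }
}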





Re: Error with plugin in Solrcloud

2018-08-18 Thread Zahra Aminolroaya
Thanks Shawn. For standalone Solr I had to include all dependencies in lib, so I thought that for SolrCloud mode I should include the dependencies too.


The error is as follows:
 



Caused by: org.apache.solr.common.SolrException: Unable to reload core
[textd_shard1_replica2]
at org.apache.solr.core.CoreContainer.reload(CoreContainer.java:1161)
at
org.apache.solr.handler.admin.CoreAdminOperation.lambda$static$2(CoreAdminOperation.java:111)
... 36 more
Caused by: org.apache.solr.common.SolrException: Could not load conf for
core textd_shard1_replica2: Can't load schema managed-schema: Plugin init
failure for [schema.xml] fieldType "text_general": Plugin init failure for
[schema.xml] analyzer/filter: Error loading class
'norm.myNormalizerFilterFactory'
at
org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:97)
at org.apache.solr.core.CoreContainer.reload(CoreContainer.java:1153)
... 37 more
Caused by: org.apache.solr.common.SolrException: Can't load schema
managed-schema: Plugin init failure for [schema.xml] fieldType
"text_general": Plugin init failure for [schema.xml] analyzer/filter: Error
loading class 'norm.myNormalizerFilterFactory'
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:608)
at org.apache.solr.schema.IndexSchema.<init>(IndexSchema.java:182)
at
org.apache.solr.schema.ManagedIndexSchema.<init>(ManagedIndexSchema.java:104)
at
org.apache.solr.schema.ManagedIndexSchemaFactory.create(ManagedIndexSchemaFactory.java:173)
at
org.apache.solr.schema.ManagedIndexSchemaFactory.create(ManagedIndexSchemaFactory.java:45)
at
org.apache.solr.schema.IndexSchemaFactory.buildIndexSchema(IndexSchemaFactory.java:75)
at
org.apache.solr.core.ConfigSetService.createIndexSchema(ConfigSetService.java:119)
at
org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:92)
... 38 more
Caused by: org.apache.solr.common.SolrException: Plugin init failure for
[schema.xml] fieldType "text_general": Plugin init failure for [schema.xml]
analyzer/filter: Error loading class 'norm.myNormalizerFilterFactory'
at
org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:182)
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:494)
... 45 more
Caused by: org.apache.solr.common.SolrException: Plugin init failure for
[schema.xml] analyzer/filter: Error loading class
'norm.myNormalizerFilterFactory'
at
org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:182)
at
org.apache.solr.schema.FieldTypePluginLoader.readAnalyzer(FieldTypePluginLoader.java:414)
at
org.apache.solr.schema.FieldTypePluginLoader.create(FieldTypePluginLoader.java:104)
at
org.apache.solr.schema.FieldTypePluginLoader.create(FieldTypePluginLoader.java:53)
at
org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:152)
... 46 more
Caused by: org.apache.solr.common.SolrException: Error loading class
'norm.myNormalizerFilterFactory'
at
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:555)
at
org.apache.solr.core.SolrResourceLoader.newInstance(SolrResourceLoader.java:624)
at
org.apache.solr.schema.FieldTypePluginLoader$3.create(FieldTypePluginLoader.java:397)
at
org.apache.solr.schema.FieldTypePluginLoader$3.create(FieldTypePluginLoader.java:390)
at
org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:152)
... 50 more
Caused by: java.lang.ClassNotFoundException: norm.myNormalizerFilterFactory
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.net.FactoryURLClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:539)
... 54 more





Re: Error with plugin in Solrcloud

2018-08-18 Thread Zahra Aminolroaya
Thanks Shawn. For standalone Solr I had to include all dependencies in lib, so I thought that for SolrCloud mode I should include the dependencies too.


The error is as follows:


java.util.concurrent.ExecutionException:
org.apache.solr.common.SolrException: Unable to create core
[gettingstarted_shard1_replica2]
at java.util.concurrent.FutureTask.report(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at 
org.apache.solr.core.CoreContainer.lambda$load$6(CoreContainer.java:592)
at
com.codahale.metrics.InstrumentedExecutorService$InstrumentedRunnable.run(InstrumentedExecutorService.java:176)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at
org.apache.solr.common.util.ExecutorUtil$MDCAwareThreadPoolExecutor.lambda$execute$0(ExecutorUtil.java:229)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: org.apache.solr.common.SolrException: Unable to create core
[gettingstarted_shard1_replica2]
at
org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:938)
at 
org.apache.solr.core.CoreContainer.lambda$load$5(CoreContainer.java:564)
at
com.codahale.metrics.InstrumentedExecutorService$InstrumentedCallable.call(InstrumentedExecutorService.java:197)
... 5 more
Caused by: org.apache.solr.common.SolrException: Could not load conf for
core gettingstarted_shard1_replica2: Can't load schema managed-schema:
Plugin Initializing failure for [schema.xml] fieldType
at
org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:97)
at
org.apache.solr.core.CoreContainer.createFromDescriptor(CoreContainer.java:918)
... 7 more
Caused by: org.apache.solr.common.SolrException: Can't load schema
managed-schema: Plugin Initializing failure for [schema.xml] fieldType
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:608)
at org.apache.solr.schema.IndexSchema.<init>(IndexSchema.java:182)
at
org.apache.solr.schema.ManagedIndexSchema.<init>(ManagedIndexSchema.java:104)
at
org.apache.solr.schema.ManagedIndexSchemaFactory.create(ManagedIndexSchemaFactory.java:173)
at
org.apache.solr.schema.ManagedIndexSchemaFactory.create(ManagedIndexSchemaFactory.java:45)
at
org.apache.solr.schema.IndexSchemaFactory.buildIndexSchema(IndexSchemaFactory.java:75)
at
org.apache.solr.core.ConfigSetService.createIndexSchema(ConfigSetService.java:119)
at
org.apache.solr.core.ConfigSetService.getConfig(ConfigSetService.java:92)
... 8 more
Caused by: org.apache.solr.common.SolrException: Plugin Initializing failure
for [schema.xml] fieldType
at
org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:194)
at org.apache.solr.schema.IndexSchema.readSchema(IndexSchema.java:494)
... 15 more
Caused by: java.lang.RuntimeException: schema fieldtype
text_general(org.apache.solr.schema.TextField) invalid
arguments:{runtimeLib=true}
at org.apache.solr.schema.FieldType.setArgs(FieldType.java:207)
at
org.apache.solr.schema.FieldTypePluginLoader.init(FieldTypePluginLoader.java:165)
at
org.apache.solr.schema.FieldTypePluginLoader.init(FieldTypePluginLoader.java:53)
at
org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:191)
... 16 more





Error with plugin in Solrcloud

2018-08-18 Thread Zahra Aminolroaya
My plugin works correctly in standalone Solr. Now I want to use the plugin in SolrCloud mode.

I have a main jar file named "norm", and other jar files that "norm" depends on:
"lucene-analyzers-common-6.6.1", "lucene-core-6.6.1", "slf4j-api-1.7.7" and "solr-core-6.6.1".

With the help of the Config and Blob APIs, I transferred my libraries to the .system collection:


{
  "id":"norm/1",
  "md5":"a4e068a60fb654d77cdb5f7d017bc1ec",
  "blobName":"norm",
  "version":1,
  "timestamp":"2018-07-04T06:07:36.799Z",
  "size":3377,
  "blob":"UEsDBAoAAAgAABBX2UwJAAQATUVUQS1JTkYv/soAAFBLAwQKAAAIAAAPV9lMWXZiSjsBA……."
}

Then I added these runtime libraries to my collection, so my configoverlay.json looks like this:

{
  "props":{"updateHandler":{"autoSoftCommit":{"maxTime":3000}}},
  "runtimeLib":{
    "norm":{"name":"norm", "version":1},
    "slf":{"name":"slf", "version":1},
    "lucene":{"name":"lucene", "version":1},
    "lucene-ana":{"name":"lucene-ana", "version":1},
    "solr-core":{"name":"solr-core", "version":1}}}

I also added my filter to my field type:

[fieldType definition stripped by the list archive; it declared the text_general type with the custom filter class norm.myNormalizerFilterFactory and a runtimeLib="true" attribute]
However, I get a ClassNotFoundException.

Is there anything else that I should do to make my plugin work in SolrCloud mode?









Re: Docvalue v.s. invert index

2018-08-13 Thread Zahra Aminolroaya
Thanks Erick, Shawn and Tomoko for the complete answers.

If I set both docValues and indexed to "true" on a field, will Solr know which technique to use for faceting or searching? Or is there any way to tell Solr which technique to use?





Docvalue v.s. invert index

2018-08-12 Thread Zahra Aminolroaya
Could we say that docValues are better for sorting and faceting, and the inverted index is better for searching?

Will I lose anything if I only use docValues?

Do docValues give better performance?







Re: truncate string field type

2018-07-10 Thread Zahra Aminolroaya
Suppose I want to search for "l(i|a)*on k(i|e)*ng"; there is a space between the two words. I want Solr to retrieve only exact matches in which these two words (or their other spellings) are adjacent. If I use a text field type, each of these words is treated as a separate token, so Solr may bring back other results too. However, we have strict customers who only want exact matches when any result is available, and nothing more!





Re: truncate string field type

2018-07-10 Thread Zahra Aminolroaya
Thanks Alexandre and Erick. Erick, I want to use my regular expression to search a field, but Solr's text field tokenizes the document, so the regular-expression result will not be valid. I want Solr not to tokenize my documents, even though I will lose some terms by using Solr's string type.





truncate string field type

2018-07-07 Thread Zahra Aminolroaya
I want to truncate my string field type because of its byte-length limit. I wrote the following in my schema:

[fieldType definition stripped by the list archive; it tried to attach an analyzer with solr.TruncateTokenFilterFactory to a string-like field type]

However, I found that StrField (string) does not support specifying an
analyzer. Besides, prefixLength in TruncateTokenFilterFactory could not be
more than 1000.

I want the same behavior as string. Do you think it is reasonable to use the "text_general" field type with the solr.KeywordTokenizerFactory tokenizer to get the same behavior? Do I lose any feature?

If I use text_general, truncation is not needed.







Re: Errors when using Blob API

2018-07-04 Thread Zahra Aminolroaya
Thanks shawn. I removed the space from header because I got another error. I
finally used "Content-Type: application/octet-stream" instead of
'Content-Type: application/octet-stream' and all of errors even the space
limit error solved.





Errors when using Blob API

2018-07-03 Thread Zahra Aminolroaya
I want to transfer my jar files to my ".system" collection in "Solrcloud".
One of my solr port is 

My jar file name is "norm", and the following is my command for this transfer:

curl -X POST -H 'Content-Type:application/octet-stream' --data-binary @norm.jar http://localhost:/solr/.system/blob/norm/
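The curl command above is what I actually ran; purely to show the shape of the request, the same upload expressed as a plain-Java POST would look roughly like this (the port 8983 is an assumption, since the real port was lost above):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;

public class BlobUploadSketch {
    public static void main(String[] args) throws Exception {
        byte[] jar = Files.readAllBytes(Paths.get("norm.jar"));

        // The jar bytes go up as a raw application/octet-stream body, mirroring the curl command.
        URL url = new URL("http://localhost:8983/solr/.system/blob/norm");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setRequestMethod("POST");
        con.setDoOutput(true);
        con.setRequestProperty("Content-Type", "application/octet-stream");
        try (OutputStream out = con.getOutputStream()) {
            out.write(jar);
        }
        System.out.println("HTTP " + con.getResponseCode());
    }
}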

*However, I get the following error:*

org.apache.solr.common.SolrException
org.apache.solr.common.SolrException
URLDecoder: Invalid character encoding detected after position 0 of query string / form data (while parsing as UTF-8)
400

It is surprising that when I try to transfer the Lucene jar files I also get different errors, as follows:

*For example, when I run the command:*

curl -X POST -H 'Content-Type:application/octet-stream' --data-binary @lucene-core-6.6.1.jar http://localhost:/solr/.system/blob/lucence/

*I get the following error:*

org.apache.solr.common.SolrException
org.apache.solr.common.SolrException
application/x-www-form-urlencoded content length (2783509 bytes) exceeds upload limit of 2048 KB
400
*Or when I use the following command:*

curl -X POST -H 'Content-Type:application/octet-stream' --data-binary @slf4j-api-1.7.7.jar http://localhost:/solr/.system/blob/slf


*I get the following error:*

org.apache.solr.common.SolrException
org.apache.solr.common.SolrException
URLDecoder: Invalid digit (#19;) in escape (%) pattern
400

Why do I get these errors even when I use the default Lucene jar files?

Is there any other way to insert jar files into the .system collection?










Re: Error in solr Plugin

2018-06-25 Thread Zahra Aminolroaya
Thanks Andrea and Erick





Re: Error in solr Plugin

2018-06-24 Thread Zahra Aminolroaya
Thanks Andrea. Do you mean all of my jar file versions should be 6.6.1?

The lucene-core 7 jar had some useful functions, like incrementToken, which I could not find in previous versions; that is why I used this version.





Error in solr Plugin

2018-06-24 Thread Zahra Aminolroaya
I am using Solr 6.6.1. I want to write my own analyzer for the field type "text_general" in the schema. The field type in the schema is as follows:

[fieldType definition stripped by the list archive; its analyzer chain included the custom filter class normalizing.myNormalizerFilterFactory]
When I test the filter in Java, everything is fine; however, when I start Solr I get the following error:

Caused by: java.lang.ClassCastException: class
normalizing.myNormalizerFilterFactory
at java.lang.Class.asSubclass(Unknown Source)
at
org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:539)
at
org.apache.solr.core.SolrResourceLoader.newInstance(SolrResourceLoader.java:624)
at
org.apache.solr.schema.FieldTypePluginLoader$3.create(FieldTypePluginLoader.java:397)
at
org.apache.solr.schema.FieldTypePluginLoader$3.create(FieldTypePluginLoader.java:390)
at
org.apache.solr.util.plugin.AbstractPluginLoader.load(AbstractPluginLoader.java:152)
... 20 more

I used the following jar files in my lib: solr-core-4.1.0, slf4j-api-1.6.6, lucene-core-7.4.0, apache-solr-core-1.4.0, apache-lucene.

Why do I get this error? Is it because of the wrong jar file versions, especially lucene-core-7.4.0, since my Lucene version is 6.6.1?





Re: code v.s. schema for BEST_COMPRESSION mode

2018-06-17 Thread Zahra Aminolroaya
Thanks





code v.s. schema for BEST_COMPRESSION mode

2018-06-16 Thread Zahra Aminolroaya
I want to reduce the size of indexed and stored documents in Solr. I found two ways. In the first solution,
https://lucene.apache.org/solr/guide/6_6/codec-factory.html#solr-schemacodecfactory
it is only needed to change the compressionMode in the codecFactory section of solrconfig.xml. In the other way,
https://github.com/apache/lucene-solr/blob/releases/lucene-solr/6.4.0/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsFormat.java
it is needed to use Java code to compress the stored fields.
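As far as I understand, the "Java code" route corresponds to picking the stored-fields mode at the Lucene level, roughly like the sketch below (Lucene 6.x class names; this is my reading of the linked source, not something I have tested):

import java.nio.file.Paths;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.codecs.lucene50.Lucene50StoredFieldsFormat;
import org.apache.lucene.codecs.lucene62.Lucene62Codec;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.FSDirectory;

public class BestCompressionSketch {
    public static void main(String[] args) throws Exception {
        IndexWriterConfig cfg = new IndexWriterConfig(new StandardAnalyzer());
        // Trade some indexing speed for smaller stored fields.
        cfg.setCodec(new Lucene62Codec(Lucene50StoredFieldsFormat.Mode.BEST_COMPRESSION));

        try (IndexWriter writer = new IndexWriter(FSDirectory.open(Paths.get("/tmp/index")), cfg)) {
            // ... add documents as usual; stored fields are written with BEST_COMPRESSION.
        }
    }
}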

What is the difference between these solutions? Is it enough to use the config-editing way?





Re: Changing Leadership in SolrCloud

2018-03-02 Thread Zahra Aminolroaya
Dear Mr. Shalin,

Yes. I mean "state" in Cluster State API and UI.

Let me explain in detail what happened over the previous days:

Suppose I have Collection A distributed across node 1 (the leader), node 2 and node 3.

I used the following command to block node 1's Solr and ZooKeeper ports from being listened on (the ports are 2888/3888/2181 and 4239):

firewall-cmd --remove-port=/tcp --permanent

Node 1's state is still "active", and its leader flag is "true" in the Cluster State API response.

The Solr logs of node 1 look like this:


org.apache.solr.common.SolrException: ClusterState says we are the leader
(:4239/solr/collectionA_shard2_replica1), but locally we don't
think so. Request came from :4239/solr/collectionA_shard4_replica3/
at
org.apache.solr.update.processor.DistributedUpdateProcessor.doDefensiveChecks(DistributedUpdateProcessor.java:658)
at
org.apache.solr.update.processor.DistributedUpdateProcessor.setupRequest(DistributedUpdateProcessor.java:418)
at
org.apache.solr.update.processor.DistributedUpdateProcessor.setupRequest(DistributedUpdateProcessor.java:346)
at ..

Node 2's error in the Solr logs is:

forwarding update to :4239/solr/collection A_shard5_replica1/ failed - retrying ... retries: 24 add{,id=121,commitWithin=1000}
params: update.chain=add-unknown-fields-to-the-schema&update.distrib=TOLEADER&distrib.from=node2:4239/solr/collection A_shard2_replica2/
rsp: 503: org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at node1IP:4239/solr/collection A_shard5_replica1: Service Unavailable

Node 3's error in the Solr logs is like node 2's.



Unfortunately, today I found that my node 4 and node 5, from collections B and C, went down. The log errors were like below:

2018-03-01 00:26:46.133 ERROR
(zkCallback-4-thread-28-processing-n:node4IP:4239_solr-EventThread) [   ]
o.a.s.c.ZkController :org.apache.solr.common.SolrException: There was a
problem making a request to the leader
at
org.apache.solr.cloud.ZkController.waitForLeaderToSeeDownState(ZkController.java:1551)
at
org.apache.solr.cloud.ZkController.registerAllCoresAsDown(ZkController.java:476)
at org.apache.solr.cloud.ZkController.access$500(ZkController.java:121)
at org.apache.solr.cloud.ZkController$1.command(ZkController.java:338)
at
org.apache.solr.common.cloud.ConnectionManager$1.update(ConnectionManager.java:168)
at
org.apache.solr.common.cloud.DefaultConnectionStrategy.reconnect(DefaultConnectionStrategy.java:57)
at
org.apache.solr.common.cloud.ConnectionManager.process(ConnectionManager.java:142)
at
org.apache.zookeeper.ClientCnxn$EventThread.processEvent(ClientCnxn.java:530)
at org.apache.zookeeper.ClientCnxn$EventThread.run(ClientCnxn.java:505)

and 

Caused by: org.apache.zookeeper.KeeperException$SessionExpiredException:
KeeperErrorCode = Session expired for /collections/Collection B/state.json
at org.apache.zookeeper.KeeperException.create(KeeperException.java:127)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.getData(ZooKeeper.java:1212)
at
org.apache.solr.common.cloud.SolrZkClient$7.execute(SolrZkClient.java:357)
at
org.apache.solr.common.cloud.SolrZkClient$7.execute(SolrZkClient.java:354)
at
org.apache.solr.common.cloud.ZkCmdExecutor.retryOperation(ZkCmdExecutor.java:60)
at 
org.apache.solr.common.cloud.SolrZkClient.getData(SolrZkClient.java:354)
at
org.apache.solr.common.cloud.ZkStateReader.fetchCollectionState(ZkStateReader.java:1110)
at
org.apache.solr.common.cloud.ZkStateReader.getCollectionLive(ZkStateReader.java:1096)
... 39 more


I think these errors are related to blocking the ports of node 1.

I wonder if you could help me.

Regards,
Zahra











Re: Changing Leadership in SolrCloud

2018-02-27 Thread Zahra Aminolroaya
Thanks Shalin. Our "zkClientTimeout" is 3, so the leader should have changed by now; however, the previous leader is still active.





Re: Changing Leadership in SolrCloud

2018-02-27 Thread Zahra Aminolroaya
The leader status is active. My main question is how I can change the leader in SolrCloud.





Re: Changing Leadership in SolrCloud

2018-02-27 Thread Zahra Aminolroaya
Thanks Shawn for the reply. When I try to add a document to Solr, I get the "no route to host" exception. This means that SolrCloud is aware of the blocked ports; however, ZooKeeper does not automatically change the leader!


