RE: solr.log rotation

2017-10-02 Thread Noriyuki TAKEI
Thanks for your quick reply!!



--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html


solr.log rotation

2017-10-02 Thread Noriyuki TAKEI
Hi, all

When I restart the Solr service, solr.log is rotated as below.

solr.log.1
solr.log.2
solr.log.3
...

I would like to stop this rotation.

Do you have any ideas?
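For what it's worth, this rotation is performed by the bin/solr start script itself, not by log4j. In recent 6.x releases it can reportedly be switched off in solr.in.sh; a sketch, assuming your bin/solr version honors this variable:

```shell
# solr.in.sh -- skip the solr.log -> solr.log.1 rotation that bin/solr
# performs before each (re)start, and let log4j manage the file instead
SOLR_LOG_PRESTART_ROTATION=false
```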





SpellCheck Component exception when querying for a specific word

2017-09-15 Thread Noriyuki TAKEI
Hi, all

The exception below occurred when I used the spellcheck component, but only for
the specific word "さいじんg".

2017-09-13 23:07:30.911 INFO  (qtp1712536284-299) [c:hoge s:shard2 r:core_node4 x:hoge_shard2_replica2] o.a.s.c.S.Request [hoge_shard2_replica2]  webapp=/solr path=/suggest_ja params={q=*:*=さいじんg=true=json=false} hits=2 status=0 QTime=90
2017-09-13 23:07:43.922 ERROR (qtp1712536284-20) [c:test s:shard2 r:core_node3 x:test_shard2_replica1] o.a.s.h.RequestHandlerBase
java.lang.StringIndexOutOfBoundsException: String index out of range: -1
    at java.lang.AbstractStringBuilder.replace(AbstractStringBuilder.java:851)
    at java.lang.StringBuilder.replace(StringBuilder.java:262)
    at org.apache.solr.spelling.SpellCheckCollator.getCollation(SpellCheckCollator.java:238)
    at org.apache.solr.spelling.SpellCheckCollator.collate(SpellCheckCollator.java:93)
    at org.apache.solr.handler.component.SpellCheckComponent.addCollationsToResponse(SpellCheckComponent.java:297)
    at org.apache.solr.handler.component.SpellCheckComponent.process(SpellCheckComponent.java:209)
    at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:295)
    at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:153)
    at org.apache.solr.core.SolrCore.execute(SolrCore.java:2213)
    at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:654)
    at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:460)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:303)
    at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:254)
    at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1668)
    at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:581)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
    at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
    at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
    at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1160)
    at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
    at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
    at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1092)
    at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
    at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
    at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)
    at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
    at org.eclipse.jetty.server.Server.handle(Server.java:518)
    at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:308)
    at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:244)
    at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
    at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
    at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
    at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceAndRun(ExecuteProduceConsume.java:246)
    at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:156)
    at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:654)
    at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:572)
    at java.lang.Thread.run(Thread.java:745)


However, when I restored the index from a backup using the Collections API,
the exception went away.

Why did restoring the index resolve this exception?

Note: the Solr config is as below.

 
 
  suggest_dict 
  solr.Suggester 
  AnalyzingLookupFactory 
  suggest 
  suggest_ja 
  true 
  true 
  text_ja_romaji 
 
   

   
 
  suggest 
  AND 
  0 
  true 

  true 
  suggest 
  1000 
  1 

  true 
  suggest_dict 
  10 
  true 
  30 
  10 
  true 
 
 
  suggest_ja 
 
   





Facet query when using the word 'AND'

2017-09-06 Thread Noriyuki TAKEI
Hi, all

I use facet queries, but I found they do not work when the query contains 'AND'.

I would like to use 'AND' in a facet query as a plain word, not as an operator.

First, the Solr config is as below.



  suggest_dict
  solr.Suggester
  AnalyzingLookupFactory
  suggest
  suggest_ja
  true
  true
  text_ja_romaji

  

  

  suggest
  AND
  0
  true

  true
  suggest
  1000
  1

  true
  suggest_dict
  10
  true
  30
  10
  true


  suggest_ja

  

I executed the query below, but Solr gave an unexpected result.

$ curl
"http://localhost:8983/solr/kms/suggest_ja?wt=json=true=\"AND\"=true;
{
  "response":{"numFound":0,"start":0,"maxScore":0.0,"docs":[]
  },
  "facet_counts":{
"facet_queries":{},
"facet_fields":{
  "suggest":[]},
"facet_ranges":{},
"facet_intervals":{},
"facet_heatmaps":{}},
  "spellcheck":{
"suggestions":[],
"collations":[]}}

I'd like to run a facet search including the word "AND", so I surrounded "AND"
with double quotes and put an escape character before each quote, as below.

\"AND\"

A document containing the word "AND" ("I have a pen AND an apple") is already
indexed. The evidence is as below.

$ curl "http://localhost:8983/solr/kms/select?wt=json=true=\"AND\";

{

  "responseHeader":{
"zkConnected":true,
"status":0,
"QTime":9,
"params":{
  "q":"\"AND\"",
  "indent":"true",
  "wt":"json"}},
  "response":{"numFound":1,"start":0,"maxScore":0.2770272,"docs":[
  {
"pub_date":"2017-03-09T12:34:56.789Z",
"body":"I have a pen AND an apple",
"title":"test",
"url":"http://10.16.44.180:8080/#/management/two/;,
"system_cd":"hoge",
"document_id":"001",
"id":"hoge001",
"content_type":"doc",
"_version_":157862221496320}]
  }}


Therefore, I expected a result like the one below.

{
  "response":{"numFound":1,"start":0,"maxScore":0.66747504,"docs":[]
  },
  "facet_counts":{
"facet_queries":{},
"facet_fields":{
  "suggest":[
"AND",1,
"apple",1,
"pen",1,
]},
"facet_ranges":{},
"facet_intervals":{},
"facet_heatmaps":{}},
  "spellcheck":{
"suggestions":[],
"collations":[]}}


But in fact, facet_fields contains nothing.

How do I solve this?
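One thing worth ruling out is that the quotes never reach Solr intact: both the shell and URL encoding can eat them. Percent-encoding the parameters explicitly sidesteps shell escaping entirely; a sketch (parameter names beyond q/wt/facet are assumptions about this handler's configuration):

```python
from urllib.parse import urlencode

# Build the query string with '"AND"' as a quoted phrase; urlencode
# percent-escapes the double quotes as %22 so they survive the HTTP layer.
params = {
    "q": '"AND"',
    "wt": "json",
    "facet": "true",
    "facet.field": "suggest",  # assumption: faceting on the 'suggest' field
}
qs = urlencode(params)
print(qs)  # q=%22AND%22&wt=json&facet=true&facet.field=suggest
```

The resulting string can be appended after `?` in the curl URL, with no backslash escapes needed.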





Re: Too many logs recorded in zookeeper.out

2017-05-24 Thread Noriyuki TAKEI
Hi,

Thanks for your reply. I'll try the ZooKeeper mailing list!!




--
View this message in context: 
http://lucene.472066.n3.nabble.com/Too-many-logs-recorded-in-zookeeper-out-tp4335238p4336914.html
Sent from the Solr - User mailing list archive at Nabble.com.


Too many logs recorded in zookeeper.out

2017-05-16 Thread Noriyuki TAKEI
Hi, all.

I run SolrCloud with 3 ZooKeeper nodes and 2 Solr servers,
with 3 shards and 2 replicas.

These servers run as virtual machines on VMware, and the
virtual machines are stored on iSCSI storage attached to VMware.

One day an iSCSI storage failure suddenly occurred, and 1 Solr server and
2 ZooKeeper nodes became inaccessible via SSH. But indexing and searching
seemed to keep working properly.

To recover, I powered down and restarted the virtual machines that were
inaccessible via SSH.
For a few minutes after ZooKeeper started up, a flood of log entries like the
one below was recorded in zookeeper.out:

  ERROR [LearnerHandler-/XXX.XXX.XXX.XXX:36524:LearnerHandler@631] Unexpected exception causing shutdown while sock still open
  java.io.EOFException
      at java.io.DataInputStream.readInt(DataInputStream.java:392)
      at org.apache.jute.BinaryInputArchive.readInt(BinaryInputArchive.java:63)


The log entry shown above was recorded in zookeeper.out about once every
3 milliseconds.

Why were so many log entries recorded?




--
View this message in context: 
http://lucene.472066.n3.nabble.com/Too-many-logs-recorded-in-zookeeper-out-tp4335238.html
Sent from the Solr - User mailing list archive at Nabble.com.


Date field by Atomic Update

2017-05-15 Thread Noriyuki TAKEI
Hi, all.

I update some fields via SolrJ atomic updates, but in a
particular case an error occurs.

When I try to set the value "2017-01-01" on a date field
via a SolrJ atomic update, the following error message appears.

org.apache.solr.client.solrj.impl.CloudSolrClient$RouteException: Error from server at http://XXX.XXX.XXX.XXX:/solr/test_shard1_replica2: Invalid Date String:'2017-01-01'
    at org.apache.solr.client.solrj.impl.CloudSolrClient.directUpdate(CloudSolrClient.java:765)
    at org.apache.solr.client.solrj.impl.CloudSolrClient.sendRequest(CloudSolrClient.java:1173)
    at org.apache.solr.client.solrj.impl.CloudSolrClient.requestWithRetryOnStaleState(CloudSolrClient.java:1062)
    at org.apache.solr.client.solrj.impl.CloudSolrClient.request(CloudSolrClient.java:1004)
    at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:149)
    at org.apache.solr.client.solrj.SolrClient.add(SolrClient.java:173)
    at org.apache.solr.client.solrj.SolrClient.add(SolrClient.java:138)
    at org.apache.solr.client.solrj.SolrClient.add(SolrClient.java:152)


To work around this, I applied atomic updates to every field except the
date field, and set the date field as a plain (non-atomic) value.

The code is as follows.

SolrInputDocument doc = new SolrInputDocument();

// Build each "set" map first: HashMap.put() returns the previous value
// (null here), so passing new HashMap().put(...) directly would add null.
Map<String, Object> titleUpdate = new HashMap<>();
titleUpdate.put("set", "title_test");
doc.addField("title", titleUpdate);   // atomic update

Map<String, Object> bodyUpdate = new HashMap<>();
bodyUpdate.put("set", "body_test");
doc.addField("body", bodyUpdate);     // atomic update

doc.addField("pub_date", "2017-01-01"); // plain value on the date field (not atomic)

solr.add(doc);
solr.commit();

In short, atomic and non-atomic updates are mixed in a single document.

The code shown above seems to work properly, and I think this solution is
good, but could you tell me about any other problems it might cause?
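The "Invalid Date String" error itself is usually just the date format: Solr date fields expect a full ISO-8601 instant such as 2017-01-01T00:00:00Z, not a bare date. A sketch of converting the value before sending it (Python here purely for illustration; the same formatting applies from SolrJ):

```python
from datetime import datetime, timezone

def to_solr_date(ymd: str) -> str:
    """Convert a bare 'YYYY-MM-DD' date to the ISO-8601 instant Solr expects."""
    d = datetime.strptime(ymd, "%Y-%m-%d").replace(tzinfo=timezone.utc)
    return d.strftime("%Y-%m-%dT%H:%M:%SZ")

print(to_solr_date("2017-01-01"))  # 2017-01-01T00:00:00Z
```

With the value in this form, the date field may well accept a normal atomic "set" update like the other fields, avoiding the mixed approach.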



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Date-field-by-Atomic-Update-tp4335226.html
Sent from the Solr - User mailing list archive at Nabble.com.


RE: Japanese character is garbled when using TikaEntityProcessor

2017-04-12 Thread Noriyuki TAKEI
Thanks!! I appreciate your quick reply.



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Japanese-character-is-garbled-when-using-TikaEntityProcessor-tp4329217p4329657.html
Sent from the Solr - User mailing list archive at Nabble.com.


Japanese character is garbled when using TikaEntityProcessor

2017-04-10 Thread Noriyuki TAKEI
Hi, all

I use TikaEntityProcessor to extract text content from binary or text
files.

But when I try to extract Japanese characters from an HTML file whose
character encoding is Shift_JIS (SJIS), the content is garbled. With UTF-8
it works well.

The Data Import Handler configuration is as below.

--- from here ---

  
  

  

  
  

  


  

  

  

--- to here ---

How do I solve this?




--
View this message in context: 
http://lucene.472066.n3.nabble.com/Japanese-character-is-garbled-when-using-TikaEntityProcessor-tp4329217.html
Sent from the Solr - User mailing list archive at Nabble.com.


JapanesePartOfSpeechStopFilterFactory

2017-02-22 Thread Noriyuki TAKEI
Hi, all

I would like to execute a query like this:

field1:someword AND field2:★

But it seems that the query actually executed is:

field1:someword

I guess the solr.JapanesePartOfSpeechStopFilterFactory filter
excluded the word ★ from the indexing target, and as a result
the clause "AND field2:★" was dropped from the query
"field1:someword AND field2:★".

I would like the query "field1:someword AND field2:★" to be executed as written.
How do I solve this?

The current schema.xml is as below.

  

  
  
  
  
  
  
  

  
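If ★ must survive analysis, one common workaround is to give field2 a field type whose analysis chain omits the part-of-speech stop filter. This is a sketch only — the tokenizer and filters below are the stock Solr Japanese defaults, not the schema from this thread:

```xml
<fieldType name="text_ja_keep_symbols" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="solr.JapaneseTokenizerFactory" mode="search"/>
    <!-- solr.JapanesePartOfSpeechStopFilterFactory is intentionally omitted
         so symbol tokens such as ★ are kept at index time -->
    <filter class="solr.CJKWidthFilterFactory"/>
    <filter class="solr.LowerCaseFilterFactory"/>
  </analyzer>
</fieldType>
```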



--
View this message in context: 
http://lucene.472066.n3.nabble.com/JapanesePartOfSpeechStopFilterFactory-tp4321775.html
Sent from the Solr - User mailing list archive at Nabble.com.


Re: meaning of weight in Suggester response

2017-02-11 Thread Noriyuki TAKEI
Perhaps weight means the term frequency in the collection?

If so, the merged result mentioned in my previous mail would be as below, right?

apple(weight:22)
abacus(weight:14)
ability(weight:11)
absence(weight:4)



--
View this message in context: 
http://lucene.472066.n3.nabble.com/meaning-of-weight-in-Suggester-response-tp4319851p4319856.html
Sent from the Solr - User mailing list archive at Nabble.com.


meaning of weight in Suggester response

2017-02-11 Thread Noriyuki TAKEI
Hi, all

I would like to know the meaning of the "weight" field included in the
response that the Suggester component returns.

For example, when I use the Suggester component (see the URL below),
I get a result like:

term=Apple, weight=3, payload=

・Suggester Component
https://cwiki.apache.org/confluence/display/solr/Suggester

I would like to know what "weight" indicates.

The reason is as below.

I would like to use the Suggester across multiple shards
(3 shards).

Distributed suggest does work, but the response is not what I expected:

apple(weight:9)
apple(weight:8)
apple(weight:5)
abacus(weight:14)
ability(weight:11)
absence(weight:4)
(suggest.q is "a" in this case)

The response includes the same word "apple" three times. I guess "apple" is
indexed into the suggester dictionary on every shard, and with multiple
shards the Suggester component simply concatenates the per-shard responses.

I would like to merge the "apple" entries and get the result sorted by
weight. But I don't know what the weight means, so I cannot merge the
weight values of identical words.
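If the weights do turn out to be per-shard frequencies, merging duplicate terms client-side by summing is straightforward; a minimal sketch (assuming weights are additive, which is an assumption, not documented Suggester behavior):

```python
from collections import defaultdict

# Concatenated per-shard suggestions as (term, weight) pairs,
# using the numbers from the response above
shard_suggestions = [
    ("apple", 9), ("apple", 8), ("apple", 5),
    ("abacus", 14), ("ability", 11), ("absence", 4),
]

merged = defaultdict(int)
for term, weight in shard_suggestions:
    merged[term] += weight  # sum duplicates across shards

ranked = sorted(merged.items(), key=lambda kv: kv[1], reverse=True)
print(ranked)  # [('apple', 22), ('abacus', 14), ('ability', 11), ('absence', 4)]
```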


Suggest does not work on SolrCloud

2017-01-23 Thread Noriyuki TAKEI
Hi, all.

We are running a 3-shard, 2-replica SolrCloud (version 6.3)
managed by ZooKeeper (version 3.4.9).

We have a problem: suggesting via the SpellCheck component
does not work, although I confirmed it works well on a single node.

When not using SolrCloud, I can get the expected result by
sending the query below.

http://hostname:8983/solr/collection/suggest_ja?spellcheck.q=a=json=true=true

But when using SolrCloud, I get no suggested words.

My solrconfig.xml is as below.

  

  suggest_ja
  org.apache.solr.spelling.suggest.Suggester
  org.apache.solr.spelling.suggest.fst.AnalyzingLookupFactory
  suggest_ja
  true
  true
  freq
  suggest
  text_ja_romaji
  true

text_ja_romaji
  

  

  true
  suggest_ja
  false
  10
  true
  true
  false
  suggest

 
   suggest_ja
  terms

  


How do I solve this?
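One thing distributed spellcheck needs that a single node does not is the shards.qt parameter, so that the per-shard sub-requests hit the suggest handler rather than the default /select. A sketch of the request (the /suggest_ja handler path is taken from this thread; treating shards.qt as the missing piece is a guess based on the distributed-spellcheck docs):

```python
from urllib.parse import urlencode

params = {
    "spellcheck.q": "a",
    "wt": "json",
    "indent": "true",
    "spellcheck": "true",
    "shards.qt": "/suggest_ja",  # route per-shard sub-requests to the same handler
}
url = "http://hostname:8983/solr/collection/suggest_ja?" + urlencode(params)
print(url)
```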


Re: Max length of solr query

2017-01-12 Thread Noriyuki TAKEI
Hi, all. I got it. Thanks a lot!!

On Fri, Jan 13, 2017 at 1:56, Shawn Heisey-2 [via Lucene] <
ml-node+s472066n4313737...@n3.nabble.com> wrote:

> On 1/12/2017 9:36 AM, 武井宜行 wrote:
> > My Application throws too large query to solr server with solrj
> > client.(Http Method is Post)
> >
> > I have two questions.
> >
> > At first,I would like to know the limit of clauses of Boolean Query.I Know
> > the number is restricted to 1024 by default, and I can increase the limit
> > by setting setMaxClauseCount,but what is the limit of increasing clauses?
>
> The maximum possible value for maxBooleanClauses is Java's
> Integer.MAX_VALUE -- about 2.1 billion.  Note that if you want to
> increase this setting, you must do it in EVERY configuration.  The
> setting is global, which means that the last core that loads is the one
> that sets it for everything running in that JVM.  If the last core that
> loads happens to be missing the config, it will be set back to 1024.
>
> Some of us have been trying to get this limit lifted, or at least
> arranged so that it doesn't have to be changed on every core, but we've
> been meeting with some resistance.
>
> > Next,if there is no limit of increasing clauses,is there the limit of query
> > length?My Application throws too large query like this with solrj client.
>
> The default size limit on a POST request is 2MB, since about version
> 4.1.  Before that version, it was controlled by the container config,
> not Solr.  This can be adjusted with the formdataUploadLimitInKB setting
> in solrconfig.xml.  The default value for this is 2048, resulting in the
> 2MB I already mentioned.  This page contains the documentation for that
> setting:
>
> https://cwiki.apache.org/confluence/display/solr/RequestDispatcher+in+SolrConfig
>
> Thanks,
> Shawn
>
> --
> http://lucene.472066.n3.nabble.com/Max-length-of-solr-query-tp4313734p4313737.html



--
View this message in context: 
http://lucene.472066.n3.nabble.com/Max-length-of-solr-query-tp4313734p4313740.html
Sent from the Solr - User mailing list archive at Nabble.com.

Re: copying all fields to one specific single value field

2016-12-24 Thread Noriyuki TAKEI
I'm sorry. As everyone pointed out, the field named "content" has a type
defined as multivalued.

I solved this problem.

Thanks

2016-12-24 2:19 GMT+09:00 Erick Erickson [via Lucene] <
ml-node+s472066n4311074...@n3.nabble.com>:

> The problem is exactly as stated, sending multiple fields to the
> _same_ destination in copyField directives requires that the
> destination field be multiValued, so you can't do what you're
> describing unless the destination field is multiValued.
>
> There is no requirement that faceting be performed on single-valued
> fields.
>
> But this seems like an XY problem. From the naming of your fields
> (suggest_facet) I'm not sure you want to deal with faceting at all.
> Are you trying to implement autosuggest perhaps? It would be useful if
> you stated the user problem you're trying to solve, because on the
> surface it looks like you're not going down the best path.
>
> That may just mean I don't understand the problem at all. And I'm
> quite ignorant of what special handling is necessary for autosuggest
> in Japanese so may be way off base here.
>
> Best,
> Erick
>
> On Fri, Dec 23, 2016 at 7:14 AM, KRIS MUSSHORN <[hidden email]> wrote:
>
> > work backwards and look at the type definition for fields named content,
> > title, author, and body. one of them has a type defined as multivalued
> >
> > ----- Original Message -----
> >
> > From: "武井宜行" <[hidden email]>
> > To: [hidden email]
> > Sent: Friday, December 23, 2016 10:05:01 AM
> > Subject: copying all fields to one specific single value field
> >
> > Hi, all
> >
> > I would like to copy all fields to one specific single-value field.
> > The reason is that I must use facet queries, and I think the field used
> > for facet queries needs to be single-valued, not multivalued.
> >
> > To achieve this, I tried to use copyField in schema.xml, but an
> > error occurred.
> >
> > The schema is as below.
> > (I'd like to run facet queries on the "suggest_facet" field.)
> >
> >  />
> >
> > When I tried to index, the following error occurred:
> >
> > 2016-12-22 03:47:38.139 WARN (coreLoadExecutor-6-thread-3) [ ] o.a.s.s.IndexSchema Field suggest_facet is not multivalued and destination for multiple copyFields (6)
> >
> > How do I solve this in order to copy all fields to one specific
> > single-value field?
>
> --
> http://lucene.472066.n3.nabble.com/copying-all-fields-to-one-specific-single-value-field-tp4311062p4311074.html
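For reference, a multiValued copy destination of the kind Erick describes might look like this in schema.xml (the field names come from the thread; the type and attribute values are assumptions):

```xml
<field name="suggest_facet" type="string" indexed="true" stored="false" multiValued="true"/>
<copyField source="title"  dest="suggest_facet"/>
<copyField source="body"   dest="suggest_facet"/>
<copyField source="author" dest="suggest_facet"/>
```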



--
SIOS Technology, Inc.
Engineering Department, Cloud Solution Group
Noriyuki TAKEI
SIOS Bldg., 2-12-3 Minami-Azabu, Minato-ku, Tokyo 106-0047
TEL: 03-6401-5314 (direct) / 03-6401-5117 (department)
URL: http://www.sios.com/

■ The latest SIOS news -- "Like" us here! ■
(SIOS Technology): http://www.facebook.com/SIOSTechnology
(OSS help desk): http://www.facebook.com/OSSyorozu

■ Official Twitter account ■
https://twitter.com/#!/SIOS_Technology




--
View this message in context: 
http://lucene.472066.n3.nabble.com/copying-all-fields-to-one-specific-single-value-field-tp4311062p4311135.html
Sent from the Solr - User mailing list archive at Nabble.com.