More like this only returns id and score issue

2020-04-30 Thread derrick cui
Hi,
I want to return more fields in the moreLikeThis response; how can I achieve this?
Currently the main doc returns all fields, but the moreLikeThis result only has id
and score. Please help.
Thanks
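For readers of the archive, a minimal sketch of such a request, with the collection and field names invented for illustration: mlt.fl picks the fields used to compute similarity, while fl lists the fields requested back for matching documents. Whether the moreLikeThis section honors fl can depend on the Solr version and on whether the MLT search component or the dedicated /mlt handler is used.

```python
from urllib.parse import urlencode

# Sketch of a MoreLikeThis request (collection and field names are
# placeholders): mlt.fl controls the similarity fields, fl the
# returned fields.
params = {
    "q": "id:doc1",
    "mlt": "true",
    "mlt.fl": "title,body",        # fields used for similarity
    "fl": "id,score,title,author", # fields requested back
}
url = "http://localhost:8983/solr/articles/select?" + urlencode(params)
print(url)
```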



how to use multiple update process chain?

2020-04-11 Thread derrick cui
Hi, 
I need to do three tasks:
1. add-unknown-fields-to-the-schema
2. create a composite key
3. remove duplicates for specified fields

I defined update.chain as below, but only the first one works; the others don't.
Please help. Thanks.

  
<!-- Reconstructed: the XML tags were stripped by the list archiver.
     Element and class names below are editorial guesses; only the
     values (chain names, fields, patterns) are from the original post. -->
<lst name="defaults">
  <str name="update.chain">add-unknown-fields-to-the-schema</str>
  <str name="update.chain">composite-id</str>
  <str name="update.chain">deduplicateTaxonomy</str>
</lst>

<updateRequestProcessorChain name="composite-id">
  <processor class="solr.CloneFieldUpdateProcessorFactory">
    <str name="source">_gl_collection</str>
    <str name="dest">_gl_id</str>
  </processor>
  <processor class="solr.CloneFieldUpdateProcessorFactory">
    <str name="source">id</str>
    <str name="dest">_gl_id</str>
  </processor>
  <processor class="solr.ConcatFieldUpdateProcessorFactory">
    <str name="fieldName">_gl_id</str>
    <str name="delimiter">-</str>
  </processor>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>

<updateRequestProcessorChain name="deduplicateTaxonomy">
  <processor class="solr.UniqFieldsUpdateProcessorFactory">
    <str name="fieldRegex">_gl_dp_.*</str>
    <str name="fieldRegex">_gl_ss_score_.*</str>
  </processor>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>
thanks
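A note on the likely cause: the update.chain request parameter selects exactly one chain per update request, so listing three chains in the defaults means only the first is ever applied. The usual fix is to combine all processors into a single chain. A sketch, with processor class names assumed rather than taken from the post:

```xml
<updateRequestProcessorChain name="combined" default="true">
  <!-- class names here are assumed; substitute the real definitions -->
  <processor class="solr.AddSchemaFieldsUpdateProcessorFactory"/>
  <processor class="solr.CloneFieldUpdateProcessorFactory">
    <str name="source">_gl_collection</str>
    <str name="dest">_gl_id</str>
  </processor>
  <processor class="solr.ConcatFieldUpdateProcessorFactory">
    <str name="fieldName">_gl_id</str>
    <str name="delimiter">-</str>
  </processor>
  <processor class="solr.UniqFieldsUpdateProcessorFactory">
    <str name="fieldRegex">_gl_dp_.*</str>
  </processor>
  <processor class="solr.LogUpdateProcessorFactory"/>
  <processor class="solr.RunUpdateProcessorFactory"/>
</updateRequestProcessorChain>
```

With a single chain, every document passes through all three stages in order, which is what the three separate update.chain entries were presumably meant to do.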



indexing slow in solr 8.0.0

2019-07-12 Thread derrick cui
Hi,
I am facing a problem. I just moved my SolrCloud setup from one environment to
another, but performance is extremely slow on the new servers. The only hardware
difference is the CPU; I copied the whole Solr folder from the old environment
to the new one and changed the configuration file.

Before:
hardware: three servers, 8-core CPU, 32G RAM, 300G SSD
indexing 400k docs only needs 5 minutes
collection: 3 shards / 2 replicas / 3 nodes

Now:
hardware: three servers, 4-core CPU, 32G RAM, 300G SSD
indexing 400k docs, less than 1 document per minute
collection: 3 shards / 2 replicas / 3 nodes

Does anyone know what could cause the issue? Thanks in advance.

Re: Add dynamic field to existing index slow

2019-07-02 Thread derrick cui
I have tested the query separately; actually executing the query is pretty fast.
It only took a few minutes to go through all results, including converting Solr
documents to Java objects. So I believe the slowness is on the persistence end.
BTW, I am using a Linux system.


Sent from Yahoo Mail for iPhone


On Sunday, June 30, 2019, 4:52 PM, Shawn Heisey  wrote:

On 6/30/2019 2:08 PM, derrick cui wrote:
> Good point Erick, I will try it today, but I have already used cursorMark in
> my query for deep pagination.
> Also I noticed that my CPU usage is pretty high: 8 cores, usage over 700%.
> I am not sure whether an SSD disk would help.

That depends on whether the load is caused by iowait or by actual CPU usage.

If it's caused by iowait, then SSD would help, but additional memory 
would help more.  Retrieving data from the OS disk cache (which exists 
in main memory) is faster than SSD.

If it is actual CPU load, then it will take some additional poking 
around to figure out which part of your activities causes the load, as 
Erick mentioned.

It's normally a little bit easier to learn these things from Unix-like 
operating systems than from Windows.  What OS are you running Solr on?

Thanks,
Shawn
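To tell the two cases Shawn describes apart on Linux, the cumulative iowait ticks in /proc/stat can be compared against the busy ticks. A small sketch; the sample line below is fabricated for illustration:

```python
# Sketch: distinguish iowait from real CPU load using a /proc/stat
# "cpu" line. On Linux the first line of /proc/stat holds cumulative
# jiffies: user nice system idle iowait irq softirq steal ...
def parse_cpu_line(line):
    fields = ["user", "nice", "system", "idle", "iowait", "irq", "softirq", "steal"]
    values = [int(v) for v in line.split()[1:1 + len(fields)]]
    return dict(zip(fields, values))

sample = "cpu  4705 150 1120 16250 520 20 35 0 0 0"  # made-up example line
ticks = parse_cpu_line(sample)
busy = sum(v for k, v in ticks.items() if k not in ("idle", "iowait"))
print(ticks["iowait"], busy)  # high iowait points at disk; high busy at CPU
```

Sampling the line twice and diffing the counters gives the load over an interval, which is what tools like iostat report.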





Re: Add dynamic field to existing index slow

2019-06-30 Thread derrick cui
Good point Erick, I will try it today, but I have already used cursorMark in my
query for deep pagination.
Also I noticed that my CPU usage is pretty high: 8 cores, usage over 700%. I am
not sure whether an SSD disk would help.


Sent from Yahoo Mail for iPhone


On Sunday, June 30, 2019, 2:57 PM, Erick Erickson  
wrote:

Well, the first thing I’d do is see what’s taking the time, querying or 
updating? Should be easy enough to comment out whatever it is that sends docs 
to Solr.

If it’s querying, it sounds like you’re paging through your entire data set and 
may be hitting the “deep paging” problem. Use cursorMark in that case.

Best,
Erick

> On Jun 30, 2019, at 9:12 AM, Alexandre Rafalovitch  wrote:
> 
> Only thing I can think of is to check whether you can do in-place
> rather than atomic updates:
> https://lucene.apache.org/solr/guide/8_1/updating-parts-of-documents.html#in-place-updates
> But the conditions are quite restrictive: non-indexed
> (indexed="false"), non-stored (stored="false"), single valued
> (multiValued="false") numeric docValues (docValues="true") field
> 
> The other option may be to use an external value field and not update
> Solr documents at all:
> https://lucene.apache.org/solr/guide/8_1/working-with-external-files-and-processes.html
> 
> Regards,
>  Alex.
> 
> On Sun, 30 Jun 2019 at 10:53, derrick cui
>  wrote:
>> 
>> Thanks Alex,
>> My usage is:
>> 1. I execute the query and get the result, returning id only
>> 2. Add a value to a dynamic field
>> 3. Save to Solr with batch size 1000
>> I have defined 50 queries and run them in parallel. Also I disable hard
>> commit and soft-commit per 1000 docs.
>> 
>> I am wondering whether any configuration can speed it
>> 
>> 
>> 
>> 
>> Sent from Yahoo Mail for iPhone
>> 
>> 
>> On Sunday, June 30, 2019, 10:39 AM, Alexandre Rafalovitch 
>>  wrote:
>> 
>> Indexing new documents is just adding additional segments.
>> 
>> Adding new field to a document means:
>> 1) Reading existing document (may not always be possible, depending on
>> field configuration)
>> 2) Marking existing document as deleted
>> 3) Creating new document with reconstructed plus new fields
>> 4) Possibly triggering a merge if a lot of documents have been updated
>> 
>> Perhaps the above is a contributing factor. But I also feel that maybe
>> there is some detail in your question I did not fully understand.
>> 
>> Regards,
>>  Alex.
>> 
>> On Sun, 30 Jun 2019 at 10:33, derrick cui
>>  wrote:
>>> 
>>> I have 400k documents; indexing is pretty fast, taking only 10 minutes, but
>>> adding a dynamic field to all documents according to query results is very
>>> slow, taking about 1.5 hours.
>>> Anyone knows what could be the reason?
>>> Thanks
>>> 
>>> 
>>> 
>>> Sent from Yahoo Mail for iPhone
>> 
>> 
>> 
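Erick's cursorMark suggestion above can be sketched as follows; the query and field names are placeholders. The key constraints: the sort must end with the uniqueKey field as a tiebreaker, cursorMark starts at "*", and iteration stops when the returned nextCursorMark equals the cursor that was sent.

```python
from urllib.parse import urlencode

# Sketch of cursorMark deep paging (query and field names assumed).
def cursor_params(cursor, rows=1000):
    return {
        "q": "*:*",
        "fl": "id",
        "rows": rows,
        "sort": "score desc, id asc",  # must end with the uniqueKey field
        "cursorMark": cursor,          # "*" starts a new cursor
    }

# First page; later requests reuse the nextCursorMark from each response.
print(urlencode(cursor_params("*")))
```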





Re: Add dynamic field to existing index slow

2019-06-30 Thread derrick cui
Thanks Alex,
My usage is:
1. I execute the query and get the result, returning id only
2. Add a value to a dynamic field
3. Save to Solr with batch size 1000

I have defined 50 queries and run them in parallel. Also I disable hard commit
and soft-commit per 1000 docs.

I am wondering whether any configuration can speed it up.




Sent from Yahoo Mail for iPhone


On Sunday, June 30, 2019, 10:39 AM, Alexandre Rafalovitch  
wrote:

Indexing new documents is just adding additional segments.

Adding new field to a document means:
1) Reading existing document (may not always be possible, depending on
field configuration)
2) Marking existing document as deleted
3) Creating new document with reconstructed plus new fields
4) Possibly triggering a merge if a lot of documents have been updated

Perhaps the above is a contributing factor. But I also feel that maybe
there is some detail in your question I did not fully understand.

Regards,
  Alex.

On Sun, 30 Jun 2019 at 10:33, derrick cui
 wrote:
>
> I have 400k documents; indexing is pretty fast, taking only 10 minutes, but
> adding a dynamic field to all documents according to query results is very
> slow, taking about 1.5 hours.
> Anyone knows what could be the reason?
> Thanks
>
>
>
> Sent from Yahoo Mail for iPhone
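The per-document update step described in this thread is normally sent as an atomic update. A sketch of the JSON payload, with the dynamic field name invented for illustration (as Alex notes, only non-indexed, non-stored, single-valued docValues numeric fields qualify for the cheaper in-place path; everything else is a delete-and-reindex under the hood):

```python
import json

# Sketch of an atomic-update payload: "set" replaces the field value on
# the existing document instead of resending the whole document. The
# dynamic field name category_s is a placeholder.
docs = [{"id": "doc1", "category_s": {"set": "finance"}}]
payload = json.dumps(docs)
print(payload)
```

Batching many such documents per request (the poster uses 1000) amortizes the HTTP overhead, but each document still pays the reindex cost.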





Add dynamic field to existing index slow

2019-06-30 Thread derrick cui
I have 400k documents; indexing is pretty fast, taking only 10 minutes, but
adding a dynamic field to all documents according to query results is very slow,
taking about 1.5 hours.
Anyone knows what could be the reason?
Thanks



Sent from Yahoo Mail for iPhone


solrcloud 8.0.0 - debugQuery=on, exception

2019-06-19 Thread derrick cui
Hi all,
I run a query in the Solr admin UI; it's OK if debugQuery=off, but it throws an
exception if I set debugQuery=on. Please help, thanks.
Here is the stack trace:
2019-06-20 11:33:29.529 ERROR (qtp769429195-2324) [c:article s:shard2 r:core_node9 x:article_shard2_replica_n6] o.a.s.h.RequestHandlerBase java.lang.NullPointerException
        at java.util.Objects.requireNonNull(Objects.java:203)
        at org.apache.lucene.search.LeafSimScorer.<init>(LeafSimScorer.java:38)
        at org.apache.lucene.search.spans.SpanWeight.explain(SpanWeight.java:160)
        at org.apache.lucene.search.BooleanWeight.explain(BooleanWeight.java:81)
        at org.apache.lucene.search.IndexSearcher.explain(IndexSearcher.java:707)
        at org.apache.lucene.search.IndexSearcher.explain(IndexSearcher.java:684)
        at org.apache.solr.search.SolrIndexSearcher.explain(SolrIndexSearcher.java:2216)
        at org.apache.solr.util.SolrPluginUtils.getExplanations(SolrPluginUtils.java:453)
        at org.apache.solr.util.SolrPluginUtils.doStandardResultsDebug(SolrPluginUtils.java:382)
        at org.apache.solr.util.SolrPluginUtils.doStandardDebug(SolrPluginUtils.java:343)
        at org.apache.solr.handler.component.DebugComponent.process(DebugComponent.java:100)
        at org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:306)
        at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:199)
        at org.apache.solr.core.SolrCore.execute(SolrCore.java:2559)
        at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:711)
        at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:516)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:394)
        at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:340)
        at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1602)
        at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:540)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:146)
        at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:257)
        at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1588)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:255)
        at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1345)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
        at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
        at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1557)
        at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
        at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
        at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
        at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:220)
        at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
        at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)
        at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
        at org.eclipse.jetty.server.Server.handle(Server.java:502)
        at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:364)
        at org.eclipse.jetty.server.HttpChannel.run(HttpChannel.java:305)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:132)
        at org.eclipse.jetty.http2.HTTP2Connection.produce(HTTP2Connection.java:171)
        at org.eclipse.jetty.http2.HTTP2Connection.onFillable(HTTP2Connection.java:126)
        at org.eclipse.jetty.http2.HTTP2Connection$FillableCallback.succeeded(HTTP2Connection.java:338)
        at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
        at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:118)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
        at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)

Please help on pdate type during indexing

2019-06-02 Thread derrick cui
Hi all,
I spent the whole day indexing my data into Solr (8.0), but one field whose type
is pdate always fails:

error adding field 'UpdateDate'='org.apache.solr.common.SolrInputField:UpdateDate=2019-06-03T05:22:14.842Z' msg=Invalid Date in Date Math String:'org.apache.solr.common.SolrInputField:UpdateDate=2019-06-03T05:22:14.842Z',, retry=0 commError=false errorCode=400

I have put the timezone in the string, please help.
Thanks
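An observation on the error: the rejected "date" string embeds the SolrInputField's toString ('org.apache.solr.common.SolrInputField:UpdateDate=...'), which suggests the client passed the field object itself as the value rather than the date string inside it. The date itself, 2019-06-03T05:22:14.842Z, is already in the ISO-8601 UTC form pdate expects. A formatting sketch:

```python
from datetime import datetime, timezone

# Sketch: format a datetime the way Solr's pdate type expects, i.e.
# ISO-8601 in UTC with a trailing "Z" and millisecond precision,
# e.g. 2019-06-03T05:22:14.842Z.
def to_solr_date(dt):
    utc = dt.astimezone(timezone.utc)
    return utc.strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"

d = datetime(2019, 6, 3, 5, 22, 14, 842000, tzinfo=timezone.utc)
print(to_solr_date(d))  # 2019-06-03T05:22:14.842Z
```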

How to define nested document schema

2019-05-20 Thread derrick cui
Hi, I have a nested document, how should I define this schema?
How to use addChildDocument in solr-solrj?
Thanks
Derrick

Sent from Yahoo Mail for iPhone
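For the JSON side of this question, child documents can be nested under the _childDocuments_ key of the parent; the field names below are invented for illustration. On the SolrJ side, SolrInputDocument.addChildDocument attaches a child document to its parent before the batch is sent.

```python
import json

# Sketch of a parent/child (nested) document in Solr's JSON update
# format; _childDocuments_ holds the children. Field names are made up.
parent = {
    "id": "book1",
    "type_s": "book",
    "_childDocuments_": [
        {"id": "book1-review1", "type_s": "review", "stars_i": 5},
    ],
}
print(json.dumps(parent))
```

Note that the schema needs the _root_ field defined for block join queries over such nested documents to work.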


Execute query against to one document

2019-05-16 Thread Derrick Cui
Hi,
I have a use case but don't know how to implement it, please help.

I have one large data file, let's say 500b data, which doesn't have a
category in the source. What I want to do is execute a query on the
indexed documents; if the query hits are greater than 0, add a category
field and save to Solr.

Currently I do it in two steps: indexing, then querying the collection and
adding the field if hits are greater than 0, but it takes several days to
complete.

Any idea or solution, please.

Thanks
-- 
Regards,

Derrick Cui
Email: derrick...@gmail.com
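One way to test a single document against a category query is to restrict the query with an fq on the uniqueKey and ask only for the hit count; the query and field names below are placeholders:

```python
from urllib.parse import urlencode

# Sketch: test one document against a category query by filtering on its
# id and requesting rows=0; numFound > 0 in the response means a match.
def match_params(category_query, doc_id):
    return {
        "q": category_query,
        "fq": f"id:{doc_id}",
        "rows": 0,  # only numFound is needed, not the documents
    }

print(urlencode(match_params("title:finance", "doc1")))
```

Done per document this is still one request per document per category, which is why the batch approach in the other thread (run each category query once, collect ids, atomically update in batches) tends to scale better.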


query keyword but no result (solr 8)

2019-05-13 Thread Derrick Cui
Hi,

I am trying to set up SolrCloud. I can index a few documents successfully,
but I cannot get results if I search for a keyword (without a field). If I
use field:keyword, I do get results.

Any idea why I get this issue?

Thank you

-- 
Regards,

Derrick Cui
Email: derrick...@gmail.com
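A likely cause, offered as a guess: a query without a field prefix searches the default field, set by the df parameter or via a copyField into a catch-all field (often _text_ in the default configset); if neither covers the indexed data, bare keywords find nothing. Passing df explicitly is an easy way to test; "title" below is a placeholder field name:

```python
from urllib.parse import urlencode

# Sketch: df names the field searched when the query has no field
# prefix, so this is equivalent to q=title:keyword.
params = {"q": "keyword", "df": "title"}
print(urlencode(params))  # q=keyword&df=title
```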