Error when running ant package in solr 7.2.1 from source

2018-03-27 Thread C F
I'm unable to run the ant package target in solr-7.2.1-src.tgz. Any ideas?
Is git required now?


$ ant package
Buildfile: /home/local/fred/solr-7.2.1/solr/build.xml

init-dist:

package-src-tgz:
 [exec] fatal: Not a git repository (or any parent up to mount point /)
 [exec] Stopping at filesystem boundary
(GIT_DISCOVERY_ACROSS_FILESYSTEM not set).

BUILD FAILED
/home/local/fred/solr-7.2.1/solr/build.xml:487: The following error
occurred while executing this line:
/home/local/fred/solr-7.2.1/lucene/common-build.xml:2305: exec returned: 128

Total time: 0 seconds

Thanks,
Carlton


Re: Default Index config

2018-03-27 Thread Shawn Heisey

On 3/27/2018 9:35 PM, mganeshs wrote:

I am using the default configuration, since most Solr experts say the
defaults suit the majority of cases. We changed only the commit settings,
using 15000 ms for hard commits and 1 second for soft commits. All other
settings, such as locking, deletion policy, merging and directory, are left
at their defaults.


One second for autoSoftCommit has a tendency to cause a lot of issues.  
That interval should be set as large as you can tolerate for change 
visibility.  I personally would want to see a setting there of at least 
one minute, but I know that this is not fast enough for a lot of users.


Does autoCommit have openSearcher set to false?  This is a typical 
configuration, and recommended.
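
For illustration only: assuming the stock solrconfig.xml, which reads these
values from system properties, the intervals can also be set from solr.in.sh,
so they are easy to experiment with (the numbers below are just an example):

# solr.in.sh -- example values only; they feed the
# ${solr.autoCommit.maxTime:...} and ${solr.autoSoftCommit.maxTime:...}
# placeholders in the stock solrconfig.xml
SOLR_OPTS="$SOLR_OPTS -Dsolr.autoCommit.maxTime=15000"
SOLR_OPTS="$SOLR_OPTS -Dsolr.autoSoftCommit.maxTime=60000"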



One strange thing we noticed after moving from Solr 5.x to Solr 6.5.1 is that
CPU and RAM usage increased drastically. We have two Solr nodes, one for data
and another for its replica; each is an EC2 r4.xlarge machine. We have about
six collections, each carrying around 5GB of data on average, and a couple of
the collections also receive frequent updates.


That kind of instance has 30.5 GB of total memory.  With 8GB assigned to 
the heap, that leaves about 22GB for everything else.  If Solr is the 
only thing running on the machine, and your numbers mean that each 
server has about 30GB of index data, then you can get about two thirds 
of the index into the OS disk cache.  Usually this is enough for decent 
performance, but that's not the case for everyone.  If this is an 
accurate picture of your Solr install, I am not really worried about 
total memory size, but I can't say for certain without more accurate 
information.


The way you're phrasing things, it sounds like you're running in 
SolrCloud mode.  Is that correct?


Since you're probably running Solr on Linux, can you get me a screenshot 
from the "top" program?  It must be specifically that program -- don't 
use an alternate utility like "htop".


Run top, press shift-M to sort the display by memory, and then grab a 
screenshot or photo of the display.  Share that file somewhere and 
provide a URL to access it.  With that info, we can get a good picture 
of system health.  Be sure that when that "top" display is grabbed, that 
the system is experiencing whatever problems you're trying to fix.
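
If grabbing a screenshot is awkward, a batch run of the same program sorted 
by memory gives equivalent text output that you can attach instead (this 
assumes a reasonably recent procps top that understands the -o switch):

# one batch iteration of top, sorted by resident memory, saved to a file
top -b -o %MEM -n 1 | head -n 40 > top-by-memory.txt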


What do you know about your query load and how fast the data is being 
indexed?



In Solr 5.x we didn't see this much RAM and CPU usage. CPU is always at 80 to
90% even when we index or update only around 50 docs in one shot, and RAM
fills up to whatever we give it. We started with an 8GB heap, and it is always
at the full 8GB. Initially we were using the CMS GC and then tried G1. The
only difference is that with CMS the CPU goes to 80% right after starting
Solr, whereas with G1 it is around 10% after startup; once load arrives
(around 100 to 200 docs in 5 minutes) it goes to 90% with both CMS and G1.


The default garbage collection tuning does a pretty good job most of the 
time.  For some people, G1 works even better, but this can only happen 
if it is tuned.  Simply turning on G1GC and leaving it untuned will 
probably get you WORSE performance than leaving the GC tuning at the 
settings that the project has provided.



When we profiled the Solr process, we found that merging keeps happening.


Can you elaborate more on exactly what you observed, and what 
conclusions you came to based on that information?



We are seeing these CPU and memory spikes only after moving to 6.5.1. Is that
a stable version? Will moving to the latest stable release solve this, or are
we missing something with respect to configuration? Do we need to change the
Solr default config?


Virtually all releases are considered stable at the time of release.  If 
they weren't, then they wouldn't be released!  I'm only aware of two 
times that versions were released without being called "stable."  Those 
were the 4.0-ALPHA and 4.0-BETA releases.  There have been no ALPHA or 
BETA releases since then.


Pretty much every version is later revealed to have bugs, and sometimes 
those problems are big enough that the release could be called 
unstable.  But that is a determination made AFTER release, not before.


6.5.1 was announced on 27 April 2017.  It is a bugfix release, mostly 
identical to 6.5.0, which was announced exactly one month earlier, on 27 
March 2017.  That makes the release about a year old.


There are a number of bugfixes in 6.5.1 compared to 6.5.0, but the two 
are substantially similar.


http://mail-archives.apache.org/mod_mbox/www-announce/201704.mbox/%3CCAKUpjcQiYLZ1a+ZM=ohuxjywlj9vnub8q9b_ilrvmkididv...@mail.gmail.com%3E

If you're going to run 6.x, then you should run 6.6.3.  The latest 
version currently out is 7.2.1.  Version 7.3.0 is just around the corner 
-- release is underway right now.  I haven't looked through the 
changelogs, so I don't really know what's coming.


Thanks,
Shawn



Re: Copying a SolrCloud collection to other hosts

2018-03-27 Thread David Smiley
The backup/restore API is intended to address this.
https://builds.apache.org/job/Solr-reference-guide-master/javadoc/making-and-restoring-backups.html

Erick's advice is good (and I once drafted docs for the same scheme years
ago as well), but I consider it dated -- it's what people had to do before
the backup/restore API existed.  Internally, backup/restore is doing
similar stuff.  It's easy to give backup/restore a try; surely you have by
now?
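
For example, only a sketch (the collection name, backup name and location here
are made up, and the location must be a shared filesystem or a configured
backup repository visible to every node):

# on the cluster that built the index
curl 'http://source-host:8983/solr/admin/collections?action=BACKUP&name=main_index_bak&collection=main_index&location=/mnt/shared/solr-backups'

# on the cluster you want to copy it to
curl 'http://target-host:8983/solr/admin/collections?action=RESTORE&name=main_index_bak&collection=main_index&location=/mnt/shared/solr-backups'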

~ David

On Tue, Mar 6, 2018 at 9:47 AM Patrick Schemitz  wrote:

> Hi List,
>
> so I'm running a bunch of SolrCloud clusters (each cluster is: 8 shards
> on 2 servers, with 4 instances per server, no replicas, i.e. 1 shard per
> instance).
>
> Building the index afresh takes 15+ hours, so when I have to deploy a new
> index, I build it once, on one cluster, and then copy (scp) over the
> data//index directories (shutting down the Solr instances
> first).
>
> I could get Solr 6.5.1 to number the shard/replica directories nicely via
> the createNodeSet and createNodeSet.shuffle options:
>
> Solr 6.5.1 /var/lib/solr:
>
> Server node 1:
> instance00/data/main_index_shard1_replica1
> instance01/data/main_index_shard2_replica1
> instance02/data/main_index_shard3_replica1
> instance03/data/main_index_shard4_replica1
>
> Server node 2:
> instance00/data/main_index_shard5_replica1
> instance01/data/main_index_shard6_replica1
> instance02/data/main_index_shard7_replica1
> instance03/data/main_index_shard8_replica1
>
> However, while attempting to upgrade to 7.2.1, this numbering has changed:
>
> Solr 7.2.1 /var/lib/solr:
>
> Server node 1:
> instance00/data/main_index_shard1_replica_n1
> instance01/data/main_index_shard2_replica_n2
> instance02/data/main_index_shard3_replica_n4
> instance03/data/main_index_shard4_replica_n6
>
> Server node 2:
> instance00/data/main_index_shard5_replica_n8
> instance01/data/main_index_shard6_replica_n10
> instance02/data/main_index_shard7_replica_n12
> instance03/data/main_index_shard8_replica_n14
>
> This new numbering breaks my copy script, and furthermore, I'm worried
> as to what happens when the numbering is different among target clusters.
>
> How can I switch this back to the old numbering scheme?
>
> Side note: is there a recommended way of doing this? Is the
> backup/restore mechanism suitable for this? The ref guide is kind of terse
> here.
>
> Thanks in advance,
>
> Ciao, Patrick
>
-- 
Lucene/Solr Search Committer, Consultant, Developer, Author, Speaker
LinkedIn: http://linkedin.com/in/davidwsmiley | Book:
http://www.solrenterprisesearchserver.com


Re: InetAddressPoint support in Solr or other IP type?

2018-03-27 Thread David Smiley
(I overlooked your reply; sorry to leave you hanging)

From a simplicity standpoint, just use InetAddressPoint.  Solr has no
rules/restrictions as to which Lucene module it's in.

That said, I *suspect* a Terms PrefixTree aligned to each byte would offer
better query performance, presuming that typical range queries are
byte-to-byte (as they would be for IPs?).  The Points API internally makes
the splitting decision, and it's not customizable.  It's blind to how
people will realistically query the data; it just wants a balanced tree.
For the same reason, I *suspect* (but have not benchmarked to see) that
DateRangeField has better query performance than DatePointField.  That
said, a Points index is probably going to be leaner & faster to index.
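
If anyone wants to measure that on a test collection, here is a rough sketch
using the Schema API (the collection and field names are made up, and "pdate"
assumes the default configset's DatePointField type):

# add a DateRangeField-based field and a DatePointField-based field
curl -X POST -H 'Content-type:application/json' \
  'http://localhost:8983/solr/testcoll/schema' -d '{
    "add-field-type": {"name":"daterange", "class":"solr.DateRangeField"},
    "add-field":      {"name":"ts_range", "type":"daterange", "stored":true},
    "add-field":      {"name":"ts_point", "type":"pdate", "stored":true}
  }'

# a comparable range expressed against each field
curl 'http://localhost:8983/solr/testcoll/select' --data-urlencode 'q=ts_range:[2016 TO 2017]'
curl 'http://localhost:8983/solr/testcoll/select' --data-urlencode 'q=ts_point:[2016-01-01T00:00:00Z TO 2017-12-31T23:59:59Z]'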

~ David

On Fri, Mar 23, 2018 at 7:51 PM Mike Cooper  wrote:

> Thanks David. Is there a reason we wouldn't want to base the Solr
> implementation on the InetAddressPoint class?
>
>
> https://lucene.apache.org/core/7_2_1/misc/org/apache/lucene/document/InetAddressPoint.html
>
> I realize that is in the "misc" package for now, so it's not part of core
> Lucene. But it is nice in that it has one class for both ipv4 and ipv6 and
> it's based on point numerics rather than trie numerics which seem to be
> deprecated. I'm pretty familiar with the code base, I could take a stab at
> implementing this. I just wanted to make sure there wasn't something I was
> missing since I couldn't find any discussion on this.
>
> Michael Cooper
>
> -Original Message-
> From: David Smiley [mailto:david.w.smi...@gmail.com]
> Sent: Friday, March 23, 2018 5:14 PM
> To: solr-user@lucene.apache.org
> Subject: Re: InetAddressPoint support in Solr or other IP type?
>
> Hi,
>
> For IPv4, use TrieIntField with precisionStep=8
>
> For IPv6 https://issues.apache.org/jira/browse/SOLR-6741   There's nothing
> there yet; you could help out if you are familiar with the codebase.  Or
> you
> might try something relatively simple involving edge ngrams.
>
> ~ David
>
> On Thu, Mar 22, 2018 at 1:09 PM Mike Cooper 
> wrote:
>
> > I have scoured the web and cannot find any discussion of having the
> > Lucene InetAddressPoint type exposed in Solr. Is there a reason this
> > is omitted from the Solr supported types? Is it on the roadmap? Is
> > there an alternative recommended way to index and store Ipv4 and Ipv6
> > addresses for optimal range searches and subnet searches? Thanks for your
> > help.
> >
> >
> >
> > *Michael Cooper*
> >
> --
> Lucene/Solr Search Committer, Consultant, Developer, Author, Speaker
> LinkedIn: http://linkedin.com/in/davidwsmiley | Book:
> http://www.solrenterprisesearchserver.com
>
-- 
Lucene/Solr Search Committer, Consultant, Developer, Author, Speaker
LinkedIn: http://linkedin.com/in/davidwsmiley | Book:
http://www.solrenterprisesearchserver.com


Re: Default Index config

2018-03-27 Thread mganeshs
Hi Shawn,

Thanks for detail mail. Yes I am behind the IndexConfig only.

Regarding 5GB size of collection, it's not one document. It has almost 3M of
docs in that collection. 

I am using the default configuration, since most Solr experts say the
defaults suit the majority of cases. We changed only the commit settings,
using 15000 ms for hard commits and 1 second for soft commits. All other
settings, such as locking, deletion policy, merging and directory, are left
at their defaults.

One strange thing we noticed after moving from Solr 5.x to Solr 6.5.1 is that
CPU and RAM usage increased drastically. We have two Solr nodes, one for data
and another for its replica; each is an EC2 r4.xlarge machine. We have about
six collections, each carrying around 5GB of data on average, and a couple of
the collections also receive frequent updates.

In Solr 5.x we didn't see this much RAM and CPU usage. CPU is always at 80 to
90% even when we index or update only around 50 docs in one shot, and RAM
fills up to whatever we give it. We started with an 8GB heap, and it is always
at the full 8GB. Initially we were using the CMS GC and then tried G1. The
only difference is that with CMS the CPU goes to 80% right after starting
Solr, whereas with G1 it is around 10% after startup; once load arrives
(around 100 to 200 docs in 5 minutes) it goes to 90% with both CMS and G1.

When we profiled the Solr process, we found that merging keeps happening.

We are seeing these CPU and memory spikes only after moving to 6.5.1. Is that
a stable version? Will moving to the latest stable release solve this, or are
we missing something with respect to configuration? Do we need to change the
Solr default config?


Any advice would be appreciated.




--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html


Routing a subquery directly to the shard a document came from

2018-03-27 Thread Jeff Wartes

I have a large 7.2 index with nested documents and many shards.
For each result (parent doc) in a query, I want to gather a relevance-ranked 
subset of the child documents. It seemed like the subquery transformer would be 
ideal: 
https://lucene.apache.org/solr/guide/7_2/transforming-result-documents.html#TransformingResultDocuments-_subquery_
(the [child] transformer allows for a filter, but the results have an 
effectively random sort)

So maybe something like this:
q=
fl=id,subquery:[subquery]
subquery.q=
subquery.fq={!cache=false} +{!terms f=_root_ v=$row.id}
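
Spelled out as a full request it looks roughly like this (the parent and child
queries, and the host/collection names, are placeholders):

curl 'http://somehost:8983/solr/mycollection/select' \
  --data-urlencode 'q=doctype:parent AND title:foo' \
  --data-urlencode 'fl=id,subquery:[subquery]' \
  --data-urlencode 'subquery.q=childtext:foo' \
  --data-urlencode 'subquery.fq={!cache=false} +{!terms f=_root_ v=$row.id}' \
  --data-urlencode 'subquery.rows=5'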

This actually works fine, but there’s a lot more work going on than necessary. 
Say we have X shards and get N documents back:

Query http requests = 1 top-level query + X distributed shard-requests
Subquery http requests = N rows + N * X distributed shard-requests
So with N=10 results and X=50 shards, that is: 1+50+10+500 = 561 http requests 
through the cluster.

Some of that is unavoidable, of course, but it occurs to me that all the child 
docs are indexed in the same shard (segment) that the parent doc is. Meaning 
that if you know the parent doc id, (and I do) you can use the document routing 
to know exactly which shard to send the subquery request to. This would save 
490 of the http requests in the scenario above.

Is there any form of query that allows for explicitly following the document 
routing rules for a given document ID?

I’m aware of the “distrib=false” and “shards=foo” parameters, but using those 
would require me to recreate the document routing in the client.
There’s also the “fl=[shard]” thing, but that would still require me to handle 
the subqueries in the client.




Re: edit gc parameters in solr.in.sh or solr?

2018-03-27 Thread Shawn Heisey
On 3/27/2018 12:13 AM, Bernd Fehling wrote:
> may I advise you _NOT_ to set -XX:G1HeapRegionSize.
> That is computed at JVM start according to the heap size and available
> memory.
> A wrongly chosen size can force even a huge machine with a 31GB heap and
> 157GB of RAM into OOM.
> Guess how I figured that out; it took me about one week to locate it.

I have some notes on why I included that parameter on my wiki page.

https://wiki.apache.org/solr/ShawnHeisey#G1_.28Garbage_First.29_Collector

Basically, the filterCache entries were being marked as humongous
allocations, because each one for my indexes is over 2MB in size. 
Apparently it takes a full collection to collect humongous allocations
that become garbage, at least in the versions of Java that I was
experimenting with.  So without that parameter, full GCs were required,
and that will always make GC slow unless the heap size is very small.

If Oracle has made it so that humongous allocations can be collected by
the generation-specific collectors, then that parameter may no longer be
required in newer Java versions.  I do not know if this has happened.
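
One way to check on a given system, as a sketch (these are standard Java 8 GC
flags, but the exact log output varies by build), is to turn on G1's
ergonomics logging in solr.in.sh and look for humongous allocation requests
in the GC log:

# solr.in.sh, Java 8 with G1: log ergonomics decisions, including
# "humongous allocation request" lines, so you can see whether large
# filterCache entries are being treated as humongous
GC_LOG_OPTS="-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintAdaptiveSizePolicy"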

Thanks,
Shawn



Re: Help Needed - Indexing Related

2018-03-27 Thread Shawn Heisey
On 3/27/2018 6:08 AM, YELESWARAPU, VENKATA BHAN wrote:
> Hope you are doing well. I have been struggling with indexing for a week now.
> Yesterday I deleted all indexing files and tried re-indexing. It failed 
> saying unable to open a new searcher. Also that _0.si file is missing.
> Today I redeployed the whole application and tried indexing. Now facing the 
> below issues.
> If you could guide me on this or if there is any documentation around this, 
> that would greatly help. Appreciate your time on this.
>
> 2018-03-27 07:53:59,896 DEBUG (DefaultdatabaseFunctions.java:319) - lock 
> [SolrIndexingJobReadFromQueue] acquired
> 2018-03-27 07:53:59,924 DEBUG (SolrIndexingJob.java:193) - done sleeping
> 2018-03-27 07:53:59,929 DEBUG (DefaultdatabaseFunctions.java:313) - lock 
> [SolrIndexingJobReadFromQueue] already exists, will try updating it now
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
> Alerts.CWI_096850 not fetched because its identifier appears to be 
> already in processing
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
> Alerts.CWI_096850 not fetched because its identifier appears to be 
> already in processing
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
> Alerts.CWI_096850 not fetched because its identifier appears to be 
> already in processing
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
> Alerts.CWI_096854 not fetched because its identifier appears to be 
> already in processing
>
> 2018-03-27 07:54:31,128 WARN  (SolrIndexingJob.java:107) - Solr indexing job 
> failed
> java.lang.IndexOutOfBoundsException: Index: 16, Size: 10

None of those logging messages are coming from Solr classes.  It all
appears to be third party code.  There are no actual errors in the log,
but there is one log entry at WARN.  In the stacktrace for that log
entry, I see the following classes:
"com.actimize.solr.indexing.SolrAlertsIndexer" and
"com.actimize.dao.DaoUtil".  The latter class is the first class in the
stacktrace not pointing at native Java code, so the
IndexOutOfBoundsException is likely happening in that code.

I was unable to find information about those classes with Google, so I'm
betting that it's custom code that your company has developed or
contracted to be developed.  You're going to need to talk to whoever
maintains the "com.actimize" code.  Give them the stacktrace, and ask
them to dig into why the error occurred.  If it happened because Solr
didn't respond with what they expected, then we can look into that problem.

Thanks,
Shawn



Re: Solr 4.9 - configs and collections

2018-03-27 Thread Abhi Basu
Thanks for the explanations, very helpful.

One more question, what is the sequence to delete the collection? If I use
the rest api to delete the collection, then when I go to create it again, I
sometimes get an error message saying shard already present. How to clean
up the underlying directories on all the nodes?
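
For reference, the delete call is the plain Collections API one, roughly:

curl 'http://headnode0:8983/solr/admin/collections?action=DELETE&name=ems-collection'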

Thanks,

Abhi

On Mon, Mar 26, 2018 at 6:22 PM, Shawn Heisey  wrote:

> On 3/26/2018 8:43 AM, Abhi Basu wrote:
> > Running on MS HDInsight and Solr 4.9. What is the BKM for creation,
> > update, delete of configurations and collections?
>
> I have no idea what a BKM is.  I will cover the update of configuration
> below.
>
> > I do the following:
> >
> > 1. First I create the zk config:
> > sudo zkcli.sh -cmd upconfig -zkhost zknode
> >  >:2181
> > -confdir /home/sshuser/ems-collection-49/conf/ -confname ems-collection
>
> Exactly what you've got configured there for the zkhost parameter is
> difficult to decipher because it looks like the hostname got replaced
> with a URL by your mail client.  But I think you've only got one ZK
> server there.  Usually there are at least three of them.  The command
> actually only needs one, but the zkHost string usually has at least
> three.  It's generally a good idea to use the same string for zkcli that
> you use for Solr itself, so it works even when a server is down.
>
> > 2. Then I create the collection:
> > curl '
> > http://headnode0:8983/solr/admin/collections?action=
> CREATE=ems-collection=2=
> 2=1
> > '
> >
> > This works the first time. When I change the zk config, do I run the same
> > command #1? Also, do I do a reload:
>
> Yes, if you want to change an existing config and then make it active,
> you re-upload the config and then reload any affected collection.
> Deleting and recreating the collection is not something you would want
> to do unless you plan to completely rebuild it anyway -- deleting the
> collection will also delete all the index data.  If that's what you
> WANT, then deleting and recreating the collection is a good way to make
> it happen.  Many config updates *do* require a reindex, and some changes
> will also require completely deleting the index directories before
> building it again.
>
> > Very familiar with CDH solrctl commands that make life easier by only
> > having one command for this. Any help is appreciated.
>
> If you're using CDH, you'll want to talk to Cloudera for help.  They
> customize their Solr install to the point where they're the only ones
> who know how to use it properly.
>
> Thanks,
> Shawn
>
>


-- 
Abhi Basu


Re: How to escape OR or any other keyword in solr

2018-03-27 Thread Steven White
What about the case where matching needs to be case-sensitive, so the analyzer
does not use LowerCaseFilterFactory?  Is there a solution for this?

Steve

On Tue, Mar 27, 2018 at 4:22 AM, RAUNAK AGRAWAL 
wrote:

> Hi Peter,
>
> Yes, I am using the stopword file which has *or *in it. Thanks for pointing
> out. Will remove it from the stopword file and test it again. Thank you
> very much!!
>
> On Tue, Mar 27, 2018 at 1:17 PM, Peter Lancaster <
> peter.lancas...@findmypast.com> wrote:
>
> > Hi Raunak,
> >
> > Are you using a stop word file? That might be why you're getting 0
> results
> > searching for "OR".
> >
> > Cheers,
> > Peter.
> >
> > -Original Message-
> > From: RAUNAK AGRAWAL [mailto:agrawal.rau...@gmail.com]
> > Sent: 27 March 2018 07:45
> > To: solr-user@lucene.apache.org
> > Subject: How to escape OR or any other keyword in solr
> >
> > I have to search for state "OR" [short form for Oregon]. When I am making
> > query state:OR, I am getting SolrException since it is recognising it as
> > keyword.
> >
> > Now I tried with quotes ("") or //OR as well and when doing so..Solr
> > doesn't give exception but it also doesn't return any matching document.
> >
> > Kindly let me know what is the workaround for this issue?
> >
> > Thanks
> > 
> >
> > This message is confidential and may contain privileged information. You
> > should not disclose its contents to any other person. If you are not the
> > intended recipient, please notify the sender named above immediately. It
> is
> > expressly declared that this e-mail does not constitute nor form part of
> a
> > contract or unilateral obligation. Opinions, conclusions and other
> > information in this message that do not relate to the official business
> of
> > findmypast shall be understood as neither given nor endorsed by it.
> > 
> >
> > 
> __
> >
> > This email has been checked for virus and other malicious content prior
> to
> > leaving our network.
> > 
> __
> >
>


RE: Help Needed - Indexing Related

2018-03-27 Thread YELESWARAPU, VENKATA BHAN
Information Classification: ** Limited Access

Team,

Fyi..Deleting the indexing job queue table resolved the issue and it generated 
the index files.

Thank you,
Dutt

_
From: YELESWARAPU, VENKATA BHAN
Sent: Tuesday, March 27, 2018 8:08 AM
To: 'solr-user@lucene.apache.org'
Subject: Help Needed - Indexing Related


Information Classification: ** Limited Access

Hi Solr Team,

Hope you are doing well. I have been struggling with indexing for a week now.
Yesterday I deleted all indexing files and tried re-indexing. It failed saying 
unable to open a new searcher. Also that _0.si file is missing.
Today I redeployed the whole application and tried indexing. Now facing the 
below issues.
If you could guide me on this or if there is any documentation around this, 
that would greatly help. Appreciate your time on this.

2018-03-27 07:53:59,896 DEBUG (DefaultdatabaseFunctions.java:319) - lock 
[SolrIndexingJobReadFromQueue] acquired
2018-03-27 07:53:59,924 DEBUG (SolrIndexingJob.java:193) - done sleeping
2018-03-27 07:53:59,929 DEBUG (DefaultdatabaseFunctions.java:313) - lock 
[SolrIndexingJobReadFromQueue] already exists, will try updating it now
2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
Alerts.CWI_096850 not fetched because its identifier appears to be already 
in processing
2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
Alerts.CWI_096850 not fetched because its identifier appears to be already 
in processing
2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
Alerts.CWI_096850 not fetched because its identifier appears to be already 
in processing
2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
Alerts.CWI_096854 not fetched because its identifier appears to be already 
in processing

2018-03-27 07:54:31,128 WARN  (SolrIndexingJob.java:107) - Solr indexing job 
failed
java.lang.IndexOutOfBoundsException: Index: 16, Size: 10
at java.util.ArrayList.rangeCheck(ArrayList.java:635)
at java.util.ArrayList.set(ArrayList.java:426)
at com.actimize.dao.DaoUtil.orderList(DaoUtil.java:215)
at 
com.actimize.dao.AlertDaoImpl.findAlertsByIdentifierForIndexing(AlertDaoImpl.java:2347)
at sun.reflect.GeneratedMethodAccessor2119.invoke(Unknown Source)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at 
com.actimize.infrastructure.perfmon.PerformanceMonitorInterceptor.invokeUnderTrace(PerformanceMonitorInterceptor.java:57)
at 
org.springframework.aop.interceptor.AbstractTraceInterceptor.invoke(AbstractTraceInterceptor.java:111)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at 
org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at com.sun.proxy.$Proxy39.findAlertsByIdentifierForIndexing(Unknown 
Source)
at 
com.actimize.services.AlertsServiceImpl.findAlertsByIdentifierForIndexing(AlertsServiceImpl.java:5568)
at sun.reflect.GeneratedMethodAccessor2118.invoke(Unknown Source)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at 
org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:96)
at 
org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:260)
at 
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:94)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at 
com.actimize.infrastructure.util.DeadLockLockingInterceptor.invoke(DeadLockLockingInterceptor.java:40)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at 

Re: query regarding Solr partial search

2018-03-27 Thread Erik Hatcher
This is as much about your schema as it is about your query parser usage.   
What’s parsed_query say in your debug=true output?   What query parser are you 
using?   If edismax, check qf/pf/mm settings, etc.
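
For example, a request along these lines shows the parsed query and requires
all of the words regardless of order (the collection name and the qf field are
placeholders for whatever your keyword field is called):

curl 'http://localhost:8983/solr/mycollection/select' \
  --data-urlencode 'q=Autograph full score' \
  --data-urlencode 'defType=edismax' \
  --data-urlencode 'qf=keyword_field' \
  --data-urlencode 'mm=100%' \
  --data-urlencode 'debug=true'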

Erik


> On Mar 27, 2018, at 9:56 AM, Paul, Lulu  wrote:
> 
> Hi ,
> 
> Below is my SOLR configuration (schema.xml) for a keyword search field.
> 
> [The schema.xml snippet was mangled by the mail archive. What survives shows
> a stored="false", multiValued="true" field copied into a text field type
> with positionIncrementGap="100", whose analyzers include a StopFilterFactory
> (words="stopwords.txt"), a synonym filter (ignoreCase="true" expand="true")
> and a word-delimiter filter with preserveOriginal="true".]
> 
> · If I search for “Autograph full score”, Solr returns all items that
> contain this string in exactly the same order.
> 
> · If I search for “full Autograph score”, Solr doesn’t return any
> results.
> 
> The requirement is that, regardless of word order, Solr should return all
> records which “CONTAIN” these 3 words. Please advise how this can be made
> possible?
> 
> Thanks & Regards,
> Lulu
> 
> 
> 
> **
> Experience the British Library online at www.bl.uk
> The British Library’s latest Annual Report and Accounts : 
> www.bl.uk/aboutus/annrep/index.html
> Help the British Library conserve the world's knowledge. Adopt a Book. 
> www.bl.uk/adoptabook
> The Library's St Pancras site is WiFi - enabled
> *
> The information contained in this e-mail is confidential and may be legally 
> privileged. It is intended for the addressee(s) only. If you are not the 
> intended recipient, please delete this e-mail and notify the 
> postmas...@bl.uk : The contents of this e-mail must 
> not be disclosed or copied without the sender's consent.
> The statements and opinions expressed in this message are those of the author 
> and do not necessarily reflect those of the British Library. The British 
> Library does not take any responsibility for the views of the author.
> *
> Think before you print



Re: How to escape OR or any other keyword in solr

2018-03-27 Thread Erick Erickson
Also, a rather obscure bit is that unless you configure it, operators
must be upper case.
You can simply lower-case it on the client side if necessary, so "or"
won't be interpreted
as an operator.
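
For example (assuming a field actually named "state"), sending the term
lower-cased keeps the default parser from treating it as the boolean operator:

# lower-cased, "or" is just a term in the "state" field; a bare upper-case OR
# would be parsed as the boolean operator
curl 'http://localhost:8983/solr/mycollection/select' --data-urlencode 'q=state:or'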

Best,
Erick

On Tue, Mar 27, 2018 at 1:22 AM, RAUNAK AGRAWAL
 wrote:
> Hi Peter,
>
> Yes, I am using the stopword file which has *or *in it. Thanks for pointing
> out. Will remove it from the stopword file and test it again. Thank you
> very much!!
>
> On Tue, Mar 27, 2018 at 1:17 PM, Peter Lancaster <
> peter.lancas...@findmypast.com> wrote:
>
>> Hi Raunak,
>>
>> Are you using a stop word file? That might be why you're getting 0 results
>> searching for "OR".
>>
>> Cheers,
>> Peter.
>>
>> -Original Message-
>> From: RAUNAK AGRAWAL [mailto:agrawal.rau...@gmail.com]
>> Sent: 27 March 2018 07:45
>> To: solr-user@lucene.apache.org
>> Subject: How to escape OR or any other keyword in solr
>>
>> I have to search for state "OR" [short form for Oregon]. When I am making
>> query state:OR, I am getting SolrException since it is recognising it as
>> keyword.
>>
>> Now I tried with quotes ("") or //OR as well and when doing so..Solr
>> doesn't give exception but it also doesn't return any matching document.
>>
>> Kindly let me know what is the workaround for this issue?
>>
>> Thanks
>> 
>>
>> This message is confidential and may contain privileged information. You
>> should not disclose its contents to any other person. If you are not the
>> intended recipient, please notify the sender named above immediately. It is
>> expressly declared that this e-mail does not constitute nor form part of a
>> contract or unilateral obligation. Opinions, conclusions and other
>> information in this message that do not relate to the official business of
>> findmypast shall be understood as neither given nor endorsed by it.
>> 
>>
>> __
>>
>> This email has been checked for virus and other malicious content prior to
>> leaving our network.
>> __
>>


Wrong ngroups value result ?

2018-03-27 Thread Bruno Mannina
Dear Solr User,



I have several collections on a Solr 5.4 server (Ubuntu).

Each collection has the same unique key "id".

The collections sometimes have records in common.



Actually, I would like to use the grouping option, but the ngroups result is
wrong.

Solr finds the right number of items, but for ngroups it returns the sum of
the per-collection group counts.
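
In case it helps, the kind of query I mean is roughly this (host and
collection names are only illustrative, and the shards parameter stands in for
however the query is being distributed across the collections):

curl 'http://localhost:8983/solr/collection1/select' \
  --data-urlencode 'q=*:*' \
  --data-urlencode 'shards=localhost:8983/solr/collection1,localhost:8983/solr/collection2' \
  --data-urlencode 'group=true' \
  --data-urlencode 'group.field=id' \
  --data-urlencode 'group.ngroups=true'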



I know this is a known problem, but does a solution to it exist?



I have been looking for one for several hours without success.



Thanks for your help,

Bruno



---
This email has been checked for viruses by Avast antivirus software.
https://www.avast.com/antivirus


New Subscribe

2018-03-27 Thread Raymond Xie
Sincerely yours,

Raymond


query regarding Solr partial search

2018-03-27 Thread Paul, Lulu
Hi ,

Below is my SOLR configuration (schema.xml) for a keyword search field.

[The schema.xml snippet was stripped by the mail archive; the copy quoted in
Erik Hatcher's reply earlier in this digest preserves a few fragments of the
analyzer chain.]

· If I search for “Autograph full score”, Solr returns all items that
contain this string in exactly the same order.

· If I search for “full Autograph score”, Solr doesn’t return any
results.

The requirement is that, regardless of word order, Solr should return all
records which “CONTAIN” these 3 words. Please advise how this can be made
possible?

Thanks & Regards,
Lulu



**
Experience the British Library online at www.bl.uk
The British Library’s latest Annual Report and Accounts : 
www.bl.uk/aboutus/annrep/index.html
Help the British Library conserve the world's knowledge. Adopt a Book. 
www.bl.uk/adoptabook
The Library's St Pancras site is WiFi - enabled
*
The information contained in this e-mail is confidential and may be legally 
privileged. It is intended for the addressee(s) only. If you are not the 
intended recipient, please delete this e-mail and notify the 
postmas...@bl.uk : The contents of this e-mail must 
not be disclosed or copied without the sender's consent.
The statements and opinions expressed in this message are those of the author 
and do not necessarily reflect those of the British Library. The British 
Library does not take any responsibility for the views of the author.
*
Think before you print


RE: Help Needed - Indexing Related

2018-03-27 Thread YELESWARAPU, VENKATA BHAN
Information Classification: ll Limited Access

Found those values.
solr.autoSoftCommit.maxTime:2000
solr.autoCommit.maxTime:12

Thank you,
Dutt

-Original Message-
From: Sujay Bawaskar [mailto:sujaybawas...@gmail.com] 
Sent: Tuesday, March 27, 2018 8:54 AM
To: solr-user@lucene.apache.org
Subject: Re: Help Needed - Indexing Related

Since this is a scheduled job, I think you can get rid of the commit and
optimize calls that are invoked from the scheduled job.

On Tue, Mar 27, 2018 at 6:13 PM, YELESWARAPU, VENKATA BHAN < 
vyeleswar...@statestreet.com> wrote:

> Information Classification: ll Limited Access
>
> Thanks for your response Sujay.
> Solr Version - 4.3.1
> Yes, we are using client api to generate index files.
> I don't see those parameters configured outside or in the logs, but 
> indexing job is scheduled, which I think will take care of these.
> We have the option to schedule it to run in min intervals.
>
> Thank you,
> Dutt
>
>
> -Original Message-
> From: Sujay Bawaskar [mailto:sujaybawas...@gmail.com]
> Sent: Tuesday, March 27, 2018 8:32 AM
> To: solr-user@lucene.apache.org
> Subject: Re: Help Needed - Indexing Related
>
> Few questions here,
>
> Are you using solrj client from java application?
> What is version of solr?
> How frequently commit and optimize operation is called from solr client?
> If commit and optimize are not called from client what is value for 
> solr.autoCommit.maxTime and solr.autoSoftCommit.maxTime?
> What is current TPS and expected TPS?
>
> On Tue, Mar 27, 2018 at 5:38 PM, YELESWARAPU, VENKATA BHAN < 
> vyeleswar...@statestreet.com> wrote:
>
> > Information Classification: ** Limited Access
> >
> > Hi Solr Team,
> >
> > Hope you are doing well. I have been struggling with indexing for a 
> > week now.
> > Yesterday I deleted all indexing files and tried re-indexing. It 
> > failed saying unable to open a new searcher. Also that _0.si file is
> missing.
> > Today I redeployed the whole application and tried indexing. Now 
> > facing the below issues.
> > If you could guide me on this or if there is any documentation 
> > around this, that would greatly help. Appreciate your time on this.
> >
> > 2018-03-27 07:53:59,896 DEBUG (DefaultdatabaseFunctions.java:319) - 
> > lock [SolrIndexingJobReadFromQueue] acquired
> > 2018-03-27 07:53:59,924 DEBUG (SolrIndexingJob.java:193) - done 
> > sleeping
> > 2018-03-27 07:53:59,929 DEBUG (DefaultdatabaseFunctions.java:313) - 
> > lock [SolrIndexingJobReadFromQueue] already exists, will try 
> > updating it now
> > 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) 
> > - Object Alerts.CWI_096850 not fetched because its identifier 
> > appears to be already in processing
> > 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) 
> > - Object Alerts.CWI_096850 not fetched because its identifier 
> > appears to be already in processing
> > 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) 
> > - Object Alerts.CWI_096850 not fetched because its identifier 
> > appears to be already in processing
> > 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) 
> > - Object Alerts.CWI_096854 not fetched because its identifier 
> > appears to be already in processing
> >
> > 2018-03-27 07:54:31,128 WARN  (SolrIndexingJob.java:107) - Solr 
> > indexing job failed
> > java.lang.IndexOutOfBoundsException: Index: 16, Size: 10
> > at java.util.ArrayList.rangeCheck(ArrayList.java:635)
> > at java.util.ArrayList.set(ArrayList.java:426)
> > at com.actimize.dao.DaoUtil.orderList(DaoUtil.java:215)
> > at
> > com.actimize.dao.AlertDaoImpl.findAlertsByIdentifierForIndex
> > ing(AlertDaoImpl.java:2347)
> > at sun.reflect.GeneratedMethodAccessor2119.invoke(Unknown
> Source)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> > DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:606)
> > at org.springframework.aop.support.AopUtils.
> > invokeJoinpointUsingReflection(AopUtils.java:317)
> > at org.springframework.aop.framework.ReflectiveMethodInvocation.
> > invokeJoinpoint(ReflectiveMethodInvocation.java:183)
> > at org.springframework.aop.framework.ReflectiveMethodInvocation.
> > proceed(ReflectiveMethodInvocation.java:150)
> > at com.actimize.infrastructure.perfmon.
> > PerformanceMonitorInterceptor.invokeUnderTrace(
> > PerformanceMonitorInterceptor.java:57)
> > at org.springframework.aop.interceptor.AbstractTraceInterceptor.
> > invoke(AbstractTraceInterceptor.java:111)
> > at org.springframework.aop.framework.ReflectiveMethodInvocation.
> > proceed(ReflectiveMethodInvocation.java:172)
> > at org.springframework.aop.framework.JdkDynamicAopProxy.
> > invoke(JdkDynamicAopProxy.java:204)
> > at
> > com.sun.proxy.$Proxy39.findAlertsByIdentifierForIndexing(Unknown
> > Source)
> > at 

Re: Help Needed - Indexing Related

2018-03-27 Thread Sujay Bawaskar
Since this is a scheduled job, I think you can get rid of the commit and
optimize calls that are invoked from the scheduled job.

On Tue, Mar 27, 2018 at 6:13 PM, YELESWARAPU, VENKATA BHAN <
vyeleswar...@statestreet.com> wrote:

> Information Classification: ll Limited Access
>
> Thanks for your response Sujay.
> Solr Version - 4.3.1
> Yes, we are using client api to generate index files.
> I don't see those parameters configured outside or in the logs, but
> indexing job is scheduled, which I think will take care of these.
> We have the option to schedule it to run in min intervals.
>
> Thank you,
> Dutt
>
>
> -Original Message-
> From: Sujay Bawaskar [mailto:sujaybawas...@gmail.com]
> Sent: Tuesday, March 27, 2018 8:32 AM
> To: solr-user@lucene.apache.org
> Subject: Re: Help Needed - Indexing Related
>
> Few questions here,
>
> Are you using solrj client from java application?
> What is version of solr?
> How frequently commit and optimize operation is called from solr client?
> If commit and optimize are not called from client what is value for
> solr.autoCommit.maxTime and solr.autoSoftCommit.maxTime?
> What is current TPS and expected TPS?
>
> On Tue, Mar 27, 2018 at 5:38 PM, YELESWARAPU, VENKATA BHAN <
> vyeleswar...@statestreet.com> wrote:
>
> > Information Classification: ** Limited Access
> >
> > Hi Solr Team,
> >
> > Hope you are doing well. I have been struggling with indexing for a
> > week now.
> > Yesterday I deleted all indexing files and tried re-indexing. It
> > failed saying unable to open a new searcher. Also that _0.si file is
> missing.
> > Today I redeployed the whole application and tried indexing. Now
> > facing the below issues.
> > If you could guide me on this or if there is any documentation around
> > this, that would greatly help. Appreciate your time on this.
> >
> > 2018-03-27 07:53:59,896 DEBUG (DefaultdatabaseFunctions.java:319) -
> > lock [SolrIndexingJobReadFromQueue] acquired
> > 2018-03-27 07:53:59,924 DEBUG (SolrIndexingJob.java:193) - done
> > sleeping
> > 2018-03-27 07:53:59,929 DEBUG (DefaultdatabaseFunctions.java:313) -
> > lock [SolrIndexingJobReadFromQueue] already exists, will try updating
> > it now
> > 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) -
> > Object Alerts.CWI_096850 not fetched because its identifier
> > appears to be already in processing
> > 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) -
> > Object Alerts.CWI_096850 not fetched because its identifier
> > appears to be already in processing
> > 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) -
> > Object Alerts.CWI_096850 not fetched because its identifier
> > appears to be already in processing
> > 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) -
> > Object Alerts.CWI_096854 not fetched because its identifier
> > appears to be already in processing
> >
> > 2018-03-27 07:54:31,128 WARN  (SolrIndexingJob.java:107) - Solr
> > indexing job failed
> > java.lang.IndexOutOfBoundsException: Index: 16, Size: 10
> > at java.util.ArrayList.rangeCheck(ArrayList.java:635)
> > at java.util.ArrayList.set(ArrayList.java:426)
> > at com.actimize.dao.DaoUtil.orderList(DaoUtil.java:215)
> > at
> > com.actimize.dao.AlertDaoImpl.findAlertsByIdentifierForIndex
> > ing(AlertDaoImpl.java:2347)
> > at sun.reflect.GeneratedMethodAccessor2119.invoke(Unknown
> Source)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> > DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:606)
> > at org.springframework.aop.support.AopUtils.
> > invokeJoinpointUsingReflection(AopUtils.java:317)
> > at org.springframework.aop.framework.ReflectiveMethodInvocation.
> > invokeJoinpoint(ReflectiveMethodInvocation.java:183)
> > at org.springframework.aop.framework.ReflectiveMethodInvocation.
> > proceed(ReflectiveMethodInvocation.java:150)
> > at com.actimize.infrastructure.perfmon.
> > PerformanceMonitorInterceptor.invokeUnderTrace(
> > PerformanceMonitorInterceptor.java:57)
> > at org.springframework.aop.interceptor.AbstractTraceInterceptor.
> > invoke(AbstractTraceInterceptor.java:111)
> > at org.springframework.aop.framework.ReflectiveMethodInvocation.
> > proceed(ReflectiveMethodInvocation.java:172)
> > at org.springframework.aop.framework.JdkDynamicAopProxy.
> > invoke(JdkDynamicAopProxy.java:204)
> > at
> > com.sun.proxy.$Proxy39.findAlertsByIdentifierForIndexing(Unknown
> > Source)
> > at com.actimize.services.AlertsServiceImpl.
> > findAlertsByIdentifierForIndexing(AlertsServiceImpl.java:5568)
> > at sun.reflect.GeneratedMethodAccessor2118.invoke(Unknown
> Source)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> > DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:606)
> > at 

RE: Help Needed - Indexing Related

2018-03-27 Thread YELESWARAPU, VENKATA BHAN
Information Classification: ll Limited Access

Thanks for your response Sujay.
Solr Version - 4.3.1
Yes, we are using client api to generate index files.
I don't see those parameters configured outside or in the logs, but indexing 
job is scheduled, which I think will take care of these. 
We have the option to schedule it to run in min intervals.

Thank you,
Dutt


-Original Message-
From: Sujay Bawaskar [mailto:sujaybawas...@gmail.com] 
Sent: Tuesday, March 27, 2018 8:32 AM
To: solr-user@lucene.apache.org
Subject: Re: Help Needed - Indexing Related

Few questions here,

Are you using solrj client from java application?
What is version of solr?
How frequently commit and optimize operation is called from solr client?
If commit and optimize are not called from client what is value for 
solr.autoCommit.maxTime and solr.autoSoftCommit.maxTime?
What is current TPS and expected TPS?

On Tue, Mar 27, 2018 at 5:38 PM, YELESWARAPU, VENKATA BHAN < 
vyeleswar...@statestreet.com> wrote:

> Information Classification: ** Limited Access
>
> Hi Solr Team,
>
> Hope you are doing well. I have been struggling with indexing for a 
> week now.
> Yesterday I deleted all indexing files and tried re-indexing. It 
> failed saying unable to open a new searcher. Also that _0.si file is missing.
> Today I redeployed the whole application and tried indexing. Now 
> facing the below issues.
> If you could guide me on this or if there is any documentation around 
> this, that would greatly help. Appreciate your time on this.
>
> 2018-03-27 07:53:59,896 DEBUG (DefaultdatabaseFunctions.java:319) - 
> lock [SolrIndexingJobReadFromQueue] acquired
> 2018-03-27 07:53:59,924 DEBUG (SolrIndexingJob.java:193) - done 
> sleeping
> 2018-03-27 07:53:59,929 DEBUG (DefaultdatabaseFunctions.java:313) - 
> lock [SolrIndexingJobReadFromQueue] already exists, will try updating 
> it now
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - 
> Object Alerts.CWI_096850 not fetched because its identifier 
> appears to be already in processing
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - 
> Object Alerts.CWI_096850 not fetched because its identifier 
> appears to be already in processing
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - 
> Object Alerts.CWI_096850 not fetched because its identifier 
> appears to be already in processing
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - 
> Object Alerts.CWI_096854 not fetched because its identifier 
> appears to be already in processing
>
> 2018-03-27 07:54:31,128 WARN  (SolrIndexingJob.java:107) - Solr 
> indexing job failed
> java.lang.IndexOutOfBoundsException: Index: 16, Size: 10
> at java.util.ArrayList.rangeCheck(ArrayList.java:635)
> at java.util.ArrayList.set(ArrayList.java:426)
> at com.actimize.dao.DaoUtil.orderList(DaoUtil.java:215)
> at 
> com.actimize.dao.AlertDaoImpl.findAlertsByIdentifierForIndex
> ing(AlertDaoImpl.java:2347)
> at sun.reflect.GeneratedMethodAccessor2119.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.springframework.aop.support.AopUtils.
> invokeJoinpointUsingReflection(AopUtils.java:317)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> invokeJoinpoint(ReflectiveMethodInvocation.java:183)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> proceed(ReflectiveMethodInvocation.java:150)
> at com.actimize.infrastructure.perfmon.
> PerformanceMonitorInterceptor.invokeUnderTrace(
> PerformanceMonitorInterceptor.java:57)
> at org.springframework.aop.interceptor.AbstractTraceInterceptor.
> invoke(AbstractTraceInterceptor.java:111)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> proceed(ReflectiveMethodInvocation.java:172)
> at org.springframework.aop.framework.JdkDynamicAopProxy.
> invoke(JdkDynamicAopProxy.java:204)
> at 
> com.sun.proxy.$Proxy39.findAlertsByIdentifierForIndexing(Unknown
> Source)
> at com.actimize.services.AlertsServiceImpl.
> findAlertsByIdentifierForIndexing(AlertsServiceImpl.java:5568)
> at sun.reflect.GeneratedMethodAccessor2118.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.springframework.aop.support.AopUtils.
> invokeJoinpointUsingReflection(AopUtils.java:317)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> invokeJoinpoint(ReflectiveMethodInvocation.java:183)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> proceed(ReflectiveMethodInvocation.java:150)
> at org.springframework.transaction.interceptor.
> 

Re: Help Needed - Indexing Related

2018-03-27 Thread Sujay Bawaskar
Few questions here,

Are you using solrj client from java application?
What is version of solr?
How frequently commit and optimize operation is called from solr client?
If commit and optimize are not called from client what is value
for solr.autoCommit.maxTime and solr.autoSoftCommit.maxTime?
What is current TPS and expected TPS?

On Tue, Mar 27, 2018 at 5:38 PM, YELESWARAPU, VENKATA BHAN <
vyeleswar...@statestreet.com> wrote:

> Information Classification: ** Limited Access
>
> Hi Solr Team,
>
> Hope you are doing well. I have been struggling with indexing for a week
> now.
> Yesterday I deleted all indexing files and tried re-indexing. It failed
> saying unable to open a new searcher. Also that _0.si file is missing.
> Today I redeployed the whole application and tried indexing. Now facing
> the below issues.
> If you could guide me on this or if there is any documentation around
> this, that would greatly help. Appreciate your time on this.
>
> 2018-03-27 07:53:59,896 DEBUG (DefaultdatabaseFunctions.java:319) - lock
> [SolrIndexingJobReadFromQueue] acquired
> 2018-03-27 07:53:59,924 DEBUG (SolrIndexingJob.java:193) - done sleeping
> 2018-03-27 07:53:59,929 DEBUG (DefaultdatabaseFunctions.java:313) - lock
> [SolrIndexingJobReadFromQueue] already exists, will try updating it now
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) -
> Object Alerts.CWI_096850 not fetched because its identifier appears to
> be already in processing
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) -
> Object Alerts.CWI_096850 not fetched because its identifier appears to
> be already in processing
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) -
> Object Alerts.CWI_096850 not fetched because its identifier appears to
> be already in processing
> 2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) -
> Object Alerts.CWI_096854 not fetched because its identifier appears to
> be already in processing
>
> 2018-03-27 07:54:31,128 WARN  (SolrIndexingJob.java:107) - Solr indexing
> job failed
> java.lang.IndexOutOfBoundsException: Index: 16, Size: 10
> at java.util.ArrayList.rangeCheck(ArrayList.java:635)
> at java.util.ArrayList.set(ArrayList.java:426)
> at com.actimize.dao.DaoUtil.orderList(DaoUtil.java:215)
> at com.actimize.dao.AlertDaoImpl.findAlertsByIdentifierForIndex
> ing(AlertDaoImpl.java:2347)
> at sun.reflect.GeneratedMethodAccessor2119.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.springframework.aop.support.AopUtils.
> invokeJoinpointUsingReflection(AopUtils.java:317)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> invokeJoinpoint(ReflectiveMethodInvocation.java:183)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> proceed(ReflectiveMethodInvocation.java:150)
> at com.actimize.infrastructure.perfmon.
> PerformanceMonitorInterceptor.invokeUnderTrace(
> PerformanceMonitorInterceptor.java:57)
> at org.springframework.aop.interceptor.AbstractTraceInterceptor.
> invoke(AbstractTraceInterceptor.java:111)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> proceed(ReflectiveMethodInvocation.java:172)
> at org.springframework.aop.framework.JdkDynamicAopProxy.
> invoke(JdkDynamicAopProxy.java:204)
> at com.sun.proxy.$Proxy39.findAlertsByIdentifierForIndexing(Unknown
> Source)
> at com.actimize.services.AlertsServiceImpl.
> findAlertsByIdentifierForIndexing(AlertsServiceImpl.java:5568)
> at sun.reflect.GeneratedMethodAccessor2118.invoke(Unknown Source)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(
> DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.springframework.aop.support.AopUtils.
> invokeJoinpointUsingReflection(AopUtils.java:317)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> invokeJoinpoint(ReflectiveMethodInvocation.java:183)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> proceed(ReflectiveMethodInvocation.java:150)
> at org.springframework.transaction.interceptor.
> TransactionInterceptor$1.proceedWithInvocation(
> TransactionInterceptor.java:96)
> at org.springframework.transaction.interceptor.
> TransactionAspectSupport.invokeWithinTransaction(
> TransactionAspectSupport.java:260)
> at org.springframework.transaction.interceptor.
> TransactionInterceptor.invoke(TransactionInterceptor.java:94)
> at org.springframework.aop.framework.ReflectiveMethodInvocation.
> proceed(ReflectiveMethodInvocation.java:172)
> at com.actimize.infrastructure.util.DeadLockLockingInterceptor.
> invoke(DeadLockLockingInterceptor.java:40)
> 

Help Needed - Indexing Related

2018-03-27 Thread YELESWARAPU, VENKATA BHAN
Information Classification: ** Limited Access

Hi Solr Team,

Hope you are doing well. I have been struggling with indexing for a week now.
Yesterday I deleted all indexing files and tried re-indexing. It failed saying 
unable to open a new searcher. Also that _0.si file is missing.
Today I redeployed the whole application and tried indexing. Now facing the 
below issues.
If you could guide me on this or if there is any documentation around this, 
that would greatly help. Appreciate your time on this.

2018-03-27 07:53:59,896 DEBUG (DefaultdatabaseFunctions.java:319) - lock 
[SolrIndexingJobReadFromQueue] acquired
2018-03-27 07:53:59,924 DEBUG (SolrIndexingJob.java:193) - done sleeping
2018-03-27 07:53:59,929 DEBUG (DefaultdatabaseFunctions.java:313) - lock 
[SolrIndexingJobReadFromQueue] already exists, will try updating it now
2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
Alerts.CWI_096850 not fetched because its identifier appears to be already 
in processing
2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
Alerts.CWI_096850 not fetched because its identifier appears to be already 
in processing
2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
Alerts.CWI_096850 not fetched because its identifier appears to be already 
in processing
2018-03-27 07:53:59,971 DEBUG (SolrIndexingQueueServiceImpl.java:54) - Object 
Alerts.CWI_096854 not fetched because its identifier appears to be already 
in processing

2018-03-27 07:54:31,128 WARN  (SolrIndexingJob.java:107) - Solr indexing job 
failed
java.lang.IndexOutOfBoundsException: Index: 16, Size: 10
at java.util.ArrayList.rangeCheck(ArrayList.java:635)
at java.util.ArrayList.set(ArrayList.java:426)
at com.actimize.dao.DaoUtil.orderList(DaoUtil.java:215)
at 
com.actimize.dao.AlertDaoImpl.findAlertsByIdentifierForIndexing(AlertDaoImpl.java:2347)
at sun.reflect.GeneratedMethodAccessor2119.invoke(Unknown Source)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at 
com.actimize.infrastructure.perfmon.PerformanceMonitorInterceptor.invokeUnderTrace(PerformanceMonitorInterceptor.java:57)
at 
org.springframework.aop.interceptor.AbstractTraceInterceptor.invoke(AbstractTraceInterceptor.java:111)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at 
org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:204)
at com.sun.proxy.$Proxy39.findAlertsByIdentifierForIndexing(Unknown 
Source)
at 
com.actimize.services.AlertsServiceImpl.findAlertsByIdentifierForIndexing(AlertsServiceImpl.java:5568)
at sun.reflect.GeneratedMethodAccessor2118.invoke(Unknown Source)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at 
org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:96)
at 
org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:260)
at 
org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:94)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at 
com.actimize.infrastructure.util.DeadLockLockingInterceptor.invoke(DeadLockLockingInterceptor.java:40)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at 
com.actimize.infrastructure.util.OptimisticLockingInterceptor.invoke(OptimisticLockingInterceptor.java:48)
at 
org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at 
com.actimize.infrastructure.perfmon.PerformanceMonitorInterceptor.invokeUnderTrace(PerformanceMonitorInterceptor.java:57)
at 

Re: How to escape OR or any other keyword in solr

2018-03-27 Thread RAUNAK AGRAWAL
Hi Peter,

Yes, I am using the stopword file which has *or* in it. Thanks for pointing
out. Will remove it from the stopword file and test it again. Thank you
very much!!

On Tue, Mar 27, 2018 at 1:17 PM, Peter Lancaster <
peter.lancas...@findmypast.com> wrote:

> Hi Raunak,
>
> Are you using a stop word file? That might be why you're getting 0 results
> searching for "OR".
>
> Cheers,
> Peter.
>
> -Original Message-
> From: RAUNAK AGRAWAL [mailto:agrawal.rau...@gmail.com]
> Sent: 27 March 2018 07:45
> To: solr-user@lucene.apache.org
> Subject: How to escape OR or any other keyword in solr
>
> I have to search for state "OR" [short form for Oregon]. When I am making
> query state:OR, I am getting SolrException since it is recognising it as
> keyword.
>
> Now I tried with quotes ("") or //OR as well and when doing so..Solr
> doesn't give exception but it also doesn't return any matching document.
>
> Kindly let me know what is the workaround for this issue?
>
> Thanks
> 
>
> This message is confidential and may contain privileged information. You
> should not disclose its contents to any other person. If you are not the
> intended recipient, please notify the sender named above immediately. It is
> expressly declared that this e-mail does not constitute nor form part of a
> contract or unilateral obligation. Opinions, conclusions and other
> information in this message that do not relate to the official business of
> findmypast shall be understood as neither given nor endorsed by it.
> 
>
> __
>
> This email has been checked for virus and other malicious content prior to
> leaving our network.
> __
>


RE: How to escape OR or any other keyword in solr

2018-03-27 Thread Peter Lancaster
Hi Raunak,

Are you using a stop word file? That might be why you're getting 0 results 
searching for "OR".

Cheers,
Peter.

-Original Message-
From: RAUNAK AGRAWAL [mailto:agrawal.rau...@gmail.com]
Sent: 27 March 2018 07:45
To: solr-user@lucene.apache.org
Subject: How to escape OR or any other keyword in solr

I have to search for state "OR" [short form for Oregon]. When I am making query 
state:OR, I am getting SolrException since it is recognising it as keyword.

Now I tried with quotes ("") or //OR as well and when doing so..Solr doesn't 
give exception but it also doesn't return any matching document.

Kindly let me know what is the workaround for this issue?

Thanks


This message is confidential and may contain privileged information. You should 
not disclose its contents to any other person. If you are not the intended 
recipient, please notify the sender named above immediately. It is expressly 
declared that this e-mail does not constitute nor form part of a contract or 
unilateral obligation. Opinions, conclusions and other information in this 
message that do not relate to the official business of findmypast shall be 
understood as neither given nor endorsed by it.


__

This email has been checked for virus and other malicious content prior to 
leaving our network.
__


How to escape OR or any other keyword in solr

2018-03-27 Thread RAUNAK AGRAWAL
I have to search for state "OR" [short form for Oregon]. When I am making
query state:OR, I am getting SolrException since it is recognising it as
keyword.

Now I tried with quotes ("") or //OR as well and when doing so..Solr
doesn't give exception but it also doesn't return any matching document.

Kindly let me know what is the workaround for this issue?

Thanks


Re: edit gc parameters in solr.in.sh or solr?

2018-03-27 Thread Bernd Fehling
Hi Walter,

may I advise you _NOT_ to set -XX:G1HeapRegionSize.
That is computed at JVM start according to the heap size and available
memory.
A wrongly chosen size can force even a huge machine with a 31GB heap and
157GB of RAM into OOM.
Guess how I figured that out; it took me about one week to locate it.

Regards
Bernd

Am 26.03.2018 um 17:08 schrieb Walter Underwood:
> We use the G1 collector in Java 8u131 and it works well. We are running 
> 6.6.2. Our Solr instances do a LOT of allocation. We have long queries (25 
> terms average) and many unique queries.
> 
> SOLR_HEAP=8g
> # Use G1 GC  -- wunder 2017-01-23
> # Settings from https://wiki.apache.org/solr/ShawnHeisey
> GC_TUNE=" \
> -XX:+UseG1GC \
> -XX:+ParallelRefProcEnabled \
> -XX:G1HeapRegionSize=8m \
> -XX:MaxGCPauseMillis=200 \
> -XX:+UseLargePages \
> -XX:+AggressiveOpts \
> "
> 
> wunder
> Walter Underwood
> wun...@wunderwood.org
> http://observer.wunderwood.org/  (my blog)
> 
>> On Mar 26, 2018, at 1:22 AM, Derek Poh  wrote:
>>
>> Hi
>>
>> From your experience, I would like to know whether it is advisable to change
>> the gc parameters in solr.in.sh or in the solr file?
>> It is mentioned in the documentation to edit solr.in.sh, but I would like
>> to know which file you actually edit.
>>
>> I am using Solr 6.6.2 at the moment.
>>
>> Regards,
>> Derek
>>
>>