Re: A user defined request handler is failing to fetch the data.

2018-07-03 Thread Adarsh_infor
Hi Shawn,

Thanks, that helped. I modified the searchHandler as below and it started
working:

 

   *:* 
   localhost:8983/solr/FI_idx 
   /select
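
The XML tags of the handler definition were stripped by the mail archive,
leaving only the values above. A plausible reconstruction is sketched below;
the element names and the handler name "/filesearch" (taken from the rest of
this thread) are assumptions, not a verbatim copy of Adarsh's config:

```xml
<!-- Sketch: element names assumed; only the three values survive in the mail. -->
<requestHandler name="/filesearch" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="q">*:*</str>
    <str name="shards">localhost:8983/solr/FI_idx</str>
    <str name="shards.qt">/select</str>
  </lst>
</requestHandler>
```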

 


Regards
Adarsh 



--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html


Solr unable to start up after setting up SSL in Solr 7.4.0

2018-07-03 Thread Zheng Lin Edwin Yeo
Hi,

I would like to check: are there any major changes in the way SSL
works in Solr 7.4.0?

I have tried to set it up with the same method that I used for Solr 7.3.1, but
after setting it up, Solr is unable to load.

Below is the error message that I get.

Caused by: java.security.PrivilegedActionException: java.lang.ClassNotFoundException: org.apache.solr.util.configuration.SSLConfigurationsFactory
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1508)
... 7 more
Caused by: java.lang.ClassNotFoundException: org.apache.solr.util.configuration.SSLConfigurationsFactory
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.eclipse.jetty.util.Loader.loadClass(Loader.java:65)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.call(XmlConfiguration.java:784)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.configure(XmlConfiguration.java:469)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.configure(XmlConfiguration.java:410)
at org.eclipse.jetty.xml.XmlConfiguration.configure(XmlConfiguration.java:308)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1555)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1509)
... 9 more
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.jetty.start.Main.invokeMain(Main.java:220)
at org.eclipse.jetty.start.Main.start(Main.java:486)
at org.eclipse.jetty.start.Main.main(Main.java:77)
Caused by: java.security.PrivilegedActionException: java.lang.ClassNotFoundException: org.apache.solr.util.configuration.SSLConfigurationsFactory
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1508)
... 7 more
Caused by: java.lang.ClassNotFoundException: org.apache.solr.util.configuration.SSLConfigurationsFactory
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.eclipse.jetty.util.Loader.loadClass(Loader.java:65)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.call(XmlConfiguration.java:784)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.configure(XmlConfiguration.java:469)
at org.eclipse.jetty.xml.XmlConfiguration$JettyXmlConfiguration.configure(XmlConfiguration.java:410)
at org.eclipse.jetty.xml.XmlConfiguration.configure(XmlConfiguration.java:308)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1555)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1509)
... 9 more

Usage: java -jar $JETTY_HOME/start.jar [options] [properties] [configs]
   java -jar $JETTY_HOME/start.jar --help  # for more information


Regards,
Edwin


Re: Parent-child query; subqueries on child docs of the same set of fields

2018-07-03 Thread TK Solr
Thank you, Mikhail. But this didn't work. The first {!parent which='...'
v='...'} alone works, but the second {!parent ...} clause is completely ignored.
In fact, if I turn on debugQuery, rawquerystring and querystring contain the
second query, but parsedquery and parsedquery_toString only have the first
query. BTW, does the v parameter work in place of the query following
{!parsername} for any parser?



On 7/3/18 12:42 PM, Mikhail Khludnev wrote:

q={!parent which="isParent:true" v='attrname:genre AND attrvalue:drama'} AND

{!parent which="isParent:true" v='attrname:country AND attrvalue:USA'}
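
One common workaround for the second clause being ignored is to put each
{!parent} clause in its own request parameter (for example, the second one in
an fq), since each parameter is parsed independently and an fq intersects its
results with q. The sketch below only shows how such a request could be
URL-encoded; the field names come from the example documents in this thread,
and using fq here is a suggested alternative, not the approach Mikhail gave:

```python
from urllib.parse import urlencode

# Each {!parent} clause goes in its own parameter; fq intersects with q,
# so a parent must have both matching children to be returned.
params = {
    "q": "{!parent which='isParent:true' v='attrname:genre AND attrvalue:drama'}",
    "fq": "{!parent which='isParent:true' v='attrname:country AND attrvalue:USA'}",
}

# Local-params syntax ({, !, quotes, spaces) must be URL-encoded.
query_string = urlencode(params)
print(query_string)
```

Appending this string to /solr/&lt;collection&gt;/select? gives a request where
both block-join clauses are parsed, unlike combining them with AND inside q.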




Sort by payload field desc fails

2018-07-03 Thread Anurag Nilesh
Hi,
I was trying out extracting the payload value for a payload field type and
then sorting on that value.

For some reason, the sort works in ascending order but fails in descending
order.

{
  "responseHeader":{
"status":500,
"QTime":1,
"params":{
  "p":"payload(price_dpf, b)",
  "q":"*:*",
  "indent":"on",
  "fl":"${p},item",
  "sort":"${p} desc",
  "rows":"10",
  "wt":"json"}},
  "error":{
"msg":"-64",
"trace":"java.lang.ArrayIndexOutOfBoundsException: -64\n\tat
org.apache.lucene.codecs.lucene50.ForUtil.readBlock(ForUtil.java:196)\n\tat
org.apache.lucene.codecs.lucene50.Lucene50PostingsReader$EverythingEnum.refillPositions(Lucene50PostingsReader.java:1024)\n\tat
org.apache.lucene.codecs.lucene50.Lucene50PostingsReader$EverythingEnum.nextPosition(Lucene50PostingsReader.java:1226)\n\tat
org.apache.solr.search.FloatPayloadValueSource$1.floatVal(FloatPayloadValueSource.java:163)\n\tat
org.apache.lucene.queries.function.docvalues.FloatDocValues.doubleVal(FloatDocValues.java:60)\n\tat
org.apache.lucene.queries.function.ValueSource$ValueSourceComparator.compareBottom(ValueSource.java:259)\n\tat
org.apache.lucene.search.TopFieldCollector$SimpleFieldCollector$1.collect(TopFieldCollector.java:117)\n\tat
org.apache.lucene.search.MatchAllDocsQuery$1$1.score(MatchAllDocsQuery.java:56)\n\tat
org.apache.lucene.search.BulkScorer.score(BulkScorer.java:39)\n\tat
org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:668)\n\tat
org.apache.lucene.search.IndexSearcher.search(IndexSearcher.java:472)\n\tat
org.apache.solr.search.SolrIndexSearcher.buildAndRunCollectorChain(SolrIndexSearcher.java:217)\n\tat
org.apache.solr.search.SolrIndexSearcher.getDocListNC(SolrIndexSearcher.java:1582)\n\tat
org.apache.solr.search.SolrIndexSearcher.getDocListC(SolrIndexSearcher.java:1399)\n\tat
org.apache.solr.search.SolrIndexSearcher.search(SolrIndexSearcher.java:566)\n\tat
org.apache.solr.handler.component.QueryComponent.process(QueryComponent.java:545)\n\tat
org.apache.solr.handler.component.SearchHandler.handleRequestBody(SearchHandler.java:296)\n\tat
org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:173)\n\tat
org.apache.solr.core.SolrCore.execute(SolrCore.java:2477)\n\tat
org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:723)\n\tat
org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:529)\n\tat
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:361)\n\tat
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:305)\n\tat
org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1691)\n\tat
org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)\n\tat
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)\n\tat
org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)\n\tat
org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)\n\tat
org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)\n\tat
org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)\n\tat
org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)\n\tat
org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)\n\tat
org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)\n\tat
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)\n\tat
org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:119)\n\tat
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)\n\tat
org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:335)\n\tat
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)\n\tat
org.eclipse.jetty.server.Server.handle(Server.java:534)\n\tat
org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:320)\n\tat
org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)\n\tat
org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)\n\tat
org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)\n\tat
org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)\n\tat
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)\n\tat
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)\n\tat
org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)\n\tat
org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)\n\tat
org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)\n\tat
java.lang.Thread.run(Thread.java:748)\n",
"code":500}}


Do you have any pointers?

[ANNOUNCE] Apache Solr 6.6.5 released

2018-07-03 Thread Ishan Chattopadhyaya
03 July 2018, Apache Solr™ 6.6.5 available

The Lucene PMC is pleased to announce the release of Apache Solr 6.6.5

Solr is the popular, blazing fast, open source NoSQL search platform from
the Apache Lucene project. Its major features include powerful full-text
search, hit highlighting, faceted search and analytics, rich document
parsing, geospatial search, extensive REST APIs as well as parallel SQL.
Solr is enterprise grade, secure and highly scalable, providing fault
tolerant distributed search and indexing, and powers the search and
navigation features of many of the world's largest internet sites.

This release includes the following changes:

* Ability to disable configset upload via -Dconfigset.upload.enabled=false
 startup parameter
* Referral to external resources in various config files is now disallowed

The release is available for immediate download at:

http://www.apache.org/dyn/closer.lua/lucene/solr/6.6.5

Please read CHANGES.txt for a detailed list of changes:

https://lucene.apache.org/solr/6_6_5/changes/Changes.html

Please report any feedback to the mailing lists (
http://lucene.apache.org/solr/discussion.html)

Note: The Apache Software Foundation uses an extensive mirroring network
for distributing releases. It is possible that the mirror you are using may
not have replicated the release yet. If that is the case, please try
another mirror. This also goes for Maven access.


Re: [ANNOUNCE] Apache Lucene 6.6.5 released

2018-07-03 Thread Hasan Diwan
Congrats to all! -- H
On Tue, 3 Jul 2018 at 14:29, Ishan Chattopadhyaya
 wrote:
>
> 3 July 2018, Apache Lucene™ 6.6.5 available
>
> The Lucene PMC is pleased to announce the release of Apache Lucene 6.6.5.
>
> Apache Lucene is a high-performance, full-featured text search engine
> library written entirely in Java. It is a technology suitable for nearly
> any application that requires full-text search, especially cross-platform.
>
> This release contains one bug fix. The release is available for immediate
> download at:
> http://lucene.apache.org/core/mirrors-core-latest-redir.html
>
> Further details of changes are available in the change log available at:
> http://lucene.apache.org/core/6_6_5/changes/Changes.html
>
> Please report any feedback to the mailing lists (
> http://lucene.apache.org/core/discussion.html)
>
> Note: The Apache Software Foundation uses an extensive mirroring network
> for distributing releases. It is possible that the mirror you are using may
> not have replicated the release yet. If that is the case, please try
> another mirror. This also applies to Maven access.



-- 
OpenPGP: https://sks-keyservers.net/pks/lookup?op=get=0xFEBAD7FFD041BBA1
If you wish to request my time, please do so using bit.ly/hd1AppointmentRequest.

Sent from my mobile device


[ANNOUNCE] Apache Lucene 6.6.5 released

2018-07-03 Thread Ishan Chattopadhyaya
3 July 2018, Apache Lucene™ 6.6.5 available

The Lucene PMC is pleased to announce the release of Apache Lucene 6.6.5.

Apache Lucene is a high-performance, full-featured text search engine
library written entirely in Java. It is a technology suitable for nearly
any application that requires full-text search, especially cross-platform.

This release contains one bug fix. The release is available for immediate
download at:
http://lucene.apache.org/core/mirrors-core-latest-redir.html

Further details of changes are available in the change log available at:
http://lucene.apache.org/core/6_6_5/changes/Changes.html

Please report any feedback to the mailing lists (
http://lucene.apache.org/core/discussion.html)

Note: The Apache Software Foundation uses an extensive mirroring network
for distributing releases. It is possible that the mirror you are using may
not have replicated the release yet. If that is the case, please try
another mirror. This also applies to Maven access.


Re: Logging fails when starting Solr in Windows using solr.cmd

2018-07-03 Thread Erick Erickson
Shawn:

Yeah, the best I've been able to come up with so far is to have
exactly two, both in server/resources
log4j2.xml - the standard configuration for running Solr
log4j2-console.xml Used by the startup scripts (_not_ running Solr).

See the last two comments on SOLR-12008 for more detail.

Erick


On Tue, Jul 3, 2018 at 12:07 PM, Shawn Heisey  wrote:
> On 7/3/2018 2:53 AM, ja...@jafurrer.ch wrote:
>> I was intending to open an Issue in Jira when I read that I'm supposed
>> to first contact this mailing list.
>>
>> Problem description
>> ==
>>
>> System: Microsoft Windows 10 Enterprise Version 10.0.16299 Build 16299
>>
>> Steps to reproduce the problem:
>> 1) Download solr-7.4.0.tgz
>> 2) Unzip to C:\solr-7.4.0
>> 3) No changes (configuration or otherwise) whatsoever
>> 4) Open cmd.exe
>> 5) Execute the following command: cd c:\solr-7.4.0\bin
>> 6) Execute the following command: solr.cmd start -p 8983
>> 7) The following console output appears:
>>
>> c:\solr-7.4.0\bin>solr.cmd start -p 8983
>> ERROR StatusLogger Unable to access
>> file:/c:/solr-7.4.0/server/file:c:/solr-7.4.0/server/scripts/cloud-scripts/log4j2.xml
>
> I'm seeing the same behavior on Windows 7.  I started with the .zip
> download, so the fact that you have the .tgz download is likely not a
> factor.  The .zip is a better option for Windows -- it has correct line
> endings for Windows in most files, and Windows knows how to extract it
> without installing additional software.
>
> This is looking to me like a probable windows-specific bug in log4j2.  I
> have asked the log4j mailing list about it.  The solr.cmd script appears
> to be functioning correctly and not producing the strange pathname shown
> in the error, and the same parameter syntax (with the file: prefix) is
> working correctly on Linux.
>
> Erick, the config in cloud-scripts logs to stderr rather than files.
> I'm all for moving it to resources so we don't have to keep track of
> logging config files in multiple locations, but it does need to be a
> different config file specifically for command-line tools.  Perhaps
> log4j2-cli.xml as the filename?
>
> This is the first version of Solr that includes log4j2.  All previous
> releases used log4j 1.2.x.  Operating systems like Linux and MacOS get a
> lot more testing than Windows does.  It's not good that this problem
> exists.  Thank you for finding a workaround.
>
> Solr 7.4 is using the most current version of log4j that is currently
> available.
>
> If you would like to proceed, please feel free to open an issue in
> Jira.  A suggested title for the issue would be "log4j exceptions during
> startup on Windows".  Even though I think this is a bug in software
> other than Solr, it is a problem that Solr is experiencing, so we need
> to track it and make sure we fix it.  It might end up being a two-part
> fix, where we initially apply your workaround and then later revert that
> change and upgrade log4j.
>
> Thanks,
> Shawn
>


Re: Parent-child query; subqueries on child docs of the same set of fields

2018-07-03 Thread Mikhail Khludnev
Hello,
q={!parent which="isParent:true" v='attrname:genre AND attrvalue:drama'} AND

{!parent which="isParent:true" v='attrname:country AND attrvalue:USA'}

On Tue, Jul 3, 2018 at 3:35 PM TK Solr  wrote:

> I have a document with child documents like:
>
>
>  maindoc_121
>  true
>  child_121_1
>  genre
>  drama
>  
>  
>  child_121_2
>  country
>  USA
>  
> 
>
> The child documents have the same set of fields.
>
> I can write a query that has a child which has attrname=genre and
> attrvalue=drama as
>
> q={!parent which="isParent:true"} attrname:genre AND attrvalue:drama
>
>
> But if I want to add another condition that the parent must have another
> child that has certain values, what do I do?
>
> q={!parent which="isParent:true"} attrname:genre AND attrvalue:drama AND
> attrname:country AND attrvalue:USA
>
> would mean a query for a parent where one of the children must match. I
> want a parent that has two children, one matched by one sub-query and
> another matched by another sub-query.
>
> TK
>
>
>

-- 
Sincerely yours
Mikhail Khludnev


Parent-child query; subqueries on child docs of the same set of fields

2018-07-03 Thread TK Solr

I have a document with child documents like:

  
maindoc_121
true
child_121_1
genre
drama


child_121_2
country
USA



The child documents have the same set of fields.
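
The field markup above lost its tags in the mail archive. A plausible
reconstruction of the nested document is sketched below; the element names are
assumptions, while the field names (isParent, attrname, attrvalue) are taken
from the queries in this thread:

```xml
<!-- Sketch: tag names assumed; values taken from the stripped fragment above. -->
<doc>
  <field name="id">maindoc_121</field>
  <field name="isParent">true</field>
  <doc>
    <field name="id">child_121_1</field>
    <field name="attrname">genre</field>
    <field name="attrvalue">drama</field>
  </doc>
  <doc>
    <field name="id">child_121_2</field>
    <field name="attrname">country</field>
    <field name="attrvalue">USA</field>
  </doc>
</doc>
```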

I can write a query that has a child which has attrname=genre and 
attrvalue=drama as

q={!parent which="isParent:true"} attrname:genre AND attrvalue:drama


But if I want to add another condition that the parent must have another child
that has certain values, what do I do?


q={!parent which="isParent:true"} attrname:genre AND attrvalue:drama AND 
attrname:country AND attrvalue:USA


would mean a query for a parent where one of the children must match. I want a
parent that has two children, one matched by one sub-query and another matched
by another sub-query.


TK




Re: Errors when using Blob API

2018-07-03 Thread Shawn Heisey
On 7/3/2018 6:49 AM, Zahra Aminolroaya wrote:
> I want to transfer my jar files to my ".system" collection in "Solrcloud".
> One of my solr port is 
>
> My jar file name is "norm", and the following is my command for this
> transfer:
> curl -X POST -H 'Content-Type:application/octet-stream' --data-binary
> @norm.jar http://localhost:/solr/.system/blob/norm/
>
> *However, I get the following error:*

Comparing your command with what's in the documentation, I see two
differences.

Try adding a space to the header, so you have this option:

-H 'Content-Type: application/octet-stream'

Without the space, Solr may not be interpreting the input as the correct
type.

Also, you probably should remove the trailing slash from the URL:

http://localhost:/solr/.system/blob/norm

The error you get with the Lucene jar happens because the file is larger
than 2MB, which is Solr's default limit on the size of a POST request
entity.

I think the error with the slf4j jar may be essentially the same as the
error with your custom jar -- the Content-Type header might be badly
formed, so when it tries to interpret the data according to whatever
type it defaults to, it fails.  The contents of the slf4j jar are
different than the contents of your custom jar, so it has a slightly
different complaint.

If fixing the header and the URL doesn't help, then please share any
errors found in solr.log after that change is made.  There may be better
information there than you receive in the response.
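
With Shawn's two fixes applied, the original command would look like the sketch
below. The port number is elided in the original message, so :8983 here is an
assumption, and norm.jar must exist in the working directory:

```shell
# Space added after "Content-Type:", trailing slash removed from the URL;
# the port :8983 is assumed, since it is elided in the original message.
curl -X POST -H 'Content-Type: application/octet-stream' \
     --data-binary @norm.jar \
     http://localhost:8983/solr/.system/blob/norm
```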

Thanks,
Shawn



Re: Can't recover - HDFS

2018-07-03 Thread Shawn Heisey
On 7/3/2018 6:55 AM, Joe Obernberger wrote:
> I think the root issue is related to some weirdness with HDFS. Log
> file is here:
> http://lovehorsepower.com/solr.log.4
> Config is here:
> http://lovehorsepower.com/solrconfig.xml
> I don't see anything set to 20 seconds.
>
> I believe the root exception is:
>
> org.apache.hadoop.ipc.RemoteException(java.io.IOException): File
> /solr7.1.0/UNCLASS_30DAYS/core_node-1684300827/data/tlog/tlog.0008930
> could only be replicated to 0 nodes instead of minReplication (=1). 
> There are 41 datanode(s) running and no node(s) are excluded in this
> operation.

That does look like what's causing all the errors.  This is a purely
hadoop/hdfs exception.  There are no Solr classes in the "Caused by"
part of the exception.  If you have any hdfs experts in-house, you
should talk to them.  If not, you may need to find a hadoop mailing list.

Looking up the exception, I've seen a couple of answers that say when
this happens you have to format your datanode and lose all your data. 
Or it could be a configuration problem, a permission problem, or a disk
space problem.  Perhaps if I knew anything about HDFS, I could make
sense of the google search results.

The logs on your hadoop servers might have more information, but I do
not know how to interpret them.

Thanks,
Shawn



Re: Logging fails when starting Solr in Windows using solr.cmd

2018-07-03 Thread Shawn Heisey
On 7/3/2018 2:53 AM, ja...@jafurrer.ch wrote:
> I was intending to open an Issue in Jira when I read that I'm supposed
> to first contact this mailing list.
>
> Problem description
> ==
>
> System: Microsoft Windows 10 Enterprise Version 10.0.16299 Build 16299
>
> Steps to reproduce the problem:
> 1) Download solr-7.4.0.tgz
> 2) Unzip to C:\solr-7.4.0
> 3) No changes (configuration or otherwise) whatsoever
> 4) Open cmd.exe
> 5) Execute the following command: cd c:\solr-7.4.0\bin
> 6) Execute the following command: solr.cmd start -p 8983
> 7) The following console output appears:
>
> c:\solr-7.4.0\bin>solr.cmd start -p 8983
> ERROR StatusLogger Unable to access
> file:/c:/solr-7.4.0/server/file:c:/solr-7.4.0/server/scripts/cloud-scripts/log4j2.xml

I'm seeing the same behavior on Windows 7.  I started with the .zip
download, so the fact that you have the .tgz download is likely not a
factor.  The .zip is a better option for Windows -- it has correct line
endings for Windows in most files, and Windows knows how to extract it
without installing additional software.

This is looking to me like a probable windows-specific bug in log4j2.  I
have asked the log4j mailing list about it.  The solr.cmd script appears
to be functioning correctly and not producing the strange pathname shown
in the error, and the same parameter syntax (with the file: prefix) is
working correctly on Linux.

Erick, the config in cloud-scripts logs to stderr rather than files. 
I'm all for moving it to resources so we don't have to keep track of
logging config files in multiple locations, but it does need to be a
different config file specifically for command-line tools.  Perhaps
log4j2-cli.xml as the filename?

This is the first version of Solr that includes log4j2.  All previous
releases used log4j 1.2.x.  Operating systems like Linux and MacOS get a
lot more testing than Windows does.  It's not good that this problem
exists.  Thank you for finding a workaround.

Solr 7.4 is using the most current version of log4j that is currently
available.

If you would like to proceed, please feel free to open an issue in
Jira.  A suggested title for the issue would be "log4j exceptions during
startup on Windows".  Even though I think this is a bug in software
other than Solr, it is a problem that Solr is experiencing, so we need
to track it and make sure we fix it.  It might end up being a two-part
fix, where we initially apply your workaround and then later revert that
change and upgrade log4j.

Thanks,
Shawn



Re: Block Join Child Query returns incorrect result

2018-07-03 Thread Mikhail Khludnev
Hello.

{!parent} always searches for parents; an improvement is in progress,
but you need to use [child] or [subquery] to see children.
If the search result is not what you expect, add the debug=true param to
see the matching details.
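
Putting that advice together, a request that matches on the third-level child
and also returns the child documents could look like the sketch below. The
path and field values are taken from the question; treating this as the exact
fix is not guaranteed:

```
q={!parent which="path:1.Project" v="path:3.Project.Submission.Agency AND Agency_Cd:ZXC"}
&fl=*,[child parentFilter=path:1.Project]
&debug=true
```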

On Mon, Jul 2, 2018 at 10:41 PM kristaclaire14 
wrote:

> Hi,
>
> I'm having a problem in my solr when querying third level child documents.
> I
> want to retrieve parent documents that have specific third level child
> documents. The example data is:
>
> [{
> "id":"1001"
> "path":"1.Project",
> "Project_Title":"Sample Project",
> "_childDocuments_":[
> {
> "id":"2001",
> "path":"2.Project.Submission",
> "Submission_No":"1234-QWE",
> "_childDocuments_":[
> {
> "id":"3001",
> "path":"3.Project.Submission.Agency",
> "Agency_Cd":"QWE"
> }
> ]
> }]
> }, {
> "id":"1002"
> "path":"1.Project",
> "Project_Title":"Test Project QWE",
> "_childDocuments_":[
> {
> "id":"2002",
> "path":"2.Project.Submission",
> "Submission_No":"4567-AGY",
> "_childDocuments_":[
> {
> "id":"3002",
> "path":"3.Project.Submission.Agency",
> "Agency_Cd":"AGY"
> }]
> }]
> }]
>
> I want to retrieve the parent with *Agency_Cd:ZXC* in third level child
> document.
> So far, this is my query:
> q={!parent which="path:1.Project" v="path:3.Project.Submission.Agency AND
> Agency_Cd:ZXC"}
>
> My expected result is 0, but Solr returns parents with no matching child
> documents. Am I doing something wrong in the query?
> Thanks in advance.
>
>
>
> --
> Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html
>


-- 
Sincerely yours
Mikhail Khludnev


Re: Solrcloud collection sharding and querying

2018-07-03 Thread Erick Erickson
bq.  I am trying to see how sharding can be employed to
improve the query performance by adding the route to a shard based on a
field in schema.xml.

This is actually straightforward, just create the collections with
implicit routing.
See the collections API for CREATE and the "Document Routing" section of
the reference guide.

That said, for an index that size I suspect you have other problems, and
splitting into shards isn't a long-term solution. I regularly see 200G
indexes in the wild (admittedly on some fairly beefy machines) that
return sub-second response times.

FWIW,
Erick
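
The implicit-routing setup Erick describes could look like the sketch below.
The collection name, shard names, and routing field are illustrative, not from
the original thread:

```
# Create a collection with explicitly named shards; with router.name=implicit,
# each document lands on the shard named by the value of its router.field:
/admin/collections?action=CREATE&name=mycoll&router.name=implicit
    &shards=shardA,shardB&router.field=region

# A query can then be restricted to a single shard with the _route_ param:
/solr/mycoll/select?q=*:*&_route_=shardA
```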

On Mon, Jul 2, 2018 at 7:43 PM, Sushant Vengurlekar
 wrote:
> We have two collections which are 21G and constantly growing. The index on
> one of them is also 12G. I am trying to see how sharding can be employed to
> improve the query performance by adding the route to a shard based on a
> field in schema.xml. So I am trying to figure out how to split the
> collections into shards based on this one field and then query them further
> by routing the query to a particular shard based on this field.
>
> Thank you
>
> On Mon, Jul 2, 2018 at 7:36 PM, Erick Erickson 
> wrote:
>
>> This seems like an "XY problem". _Why_ do you want to do this?
>> Has your collection outgrown one shard and you feel you have to
>> split it? Sharding should only be used when you can't host your
>> entire collection on a single replica and still get adequate performance.
>>
>> When you do reach that point, the usual process is to just
>> decide how many shards you need and let Solr do the rest
>> of the work. Why do you think you need to specify how docs
>> are routed based on some field?
>>
>> Best,
>> Erick
>>
>> On Mon, Jul 2, 2018 at 6:06 PM, Sushant Vengurlekar
>>  wrote:
>> > I want to split a collection based on one field. How do I do it and then
>> > query based off that.
>> >
>> > Ex: collection1. Field to split off col1
>> >
>> > Thank you
>>


Re: please unsubscribe

2018-07-03 Thread Erick Erickson
Please follow the instructions here:
http://lucene.apache.org/solr/community.html#mailing-lists-irc

You must use the _exact_ same e-mail as you used to subscribe.

If the initial try doesn't work and following the suggestions at the
"problems" link doesn't work for you, let us know. But note you need
to show us the _entire_ return header to allow anyone to diagnose the
problem.

Best,
Erick

On Mon, Jul 2, 2018 at 9:11 PM, Karl Hampel  wrote:
>


Re: Logging fails when starting Solr in Windows using solr.cmd

2018-07-03 Thread Erick Erickson
Jakob:

I don't have Windows so rely on people who do to vet changes. But I'm
working on https://issues.apache.org/jira/browse/SOLR-12008 which
seeks to get rid of the confusing number of log4j config files and
solr.cmd has a lot of paths to change. So if you do get agreement that
you should raise a JIRA would you please link it to 12008? The "More"
drop-down has a "Link" option.

The (tentative) place where the "one true" log4j config file will
reside is in ...server/resources, so if people work this script over
using that one would help... And if it's impossible to use that one
that'd also be good information.

Thanks,
Erick

On Tue, Jul 3, 2018 at 1:53 AM,   wrote:
> Hi,
>
> I was intending to open an Issue in Jira when I read that I'm supposed to
> first contact this mailing list.
>
> Problem description
> ==
>
> System: Microsoft Windows 10 Enterprise Version 10.0.16299 Build 16299
>
> Steps to reproduce the problem:
> 1) Download solr-7.4.0.tgz
> 2) Unzip to C:\solr-7.4.0
> 3) No changes (configuration or otherwise) whatsoever
> 4) Open cmd.exe
> 5) Execute the following command: cd c:\solr-7.4.0\bin
> 6) Execute the following command: solr.cmd start -p 8983
> 7) The following console output appears:
>
>
> c:\solr-7.4.0\bin>solr.cmd start -p 8983
> ERROR StatusLogger Unable to access
> file:/c:/solr-7.4.0/server/file:c:/solr-7.4.0/server/scripts/cloud-scripts/log4j2.xml
>  java.io.FileNotFoundException:
> c:\solr-7.4.0\server\file:c:\solr-7.4.0\server\scripts\cloud-scripts\log4j2.xml
> (The filename, directory name, or volume label syntax is incorrect)
> at java.io.FileInputStream.open0(Native Method)
> at java.io.FileInputStream.open(FileInputStream.java:195)
> at java.io.FileInputStream.<init>(FileInputStream.java:138)
> at java.io.FileInputStream.<init>(FileInputStream.java:93)
> at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
> at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
> at java.net.URL.openStream(URL.java:1045)
> at org.apache.logging.log4j.core.config.ConfigurationSource.fromUri(ConfigurationSource.java:247)
> at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:404)
> at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:346)
> at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:260)
> at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:615)
> at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:636)
> at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:231)
> at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
> at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
> at org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
> at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:121)
> at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
> at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:46)
> at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)
> at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:358)
> at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
> at org.apache.solr.util.SolrCLI.<clinit>(SolrCLI.java:228)
> ERROR StatusLogger Unable to access
> file:/c:/solr-7.4.0/server/file:c:/solr-7.4.0/server/resources/log4j2.xml
>  java.io.FileNotFoundException:
> c:\solr-7.4.0\server\file:c:\solr-7.4.0\server\resources\log4j2.xml (Die
> Syntax für den Dateinamen, Verzeichnisnamen oder die Datenträgerbezeichnung
> ist falsch)
> at java.io.FileInputStream.open0(Native Method)
> at java.io.FileInputStream.open(FileInputStream.java:195)
> at java.io.FileInputStream.(FileInputStream.java:138)
> at java.io.FileInputStream.(FileInputStream.java:93)
> at
> sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
> at
> sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
> at java.net.URL.openStream(URL.java:1045)
> at
> org.apache.logging.log4j.core.config.ConfigurationSource.fromUri(ConfigurationSource.java:247)
> at
> org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:404)
> at
> 

Re: A user defined request handler is failing to fetch the data.

2018-07-03 Thread Shawn Heisey
On 7/3/2018 3:09 AM, Adarsh_infor wrote:
> the below command works:
>
> http://localhost:8983/solr/FI_idx/select?q=*:*&distrib=true&shards=localhost:8983/solr/FI_idx
>
> but the same will not work with the new /filesearch search handler:
>
> http://localhost:8983/solr/FI_idx/filesearch?q=*:*&distrib=true&shards=localhost:8983/solr/FI_idx
>
> This gives me an error, so I am trying to figure out what difference could
> have caused the failure of the second HTTP command.
>
> And the second HTTP command works fine if I change the luceneMatchVersion
> in solrconfig to LUCENE_40, which is totally weird behaviour, because the
> indexing happened with luceneMatchVersion 6.6.3, yet search works fine
> with LUCENE_40.

I think I know what's going on here.

The reason that 4.0 compatibility works is this issue that takes effect
without it:

https://issues.apache.org/jira/browse/SOLR-6311

What this issue does internally is turn your second URL into this (split
into two lines for readability):

http://localhost:8983/solr/FI_idx/filesearch?q=*:*&distrib=true
&shards=localhost:8983/solr/FI_idx&shards.qt=/filesearch

Before SOLR-6311 (Solr 5.0 and earlier), shard subrequests would always
go to /select.  Starting with version 5.1, shard subrequests go to the
same handler that received the request.

I bet your /select handler does NOT have a definition for the shards
parameter.  So when issuing the first request, or the second request
with 4.0 compatibility, the initial request has the shards parameter,
the subrequest goes to the /select handler, and everything's good.

With the second request and the 6.x version behavior, you make an
initial request, and include a shards parameter.  The subrequest made by
the shards parameter you included on the URL goes to the same handler
(/filesearch), which has a shards parameter in its definition.  That
shards parameter sends the request AGAIN to /filesearch.  And again. 
And again.  This repeats over and over until the java virtual machine
runs out of some resource (might be heap, might be stack space, might be
something else) and can't follow the rabbit hole down any more.

If you were to include the following parameter on the second request, it
should work:

shards.qt=/select

Thanks,
Shawn
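For readers hitting the same loop, here is a minimal solrconfig.xml sketch of the fix Shawn describes. The handler name, shards value, and default query come from this thread, but the archive stripped the original XML tags, so treat this as a reconstruction rather than the poster's verbatim config:

```xml
<!-- Reconstructed /filesearch handler: shards.qt pins shard subrequests
     to /select so they cannot recurse back into /filesearch (the post-5.1
     behavior introduced by SOLR-6311). -->
<requestHandler name="/filesearch" class="solr.SearchHandler">
  <lst name="defaults">
    <str name="q">*:*</str>
    <str name="shards">localhost:8983/solr/FI_idx</str>
    <str name="shards.qt">/select</str>
  </lst>
</requestHandler>
```

With shards.qt set, the distributed request still fans out to the listed shards, but each subrequest is handled by /select, which carries no shards default of its own, so the recursion stops after one hop.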



RE: Filtering solr suggest results

2018-07-03 Thread Peter Lancaster
Hi Arunan,

You can use a context filter query as described at
https://lucene.apache.org/solr/guide/6_6/suggester.html

Cheers,
Peter.
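As an illustration, a sketch of a context-filtered suggester; the field names ("title", "publication") and the suggester name are invented for this example and are not from Arunan's schema:

```xml
<!-- Hypothetical suggester with a context field used for filtering. -->
<searchComponent name="suggest" class="solr.SuggestComponent">
  <lst name="suggester">
    <str name="name">titleSuggester</str>
    <str name="lookupImpl">AnalyzingInfixLookupFactory</str>
    <str name="dictionaryImpl">DocumentDictionaryFactory</str>
    <str name="field">title</str>
    <str name="contextField">publication</str>
    <str name="suggestAnalyzerFieldType">text_general</str>
  </lst>
</searchComponent>
```

At query time the filter goes in the suggest.cfq parameter, e.g. /suggest?suggest=true&suggest.dictionary=titleSuggester&suggest.q=harry&suggest.cfq=penguin, which restricts suggestions to documents whose publication context matches.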

-Original Message-
From: Arunan Sugunakumar [mailto:arunans...@cse.mrt.ac.lk]
Sent: 03 July 2018 12:17
To: solr-user@lucene.apache.org
Subject: Filtering solr suggest results

Hi,

I would like to know whether it is possible to filter the suggestions returned 
by the suggest component according to a field. For example I have a list of 
books published by different publications. I want to show suggestions for a 
book title under a specific publication.

Thanks in Advance,

Arunan

*Sugunakumar Arunan*
Undergraduate - CSE | UOM

Email : aruna ns...@cse.mrt.ac.lk


This message is confidential and may contain privileged information. You should 
not disclose its contents to any other person. If you are not the intended 
recipient, please notify the sender named above immediately. It is expressly 
declared that this e-mail does not constitute nor form part of a contract or 
unilateral obligation. Opinions, conclusions and other information in this 
message that do not relate to the official business of findmypast shall be 
understood as neither given nor endorsed by it.


__

This email has been checked for virus and other malicious content prior to 
leaving our network.
__


Re: Can't recover - HDFS

2018-07-03 Thread Joe Obernberger

Thank you Shawn -

I think the root issue is related to some weirdness with HDFS. Log file 
is here:

http://lovehorsepower.com/solr.log.4
Config is here:
http://lovehorsepower.com/solrconfig.xml
I don't see anything set to 20 seconds.

I believe the root exception is:

org.apache.hadoop.ipc.RemoteException(java.io.IOException): File 
/solr7.1.0/UNCLASS_30DAYS/core_node-1684300827/data/tlog/tlog.0008930 
could only be replicated to 0 nodes instead of minReplication (=1).  
There are 41 datanode(s) running and no node(s) are excluded in this 
operation.
    at 
org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget4NewBlock(BlockManager.java:1724)
    at 
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:3449)
    at 
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:692)
    at 
org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.addBlock(AuthorizationProviderProxyClientProtocol.java:217)
    at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:506)
    at 
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
    at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)

    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2281)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2277)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)

    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2275)

    at org.apache.hadoop.ipc.Client.call(Client.java:1504)
    at org.apache.hadoop.ipc.Client.call(Client.java:1441)
    at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)

    at com.sun.proxy.$Proxy11.addBlock(Unknown Source)
    at 
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.addBlock(ClientNamenodeProtocolTranslatorPB.java:423)

    at sun.reflect.GeneratedMethodAccessor28.invoke(Unknown Source)
    at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

    at java.lang.reflect.Method.invoke(Method.java:498)
    at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
    at 
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)

    at com.sun.proxy.$Proxy12.addBlock(Unknown Source)
    at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.locateFollowingBlock(DFSOutputStream.java:1860)
    at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1656)
    at 
org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:790)
2018-07-02 14:50:24.949 ERROR (indexFetcher-41-thread-1) 
[c:UNCLASS_30DAYS s:shard37 r:core_node-1684300827 
x:UNCLASS_30DAYS_shard37_replica_t-1246382645] 
o.a.s.h.ReplicationHandler Exception in fetching index

org.apache.solr.common.SolrException: Error logging add
    at 
org.apache.solr.update.TransactionLog.write(TransactionLog.java:420)

    at org.apache.solr.update.UpdateLog.add(UpdateLog.java:535)
    at org.apache.solr.update.UpdateLog.add(UpdateLog.java:519)
    at 
org.apache.solr.update.UpdateLog.copyOverOldUpdates(UpdateLog.java:1213)
    at 
org.apache.solr.update.UpdateLog.copyAndSwitchToNewTlog(UpdateLog.java:1168)
    at 
org.apache.solr.update.UpdateLog.copyOverOldUpdates(UpdateLog.java:1155)
    at 
org.apache.solr.cloud.ReplicateFromLeader.lambda$startReplication$0(ReplicateFromLeader.java:100)
    at 
org.apache.solr.handler.ReplicationHandler.lambda$setupPolling$12(ReplicationHandler.java:1160)
    at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)

    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
    at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
    at 
java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
    at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)

    at java.lang.Thread.run(Thread.java:748)

Thank you very much for the help!

-Joe


On 7/2/2018 8:32 PM, Shawn Heisey wrote:

On 7/2/2018 1:40 PM, Joe Obernberger wrote:

Hi All - having this same 

Errors when using Blob API

2018-07-03 Thread Zahra Aminolroaya
I want to transfer my jar files to my ".system" collection in SolrCloud.
One of my Solr ports is 

My jar file name is "norm", and the following is my command for this
transfer:

curl -X POST -H 'Content-Type:application/octet-stream' --data-binary
@norm.jar http://localhost:/solr/.system/blob/norm

*However, I get the following error:*

org.apache.solr.common.SolrException: URLDecoder: Invalid character encoding detected after position 0
of query string / form data (while parsing as UTF-8) (HTTP 400)

It is surprising that when I try to transfer the lucene jar files I also get
different errors as follows:

*for example, when I write the command:*

curl -X POST -H 'Content-Type:application/octet-stream' --data-binary
@lucene-core-6.6.1.jar http://localhost:/solr/.system/blob/lucence

*I get the following error:*

org.apache.solr.common.SolrException: application/x-www-form-urlencoded content length (2783509 bytes)
exceeds upload limit of 2048 KB (HTTP 400)

*or when I use the following command:*

curl -X POST -H 'Content-Type:application/octet-stream' --data-binary
@slf4j-api-1.7.7.jar http://localhost:/solr/.system/blob/slf

*I get the following error:*

org.apache.solr.common.SolrException: URLDecoder: Invalid digit (#19;) in escape (%) pattern (HTTP 400)

Why do I get these errors even when I use the default Lucene jar files?!

Is there any other way to insert jar files to .system collection?

--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html


RE: 7.3 appears to leak

2018-07-03 Thread Markus Jelsma
Hello Erick,

Even the silliest ideas may help us, but unfortunately this is not the case. 
All our Solr nodes run binaries from the same source from our central build 
server, with the same libraries thanks to provisioning. Only schema and config 
are different, but the <lib> directive is the same all over.

Are there any other ideas, speculations, whatever, on why only our main text 
collection leaks a SolrIndexSearcher instance on commit since 7.3.0 and every 
version up?

Many thanks,
Markus
 
-Original message-
> From:Erick Erickson 
> Sent: Friday 29th June 2018 19:34
> To: solr-user 
> Subject: Re: 7.3 appears to leak
> 
> This is truly puzzling then, I'm clueless. It's hard to imagine this
> is lurking out there and nobody else notices, but you've eliminated
> the custom code. And this is also very peculiar:
> 
> * it occurs only in our main text search collection, all other
> collections are unaffected;
> * despite what i said earlier, it is so far unreproducible outside
> production, even when mimicking production as good as we can;
> 
> Here's a tedious idea. Restart Solr with the -v option, I _think_ that
> shows you each and every jar file Solr loads. Is it "somehow" possible
> that your main collection is loading some jar from somewhere that's
> different than you expect? 'cause silly ideas like this are all I can
> come up with.
> 
> Erick
> 
> On Fri, Jun 29, 2018 at 9:56 AM, Markus Jelsma
>  wrote:
> > Hello Erick,
> >
> > The custom search handler doesn't interact with SolrIndexSearcher, this is 
> > really all it does:
> >
> >   public void handleRequestBody(SolrQueryRequest req, SolrQueryResponse 
> >rsp) throws Exception {
> > super.handleRequestBody(req, rsp);
> >
> > if (rsp.getToLog().get("hits") instanceof Integer) {
> >   rsp.addHttpHeader("X-Solr-Hits", 
> >String.valueOf((Integer)rsp.getToLog().get("hits")));
> > }
> > if (rsp.getToLog().get("hits") instanceof Long) {
> >   rsp.addHttpHeader("X-Solr-Hits", 
> >String.valueOf((Long)rsp.getToLog().get("hits")));
> > }
> >   }
> >
> > I am not sure this qualifies as one more to go.
> >
> > Re: compiler warnings on resources, yes! This and tests failing due to 
> > resources leaks have always warned me when i forgot to release something or 
> > decrement a reference. But except for the above method (and the token 
> > filters which i really can't disable) are all that is left.
> >
> > I am quite desperate about this problem so although i am unwilling to 
> > disable stuff, i can do it if i must. But i see no reason, yet, to remove the 
> > search handler or the token filter stuff, i mean, how could those leak a 
> > SolrIndexSearcher?
> >
> > Let me know :)
> >
> > Many thanks!
> > Markus
> >
> > -Original message-
> >> From:Erick Erickson 
> >> Sent: Friday 29th June 2018 18:46
> >> To: solr-user 
> >> Subject: Re: 7.3 appears to leak
> >>
> >> bq. The only custom stuff left is an extension of SearchHandler that
> >> only writes numFound to the response headers.
> >>
> >> Well, one more to go ;). It's incredibly easy to overlook
> >> innocent-seeming calls that increment the underlying reference count
> >> of some objects but don't decrement them, usually through a close
> >> call. Which isn't necessarily a close if the underlying reference
> >> count is still > 0.
> >>
> >> You may infer that I've been there and done that ;). Sometime the
> >> compiler warnings about "resource leak" can help pinpoint those too.
> >>
> >> Best,
> >> Erick
> >>
> >> On Fri, Jun 29, 2018 at 9:16 AM, Markus Jelsma
> >>  wrote:
> >> > Hello Yonik,
> >> >
> >> > I took one node of the 7.2.1 cluster out of the load balancer so it 
> >> > would only receive shard queries, this way i could kind of 'safely' 
> >> > disable our custom components one by one, while keeping functionality in 
> >> > place by letting the other 7.2.1 nodes continue on with the full 
> >> > configuration.
> >> >
> >> > I am now at a point where literally all custom components are deleted or 
> >> > commented out in the config for the node running 7.4. The only custom 
> >> > stuff left is an extension of SearchHandler that only writes numFound to 
> >> > the response headers, and all the token filters in our schema.
> >> >
> >> > You were right, it was leaking exactly one SolrIndexSearcher instance on 
> >> > each commit. But, with all our stuff gone, the leak is still there! I 
> >> > triple checked it! Of course, the bastard is locally still not 
> >> > reproducible.
> >> >
> >> > So, what is next? I have no clues left.
> >> >
> >> > Many, many thanks,
> >> > Markus
> >> >
> >> > -Original message-
> >> >> From:Markus Jelsma 
> >> >> Sent: Thursday 28th June 2018 23:52
> >> >> To: solr-user@lucene.apache.org
> >> >> Subject: RE: 7.3 appears to leak
> >> >>
> >> >> Hello Yonik,
> >> >>
> >> >> If leaking a whole SolrIndexSearcher would cause this problem, then the 
> >> >> only custom component would be our copy/paste-and-enhance version 

Filtering solr suggest results

2018-07-03 Thread Arunan Sugunakumar
Hi,

I would like to know whether it is possible to filter the suggestions
returned by the suggest component according to a field. For example I have
a list of books published by different publications. I want to show
suggestions for a book title under a specific publication.

Thanks in Advance,

Arunan

*Sugunakumar Arunan*
Undergraduate - CSE | UOM

Email : aruna ns...@cse.mrt.ac.lk


Re: Server refused connection at: http://localhost:xxxx/solr/collectionName

2018-07-03 Thread Ritesh Kumar
I think this might be the problem. I was casting CloudSolrClient object or
HttpSolrClient object to SolrClient inside the indexing service and
performed add or query operation on this SolrClient object.

I had to cast the client object because the client object may be of any
type based on whether the Solr is running in cloud mode or stand-alone mode.

For now, I am using the CloudSolrClient.  Everything seems to be running
fine. I have not even closed the client object after execution.

I am still not sure if casting the client object to SolrClient was the
issue.

I am still looking for a way to run my
application in both Solr modes with the same piece of code, barring the
construction of the client object.



On Mon, Jul 2, 2018 at 7:49 PM Erick Erickson 
wrote:

> Given your other e-mail I suspect you're not closing the client
> and creating new ones for every update request.
>
> You should simply not run out of connections, your client is
> most probably incorrect.
>
> Best,
> Erick
>
> On Mon, Jul 2, 2018 at 3:38 AM, Ritesh Kumar
>  wrote:
> > I could get the live Solr nodes using this piece of code
> >
> > ZkStateReader zkStateReader = client.getZkStateReader();
> > ClusterState clusterState = zkStateReader.getClusterState();
> > Set<String> liveNodes = clusterState.getLiveNodes();
> >
> > This way, I will be able to send a query to one of the live nodes and
> > Zookeeper will take care of the rest, but, I was wondering if this is a
> > good practice to query from SolrCloud.
> >
> > What if the Solr node goes down in the middle of bulk indexing.
> >
> > On Mon, Jul 2, 2018 at 3:37 PM Ritesh Kumar <
> ritesh.ku...@hotwaxsystems.com>
> > wrote:
> >
> >> I did use CloudSolrClient to query or index data. I did not have to check
> >> which Solr node is active. The problem I am facing during bulk indexing is
> >> that the Zookeeper runs out of connections, resulting in a Connection Timeout
> >> error.
> >>
> >> How can I get to know in advance the active Solr nodes? Any reference
> >> would be helpful.
> >>
> >> Thanks
> >>
> >> On Mon, Jul 2, 2018 at 2:36 PM Yasufumi Mizoguchi <
> yasufumi0...@gmail.com>
> >> wrote:
> >>
> >>> Hi,
> >>>
> >>> I think ZooKeeper can not notice requests to dead nodes, if you send
> >>> requests to Solr nodes directly.
> >>> It will be better to ask ZooKeeper which Solr nodes are running
> >>> before requesting Solr nodes with CloudSolrClient etc...
> >>>
> >>> Thanks,
> >>> Yasufumi
> >>>
> >>> Mon, Jul 2, 2018 16:49 Ritesh Kumar :
> >>>
> >>> > Hello Team,
> >>> >
> >>> > I have two Solr nodes running in cloud mode. I know that we send queries
> >>> > and updates directly to Solr's collection, e.g.
> >>> > http://host:port/solr/<collection>. Any of the Solr nodes can be used. If
> >>> > the node does not have the collection being queried then the request will
> >>> > be forwarded internally to a Solr instance which has that collection.
> >>> >
> >>> > But, my question is what happens when the node being queried is down. I am
> >>> > getting this error: Server refused connection at
> >>> > http://localhost:xxxx/solr/collectionName.
> >>> >
> >>> > Does not Zookeeper handle this scenario?
> >>> >
> >>> > Everything is fine when the node being queried is running. I am able to
> >>> > index and fetch data.
> >>> >
> >>> > Please, help me.
> >>> >
> >>> > Best,
> >>> > Ritesh Kumar
> >>> >
> >>>
> >>
>


Re: Scores with Solr Suggester

2018-07-03 Thread Christian Ortner
Hi Christine,

suggesters work differently than regular search as they complete an input
query, usually based on a state machine built from a dictionary. If you
want the similarity of input and suggestion, you can create a search
component to compute it yourself and set the value in the payload field. If
suggestions should be returned in a re-ordered way already, it would have
to be done in such a search component as well.
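For context, the payload Christian refers to is configured on the suggester itself; below is a hypothetical sketch (the field names are assumptions, not from Christine's config) showing where a value computed by a custom search component could surface, via a stored field exposed as the suggestion payload:

```xml
<!-- Hypothetical suggester exposing a stored field as the suggestion
     payload; "popularity" stands in for whatever value a custom
     component computes or stores per document. -->
<searchComponent name="suggest" class="solr.SuggestComponent">
  <lst name="suggester">
    <str name="name">mySuggester</str>
    <str name="lookupImpl">AnalyzingInfixLookupFactory</str>
    <str name="dictionaryImpl">DocumentDictionaryFactory</str>
    <str name="field">title</str>
    <str name="payloadField">popularity</str>
  </lst>
</searchComponent>
```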

Cheers,
Chris

On Mon, Jul 2, 2018 at 8:31 PM, Buckler, Christine <
christine.buck...@nordstrom.com> wrote:

> Is it possible to return a score field for Suggester results like it does
> with standard search? I am looking for the score which quantifies how close
> of a match between type entered and suggestion result (not the weight
> associated with the suggestion). Is this possible?
>
>
>
> Christine Buckler
>
> [image: id:image001.png@01D3F81F.AF489300]christinebuckler
>
> [image: id:image002.png@01D3F81F.AF489300]206.295.6772
>


Re: A user defined request handler is failing to fetch the data.

2018-07-03 Thread Adarsh_infor
@Erick @Shawn

Adding to  my previous comment. 


the below command works:

http://localhost:8983/solr/FI_idx/select?q=*:*&distrib=true&shards=localhost:8983/solr/FI_idx

but the same will not work with the new /filesearch search handler:

http://localhost:8983/solr/FI_idx/filesearch?q=*:*&distrib=true&shards=localhost:8983/solr/FI_idx

This gives me an error, so I am trying to figure out what difference could
have caused the failure of the second HTTP command.

And the second HTTP command works fine if I change the luceneMatchVersion in
solrconfig to LUCENE_40, which is totally weird behaviour, because the
indexing happened with luceneMatchVersion 6.6.3, yet search works fine with
LUCENE_40.

Thanks







Logging fails when starting Solr in Windows using solr.cmd

2018-07-03 Thread jakob

Hi,

I was intending to open an issue in Jira when I read that I'm supposed
to first contact this mailing list.


Problem description
==

System: Microsoft Windows 10 Enterprise Version 10.0.16299 Build 16299

Steps to reproduce the problem:
1) Download solr-7.4.0.tgz
2) Unzip to C:\solr-7.4.0
3) No changes (configuration or otherwise) whatsoever
4) Open cmd.exe
5) Execute the following command: cd c:\solr-7.4.0\bin
6) Execute the following command: solr.cmd start -p 8983
7) The following console output appears:


c:\solr-7.4.0\bin>solr.cmd start -p 8983
ERROR StatusLogger Unable to access 
file:/c:/solr-7.4.0/server/file:c:/solr-7.4.0/server/scripts/cloud-scripts/log4j2.xml
 java.io.FileNotFoundException: 
c:\solr-7.4.0\server\file:c:\solr-7.4.0\server\scripts\cloud-scripts\log4j2.xml 
(Die Syntax für den Dateinamen, Verzeichnisnamen oder die 
Datenträgerbezeichnung ist falsch, i.e. "The filename, directory name, or volume label syntax is incorrect")

at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at java.io.FileInputStream.<init>(FileInputStream.java:93)
at 
sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
at 
sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)

at java.net.URL.openStream(URL.java:1045)
at 
org.apache.logging.log4j.core.config.ConfigurationSource.fromUri(ConfigurationSource.java:247)
at 
org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:404)
at 
org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:346)
at 
org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:260)
at 
org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:615)
at 
org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:636)
at 
org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:231)
at 
org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
at 
org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
at 
org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at 
org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:121)
at 
org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:43)
at 
org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:46)
at 
org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:29)

at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:358)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:383)
at org.apache.solr.util.SolrCLI.<clinit>(SolrCLI.java:228)
ERROR StatusLogger Unable to access 
file:/c:/solr-7.4.0/server/file:c:/solr-7.4.0/server/resources/log4j2.xml
 java.io.FileNotFoundException: 
c:\solr-7.4.0\server\file:c:\solr-7.4.0\server\resources\log4j2.xml (Die 
Syntax für den Dateinamen, Verzeichnisnamen oder die 
Datenträgerbezeichnung ist falsch)

at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at java.io.FileInputStream.<init>(FileInputStream.java:93)
at 
sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
at 
sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)

at java.net.URL.openStream(URL.java:1045)
at 
org.apache.logging.log4j.core.config.ConfigurationSource.fromUri(ConfigurationSource.java:247)
at 
org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:404)
at 
org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:346)
at 
org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:260)
at 
org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:615)
at 
org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:636)
at 
org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:231)
at 
org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:153)
at 
org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:45)
at 
org.apache.logging.log4j.LogManager.getContext(LogManager.java:194)
at 
org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:121)
at 

Re: A user defined request handler is failing to fetch the data.

2018-07-03 Thread Adarsh_infor
@Shawn Heisey-2

When we say recursive shards, what does that mean?  My distributed node will
not have any data in it; it will just be used for searching all the
shards (nodes) where the documents are indexed, and for getting consolidated
data from them. My only problem here is: if I change the <luceneMatchVersion>
to LUCENE_40, everything seems to work fine, but if we change that to
6.6.3 or LUCENE_CURRENT it starts breaking. So does that mean that
distributed search is not supported from Lucene 6.* versions?

Thanks 



