Re: Doing calculation for fields indexed in StandardTokenizer

2018-02-18 Thread Zheng Lin Edwin Yeo
Hi,

This is about the Aggregation Functions in the Solr Facet Functions, which I
understand only work on numeric fields. But if the field is already
indexed as string/text, is it possible for the field to be mapped to
integer/float without re-indexing?

Regards,
Edwin


On 18 February 2018 at 09:20, Zheng Lin Edwin Yeo 
wrote:

> Hi Sir,
>
> We have a field that has been extracted by Tika under the attr_* dynamic
> field, which is indexed using StandardTokenizer.
>
> Now we have a field called attr_stream_size on which we want to do some
> calculation (e.g. sum) using JSON Facet. However, as the field is not
> indexed as integer/float, is it possible to still do the calculation using
> JSON Facet, or can the field be mapped to integer/float without re-indexing?
>
> Regards,
> Edwin
>
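The sum asked about above would look like this in a JSON Facet request body, assuming the field were numeric, which is exactly the sticking point of the thread. A sketch only; the facet name "total_size" and the match-all query are illustrative:

```python
import json

# Sketch of a JSON Facet request body computing a sum over a numeric
# field. "attr_stream_size" is the field from the thread; the facet
# name "total_size" is made up. sum() only works if the field is
# indexed as a numeric type (or has numeric docValues), which is why
# a string/text field has to be re-indexed first.
facet_request = {
    "query": "*:*",
    "limit": 0,
    "facet": {
        "total_size": "sum(attr_stream_size)"
    }
}

body = json.dumps(facet_request)
print(body)
```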


Re: solr cloud unique key query request is sent to all shards!

2018-02-18 Thread Tomas Fernandez Lobbe
In real-time get, the parameter name is “id”, regardless of the name of the 
unique key. 

The request should be in your case: 
http://:8080/api/collections/col1/get?id=69749398

See: https://lucene.apache.org/solr/guide/7_2/realtime-get.html
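The point above, that real-time get always takes the parameter name "id" no matter what the uniqueKey field is called, can be captured in a tiny URL-builder sketch (the helper name is mine, not part of any Solr client library):

```python
def realtime_get_url(base_url, collection, doc_id):
    # Real-time get always uses the parameter name "id", even when the
    # schema's uniqueKey field has a different name (e.g. "myid").
    # v2 API path shape; the helper itself is illustrative only.
    return f"{base_url}/api/collections/{collection}/get?id={doc_id}"

url = realtime_get_url("http://localhost:8080", "col1", "69749398")
print(url)  # http://localhost:8080/api/collections/col1/get?id=69749398
```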

Sent from my iPhone

> On Feb 18, 2018, at 9:28 PM, Ganesh Sethuraman  
> wrote:
> 
> I tried this real-time get on my collection using both the V1 and V2 URLs,
> but it did not work!
>
> http://:8080/api/collections/col1/get?myid:69749398
>
> It returned:
>
> {
>   "doc":null}
>
> Same issue with the V1 URL as well: http://
> :8080/solr/col1/get?myid:69749398
>
> However, if I do q=myid:69749398 with the "select" request handler it
> seems to work fine. I checked my schema again and it is configured
> correctly, like below:
>
> <uniqueKey>myid</uniqueKey>
>
> I also see that the implicit request handler is configured correctly.
> Any thoughts on what I might be missing?
> 
> 
> 
> On Sun, Feb 18, 2018 at 11:18 PM, Tomas Fernandez Lobbe 
> wrote:
> 
>> I think real-time get should be directed to the correct shard. Try:
>> [COLLECTION]/get?id=[YOUR_ID]
>> 
>> Sent from my iPhone
>> 
>>> On Feb 18, 2018, at 3:17 PM, Ganesh Sethuraman 
>> wrote:
>>> 
>>> Hi
>>> 
>>> I am using Solr 7.2.1. I have 8 shards in two nodes (two different m/c)
>>> using Solr Cloud. The data was indexed with a unique key (default
>>> composite id) using the CSV update handler (batch indexing). Note that I
>>> do NOT have  while indexing. Then when I try to query the collection
>>> col1 based on my primary key (as below), I see in the 'debug' response
>>> that the query was sent to all the shards, and when it finds the
>>> document in one of the shards it sends a GET FIELD to that shard to get
>>> the data. The problem is potentially high response time and, more
>>> importantly, a scalability issue, as all shards are unnecessarily
>>> queried to get one document (by unique key).
>>>
>>> http://:8080/solr/col1/select?debug=true&q=id:69749278
>>>
>>> Is there a way to query to reach the right shard based on the hash of
>>> the unique key?
>>> 
>>> Regards
>>> Ganesh
>> 


Re: solr cloud unique key query request is sent to all shards!

2018-02-18 Thread Ganesh Sethuraman
I tried this real-time get on my collection using both the V1 and V2 URLs,
but it did not work!

http://:8080/api/collections/col1/get?myid:69749398

It returned:

{
  "doc":null}

Same issue with the V1 URL as well: http://
:8080/solr/col1/get?myid:69749398

However, if I do q=myid:69749398 with the "select" request handler it
seems to work fine. I checked my schema again and it is configured
correctly, like below:

<uniqueKey>myid</uniqueKey>

I also see that the implicit request handler is configured correctly.
Any thoughts on what I might be missing?



On Sun, Feb 18, 2018 at 11:18 PM, Tomas Fernandez Lobbe 
wrote:

> I think real-time get should be directed to the correct shard. Try:
> [COLLECTION]/get?id=[YOUR_ID]
>
> Sent from my iPhone
>
> > On Feb 18, 2018, at 3:17 PM, Ganesh Sethuraman 
> wrote:
> >
> > Hi
> >
> > I am using Solr 7.2.1. I have 8 shards in two nodes (two different m/c)
> > using Solr Cloud. The data was indexed with a unique key (default
> > composite id) using the CSV update handler (batch indexing). Note that I
> > do NOT have  while indexing. Then when I try to query the collection
> > col1 based on my primary key (as below), I see in the 'debug' response
> > that the query was sent to all the shards, and when it finds the
> > document in one of the shards it sends a GET FIELD to that shard to get
> > the data. The problem is potentially high response time and, more
> > importantly, a scalability issue, as all shards are unnecessarily
> > queried to get one document (by unique key).
> >
> > http://:8080/solr/col1/select?debug=true&q=id:69749278
> >
> > Is there a way to query to reach the right shard based on the hash of
> > the unique key?
> >
> > Regards
> > Ganesh
>


Re: solr cloud unique key query request is sent to all shards!

2018-02-18 Thread Tomas Fernandez Lobbe
I think real-time get should be directed to the correct shard. Try:  
[COLLECTION]/get?id=[YOUR_ID]

Sent from my iPhone

> On Feb 18, 2018, at 3:17 PM, Ganesh Sethuraman  
> wrote:
> 
> Hi
> 
> I am using Solr 7.2.1. I have 8 shards in two nodes (two different m/c)
> using Solr Cloud. The data was indexed with a unique key (default
> composite id) using the CSV update handler (batch indexing). Note that I
> do NOT have  while indexing. Then when I try to query the collection
> col1 based on my primary key (as below), I see in the 'debug' response
> that the query was sent to all the shards, and when it finds the document
> in one of the shards it sends a GET FIELD to that shard to get the data.
> The problem is potentially high response time and, more importantly, a
> scalability issue, as all shards are unnecessarily queried to get one
> document (by unique key).
>
> http://:8080/solr/col1/select?debug=true&q=id:69749278
>
> Is there a way to query to reach the right shard based on the hash of the
> unique key?
> 
> Regards
> Ganesh


Re: Solr streaming expression - options for Full Outer Join

2018-02-18 Thread Joel Bernstein
I forgot to mention: in order to do a join, you would merge together the
streams that you want to join, then reduce by the join key.
This is the basic structure:

reduce(merge(search(), search()))

Joel Bernstein
http://joelsolr.blogspot.com/

On Sun, Feb 18, 2018 at 10:45 PM, Joel Bernstein  wrote:

> If you aren't getting the join functionality you want with the current
> join implementations you could try the reduce function using the group
> operation.
>
> Here is the sample syntax:
>
> reduce(search(collection1, q=*:*, fl="id,a_s,a_i,a_f", sort="a_s asc, a_f 
> asc"),
>by="a_s",
>group(sort="a_f desc", n="4")
> )
>
>
> This is a basic map/reduce grouping operation.
>
>
> Joel Bernstein
> http://joelsolr.blogspot.com/
>
> On Sun, Feb 18, 2018 at 6:24 PM, GaneshSe  wrote:
>
>> Any help with the Solr streaming expression options is greatly appreciated.
>> Please help.
>>
>>
>>
>> --
>> Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html
>>
>
>
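The reduce(merge(search(), search())) pattern above can be sketched in Python: two streams, each already sorted by the join key, are merged and then grouped (reduced) by that key. The toy tuples are illustrative; keys present in either stream survive the grouping, which is what gives full-outer-join behaviour:

```python
from heapq import merge
from itertools import groupby
from operator import itemgetter

left = [("a", "L1"), ("b", "L2"), ("d", "L3")]    # stream 1, sorted by join key
right = [("a", "R1"), ("c", "R2"), ("d", "R3")]   # stream 2, sorted by join key

# merge(): interleave the two sorted streams into one stream, still
# ordered by the join key.
merged = merge(left, right, key=itemgetter(0))

# reduce(): group adjacent tuples that share a key, like reduce(..., by="a_s").
joined = {key: [value for _, value in group]
          for key, group in groupby(merged, key=itemgetter(0))}

print(joined)
```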


Re: Solr streaming expression - options for Full Outer Join

2018-02-18 Thread Joel Bernstein
If you aren't getting the join functionality you want with the current join
implementations you could try the reduce function using the group
operation.

Here is the sample syntax:

reduce(search(collection1, q=*:*, fl="id,a_s,a_i,a_f", sort="a_s asc, a_f asc"),
   by="a_s",
   group(sort="a_f desc", n="4")
)


This is a basic map/reduce grouping operation.


Joel Bernstein
http://joelsolr.blogspot.com/

On Sun, Feb 18, 2018 at 6:24 PM, GaneshSe  wrote:

> Any help with the Solr streaming expression options is greatly appreciated.
> Please help.
>
>
>
> --
> Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html
>


RE: Index size increases disproportionately to size of added field when indexed=false

2018-02-18 Thread Howe, David

Hi Erick & Alessandro,

I have solved my problem by re-ordering the data in the SQL query. I don't
know why it works, but it does. I can consistently reproduce the problem
without changing anything else except the database table. As our Solr build is
scripted and we always build a new Solr server from scratch, I'm pretty
confident that the defaults haven't changed between test runs, since when we
create the Solr index, Solr doesn't know what order the data in the database
table is in.

I did try removing the geo location field to see if that made a difference, and 
it didn't.

Due to project commitments, I don't have any time to investigate this further 
at the moment.  When/if things quiet down I may see if I can reproduce the 
problem with a smaller number of records loaded from a flat file to make it 
easier to share a project that shows the problem occurring.

Thanks again for all of your assistance and suggestions.

Regards,

David

David Howe
Java Domain Architect
Postal Systems
Level 16, 111 Bourke Street Melbourne VIC 3000

T  0391067904

M  0424036591

E  david.h...@auspost.com.au

W  auspost.com.au
W  startrack.com.au

Australia Post is committed to providing our customers with excellent service. 
If we can assist you in any way please telephone 13 13 18 or visit our website.

The information contained in this email communication may be proprietary, 
confidential or legally professionally privileged. It is intended exclusively 
for the individual or entity to which it is addressed. You should only read, 
disclose, re-transmit, copy, distribute, act in reliance on or commercialise 
the information if you are authorised to do so. Australia Post does not 
represent, warrant or guarantee that the integrity of this email communication 
has been maintained nor that the communication is free of errors, virus or 
interference.

If you are not the addressee or intended recipient please notify us by replying 
direct to the sender and then destroy any electronic or paper copy of this 
message. Any views expressed in this email communication are taken to be those 
of the individual sender, except where the sender specifically attributes those 
views to Australia Post and is authorised to do so.

Please consider the environment before printing this email.


Re: coord in SolR 7

2018-02-18 Thread Ahmet Arslan
Hi Andreas,

Can weak AND (WAND) be used in your use case?

https://issues.apache.org/jira/browse/LUCENE-8135

Ahmet


On Monday, February 12, 2018, 1:44:38 PM GMT+3, Moll, Dr. Andreas wrote:

Hi,

I am trying to upgrade our Solr installation from Solr 5 to 7.
We use a customized similarity class that heavily depends on the coordination
factor to scale the similarity for OR-queries with multiple terms.
Since Solr 7 this feature has been removed. Is there any hook to implement
this in our own similarity class with Solr 7?

Best regards

Andreas Moll

Confidentiality notice
This message and any transmitted attachment contain confidential information
and are intended solely for the person or company to which they are actually
addressed. If you are not the intended recipient, please note that
distributing, copying (even in part) or using the received e-mail and the
information contained in it may be prohibited by law and may give rise to
liability for damages. If you have received this message due to a
transmission error, please notify the sender immediately.
Security warning: please note that the Internet is not a secure communication
medium. Although, as part of our quality management and due diligence, we
have taken steps to largely prevent computer virus infection, the nature of
the Internet means we cannot rule out the risk that this e-mail carries a
computer virus.


RE: Getting the error - The field '*********' does not support spatial filtering

2018-02-18 Thread Howe, David
Hi Aakanksha,

We use the following for geo queries which works for us:

/solr/core/select?defType=edismax&wt=json&q=*&fq=%7B!geofilt%7D&pt=-6.08165,145.8612430&d=10&sfield=geoLocation&sort=geodist()%20asc&rows=10&fl=*,score,distance:geodist()

This gives us the results closest to the provided point in order of their 
distance from the point.

Our field definition is:

  echo "$(date) Creating geoLocation field"
  curl -X POST -H 'Content-type:application/json' --data-binary '{
"add-field":{
   "name":"geoLocation",
   "type":"location",
   "stored":true,
   "indexed":true
}
  }' http://localhost:8983/solr/core/schema

We are running Solr 7.1.0.
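The sort=geodist()%20asc parameter in the query above orders results by great-circle distance from the pt point. Roughly, that is the haversine formula under a spherical-earth assumption; this is a sketch of the math, not Solr's exact implementation:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance in km between two lat/lon points, using the
    # haversine formula and a mean earth radius of 6371 km. This sketches
    # the kind of computation behind geodist(); Solr's implementation may
    # differ in details.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

# Distance from the pt in the query above to an indexed geoLocation value.
print(round(haversine_km(-6.08165, 145.8612430, -6.081689, 145.391881), 2))
```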

Hope this helps.

Regards,

David


From: Aakanksha Gupta [mailto:aakankshagupta2...@gmail.com]
Sent: Monday, 19 February 2018 12:27 AM
To: solr-user@lucene.apache.org
Subject: Getting the error - The field '*' does not support spatial 
filtering

Hi all,
I'm a newbie to Solr. I'm trying to use it for geospatial search and I'm facing
an issue while using it. I've tried the new 'location' field type as well as
the deprecated solr.LatLonType field type, but I always get the error:



org.apache.solr.common.SolrException: The field latlong does not support 
spatial filtering
Here's a snippet of my field definition in schema.xml in the conf folder of my 
core:

And here are the field type definitions:


Here's the Query I'm running:
http://localhost:8983/solr/geo2/select?wt=json&q=*:*&fq={!geofilt sfield=latlong}&pt=-6.08165,145.8612430&d=100

http://localhost:8983/solr/geo2/select/?q=*:*&fq={!geofilt}&sfield=latlong2&pt=-6.08165,145.8612430&d=100&wt=json
And here's the Java snippet I'm using to insert data:
String urlString = "http://localhost:8983/solr/geo2";
SolrClient solr = new HttpSolrClient.Builder(urlString).build();
SolrInputDocument document = new SolrInputDocument();
document.addField("id", UUID.randomUUID().toString());
document.addField("driverid", "1");
document.addField("latlong", "-6.081689,145.391881");
document.addField("time", "7:01:17");
document.addField("timestamp", Long.valueOf("1518908477190"));
document.addField("latlong2", "-6.081689,145.391881");
document.addField("location_0_coordinate", Double.valueOf(-6.081689));
document.addField("location_1_coordinate", Double.valueOf(145.391881));
UpdateResponse response = solr.add(document);
solr.commit();
response.getQTime();

I've attached my schema.xml file herewith. Can someone let me know what I'm 
doing wrong?



David Howe
Java Domain Architect
Postal Systems
Australia Post

Level 16, 111 Bourke Street Melbourne VIC 3000

T  0391067904

M 0424036591

E  david.h...@auspost.com.au


Re: Solr streaming expression - options for Full Outer Join

2018-02-18 Thread GaneshSe
Any help with the Solr streaming expression options is greatly appreciated.
Please help.



--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html


solr cloud unique key query request is sent to all shards!

2018-02-18 Thread Ganesh Sethuraman
Hi

I am using Solr 7.2.1. I have 8 shards in two nodes (two different m/c)
using Solr Cloud. The data was indexed with a unique key (default composite
id) using the CSV update handler (batch indexing). Note that I do NOT have
 while indexing. Then when I try to query the collection col1 based on my
primary key (as below), I see in the 'debug' response that the query was
sent to all the shards, and when it finds the document in one of the shards
it sends a GET FIELD to that shard to get the data. The problem is
potentially high response time and, more importantly, a scalability issue,
as all shards are unnecessarily queried to get one document (by unique key).

http://:8080/solr/col1/select?debug=true&q=id:69749278

Is there a way to query to reach the right shard based on the hash of the
unique key?

Regards
Ganesh
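On the question above: SolrCloud's compositeId router picks the shard from a hash of the unique key (MurmurHash3 mapped onto per-shard hash ranges), and real-time get uses that routing to hit a single shard. A toy sketch of the principle, with crc32-mod-N standing in for the real hash and range arithmetic:

```python
import zlib

def shard_for(doc_id, num_shards):
    # Hash the unique key and map it onto one of N shards. SolrCloud's
    # compositeId router does the real version of this with MurmurHash3
    # over 32-bit hash ranges; crc32 % N here only illustrates that the
    # same id deterministically lands on the same shard.
    return zlib.crc32(doc_id.encode("utf-8")) % num_shards

print(shard_for("69749278", 8))
```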


Re: nullpointer exception - while adding pdf document

2018-02-18 Thread Erick Erickson
That error is not from the PDF parsing, it's from using the _default
(schemaless) schema. As the startup message says, schemaless mode is
not recommended for production because it's impossible to guess right
every time.

So I'd start with a schema that does not do this; see the reference guide
sections on "schemaless mode" and "managed schema", and also the section on
"classic" schemas.

Best,
Erick

On Sat, Feb 17, 2018 at 2:13 PM, Chinmay Tijare
 wrote:
> Stack trace below. Please help. Has anybody been able to fix this before or seen it?
>
> Error from server at http://localhost:8983/solr/demo2:
> java.lang.NullPointerException
> at
> org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.mapValueClassesToFieldType(AddSchemaFieldsUpdateProcessorFactory.java:508)
> at
> org.apache.solr.update.processor.AddSchemaFieldsUpdateProcessorFactory$AddSchemaFieldsUpdateProcessor.processAdd(AddSchemaFieldsUpdateProcessorFactory.java:395)
> at
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
> at
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
> at
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
> at
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
> at
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
> at
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
> at
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
> at
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
> at
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
> at
> org.apache.solr.update.processor.FieldNameMutatingUpdateProcessorFactory$1.processAdd(FieldNameMutatingUpdateProcessorFactory.java:74)
> at
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
> at
> org.apache.solr.update.processor.FieldMutatingUpdateProcessor.processAdd(FieldMutatingUpdateProcessor.java:118)
> at
> org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:55)
> at
> org.apache.solr.update.processor.AbstractDefaultValueUpdateProcessorFactory$DefaultValueUpdateProcessor.processAdd(AbstractDefaultValueUpdateProcessorFactory.java:91)
> at
> org.apache.solr.handler.loader.JavabinLoader$1.update(JavabinLoader.java:98)
> at
> org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readOuterMostDocIterator(JavaBinUpdateRequestCodec.java:188)
> at
> org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readIterator(JavaBinUpdateRequestCodec.java:144)
> at
> org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:311)
> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:256)
> at
> org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec$1.readNamedList(JavaBinUpdateRequestCodec.java:130)
> at
> org.apache.solr.common.util.JavaBinCodec.readObject(JavaBinCodec.java:276)
> at org.apache.solr.common.util.JavaBinCodec.readVal(JavaBinCodec.java:256)
> at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:178)
> at
> org.apache.solr.client.solrj.request.JavaBinUpdateRequestCodec.unmarshal(JavaBinUpdateRequestCodec.java:195)
> at
> org.apache.solr.handler.loader.JavabinLoader.parseAndLoadDocs(JavabinLoader.java:108)
> at org.apache.solr.handler.loader.JavabinLoader.load(JavabinLoader.java:55)
> at
> org.apache.solr.handler.UpdateRequestHandler$1.load(UpdateRequestHandler.java:97)
> at
> org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:68)
> at
> org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:177)
> at org.apache.solr.core.SolrCore.execute(SolrCore.java:2484)
> at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:720)
> at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:526)
> at
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:382)
> at
> org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:326)
> at
> org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1751)
> at
> org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
> at
> org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
> at
> org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:548)
> at
> org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
> at
> org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)

Getting the error - The field '*********' does not support spatial filtering

2018-02-18 Thread Aakanksha Gupta
Hi all,
I'm a newbie to Solr. I'm trying to use it for geospatial search and I'm
facing an issue while using it. I've tried the new 'location' field type as
well as the deprecated solr.LatLonType field type, but I always get the
error:

org.apache.solr.common.SolrException: The field latlong does not
support spatial filtering

Here's a snippet of my field definition in schema.xml in the conf folder of
my core:

And here are the field type definitions:

Here's the Query I'm running:
http://localhost:8983/solr/geo2/select?wt=json&q=*:*&fq={!geofilt sfield=latlong}&pt=-6.08165,145.8612430&d=100

http://localhost:8983/solr/geo2/select/?q=*:*&fq={!geofilt}&sfield=latlong2&pt=-6.08165,145.8612430&d=100&wt=json

And here's the Java snippet I'm using to insert data:
String urlString = "http://localhost:8983/solr/geo2";
SolrClient solr = new HttpSolrClient.Builder(urlString).build();
SolrInputDocument document = new SolrInputDocument();
document.addField("id", UUID.randomUUID().toString());
document.addField("driverid", "1");
document.addField("latlong", "-6.081689,145.391881");
document.addField("time", "7:01:17");
document.addField("timestamp", Long.valueOf("1518908477190"));
document.addField("latlong2", "-6.081689,145.391881");
document.addField("location_0_coordinate", Double.valueOf(-6.081689));
document.addField("location_1_coordinate", Double.valueOf(145.391881));
UpdateResponse response = solr.add(document);
solr.commit();
response.getQTime();


I've attached my schema.xml file herewith. Can someone let me know what I'm
doing wrong?
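The archive has stripped parameter names out of the URLs above, so for clarity: the standard {!geofilt} filter takes sfield (the location field), pt ("lat,lon"), and d (radius in km). A sketch assembling them, with values taken from this message and placeholder host/core:

```python
from urllib.parse import urlencode

# Standard spatial-filter parameters for {!geofilt}. Values are the
# ones from the message above; host and core name are placeholders.
params = {
    "q": "*:*",
    "fq": "{!geofilt}",
    "sfield": "latlong",
    "pt": "-6.08165,145.8612430",
    "d": "100",
    "wt": "json",
}
query_string = urlencode(params)
print("http://localhost:8983/solr/geo2/select?" + query_string)
```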


Custom Solr Sorting + Query

2018-02-18 Thread ~$alpha`
I want to write a custom Solr query parser and custom sorting. When I call
http://127.0.0.1:8983/solr/techproducts/select?q={!myparsername}12&sort=uid desc,
some of the sections marked below are not getting called.

Issue: the documentation for Solr customization is pretty poor.

Help: can you please tell me why my code is not getting called?

Why I need customization: I need to sort based on logged-in user info like
age, location, and 10 other parameters. I also need the sorting value for
tagging. The default Solr expression is very costly.

== class-1 ==

public class MyPlugin extends QParserPlugin {

    @Override
    public QParser createParser(String qstr, SolrParams localParams,
            SolrParams params, SolrQueryRequest req) {
        return new CustomQParser(qstr, localParams, params, req);
    }

    private static class CustomQParser extends QParser {

        private Query innerquery;

        public CustomQParser(String qstr, SolrParams localParams,
                SolrParams params, SolrQueryRequest req) {
            // --> getting called ...(1)
            super(qstr, localParams, params, req);
            try {
                QParser parser = getParser(qstr, getReq());
                this.innerquery = parser.parse();
            } catch (SyntaxError ex) {
                throw new RuntimeException("error parsing query", ex);
            }
        }

        @Override
        public Query parse() throws SyntaxError {
            // --> getting called ...(2)
            return new MatchTheQuery(innerquery);
        }
    }
}


== class-2 ==

public class MatchTheQuery extends CustomScoreQuery {

    private Query subquery;

    public MatchTheQuery(Query subquery) {
        // --> getting called ...(3)
        super(subquery);
        this.subquery = subquery;
    }

    @Override
    protected CustomScoreProvider getCustomScoreProvider(LeafReaderContext context) {
        // not getting called :-(
        return new MyScoreProvider(subquery, context);
    }
}

== class-3 ==

public class MyScoreProvider extends CustomScoreProvider {

    private Query subquery;

    public MyScoreProvider(Query subquery, LeafReaderContext context) {
        super(context);
        this.subquery = subquery;
    }

    @Override
    public float customScore(int doc, float subqueryScore, float valSrcScores[])
            throws IOException {
        // not getting called :-(
        return subqueryScore; // placeholder so the snippet compiles
    }
}



--
Sent from: http://lucene.472066.n3.nabble.com/Solr-User-f472068.html


Re: Solr running on Tomcat

2018-02-18 Thread Shawn Heisey

On 2/17/2018 9:27 PM, GVK Prasad wrote:

My assumption was that hosting on Jetty may not work for production and
higher-performance systems' needs, and that we would have to go for servers
like Tomcat. If it is not supported in the latest version, I'm not sure how it
will help us. Thanks for the clarification.


It is probably possible to still run Solr in Tomcat.  I can't find 
anything outside of tests and the embedded server that explicitly uses 
Jetty classes -- so the application should work properly in most servlet 
containers, as long as there aren't dependency conflicts with Solr's 
large list of dependencies, which are a problem for some containers.


Because official support has been removed for all containers other than 
Jetty, it is possible that at some point in the future Solr *will* 
explicitly use libraries from Jetty, and if that happens, then running 
in any other container will not work.  If whatever feature is being 
developed can be done with container-agnostic code, that will be preferred.


The Jetty install that ships with Solr has been adjusted a little bit 
for Solr's needs.  It is likely that a Tomcat install would require some 
config changes if it needs to scale very much.


===

A little company you might have heard of (Google) needed a servlet 
container for something they were working on, currently known as Google 
App Engine.


Initially, they were running Tomcat.  But after a while (no idea how 
long) they switched to Jetty.


https://www.infoq.com/news/2009/08/google-chose-jetty

If there was any concern about Jetty's performance compared to the 800 
pound gorilla that Tomcat is in its space, Google wouldn't be using it.  
They care a great deal about things like memory usage and modularity, 
but they wouldn't switch if they couldn't get similar (or better) 
performance.


Jetty is used by a LOT of other programs.  Here's a few of them:

https://www.eclipse.org/jetty/powered/powered.html

(Just noticed that Solr is NOT on that page!  I will look into that!)

Thanks,
Shawn