Re: Failed to connect to server

2020-01-17 Thread rhys J
On Fri, Jan 17, 2020 at 12:10 PM David Hastings < hastings.recurs...@gmail.com> wrote: > something like this in your solr config: a suggester named "autosuggest" with exactMatchFirst=false, field=text, threshold=0.005, a DocumentDictionaryFactory over title with weightField=weight, and buildOnOptimize=true > > I checked both

Re: Failed to connect to server

2020-01-17 Thread rhys J
On Thu, Jan 16, 2020 at 3:48 PM David Hastings wrote: > > 'Error: Solr core is loading' > > do you have any suggesters or anything configured that would get rebuilt? > > > I don't think so? But I'm not quite sure what you are asking? > Rhys

Re: Error while updating: java.lang.NumberFormatException: empty String

2020-01-16 Thread rhys J
On Thu, Jan 16, 2020 at 3:10 PM Edward Ribeiro wrote: > Hi, > > There is a status_code in the JSON snippet and it is going as a string with > single space. Maybe it is an integer? > > Best, > Edward > > Oh wow, yes you are right. When I adjusted the status_code to not be a space, it fixed
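Edward's diagnosis can be sketched in a few lines: if status_code is declared as an integer field in the schema, sending a single-space string is what triggers the NumberFormatException. The document shape below is hypothetical, modeled on the ids used elsewhere in these threads.

```python
import json

# status_code is assumed to be an integer field in the Solr schema,
# so a bare " " string fails to parse; a real integer goes through.
bad_doc = {"id": "601000", "status_code": " "}   # would be rejected by Solr
good_doc = {"id": "601000", "status_code": 0}    # parses cleanly

payload = json.dumps([good_doc])
print(payload)
```

The same rule applies to float fields: send a JSON number (or omit the field), not a quoted placeholder.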

Re: Failed to connect to server

2020-01-16 Thread rhys J
On Thu, Jan 16, 2020 at 3:27 PM Edward Ribeiro wrote: > A regular update is a delete followed by an indexing of the document. So > technically both are indexes. :) If there's an atomic update ( > https://lucene.apache.org/solr/guide/8_4/updating-parts-of-documents.html > ), Solr would throw some

Error while updating: java.lang.NumberFormatException: empty String

2020-01-16 Thread rhys J
While updating my Solr core, I ran into a problem with this curl statement. When I looked up the error, the only reference I could find was that maybe a float was being added as null. So I changed all the float fields from 'null' to '0.00'. But I still get the error. Float fields as per the

Failed to connect to server

2020-01-16 Thread rhys J
I have noticed that if I am using curl to index a csv file *and* using curl thru a script to update the Solr cores, that I get the following error: curl: (7) Failed to connect to 10.40.10.14 port 8983: Connection refused Can I only index *or* update, but not do both? I am not running shards or

Re: Trouble adding a csv file - invalid date string error

2020-01-14 Thread rhys J
I went ahead and adjusted the time_stamp field to be UTC, and that took care of the problem. On Tue, Jan 14, 2020 at 10:24 AM rhys J wrote: > I am trying to add a csv file while indexing a core. > > curl command: > > sudo -u solr curl ' > http://localhost:8983/solr/dbtraddr
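The fix described above, sketched in Python: Solr's date fields expect a full ISO-8601 UTC timestamp ending in `Z`; a bare local timestamp produces the "Invalid date string" error. The input value here is a made-up example of a database-style timestamp.

```python
from datetime import datetime, timezone

# Hypothetical database-style value; Solr wants 2020-01-14T10:24:00Z form.
time_stamp = "2020-01-14 10:24:00"
dt = datetime.strptime(time_stamp, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
solr_date = dt.strftime("%Y-%m-%dT%H:%M:%SZ")
print(solr_date)  # 2020-01-14T10:24:00Z
```

This assumes the source timestamps are already in UTC; values in a local zone would need an actual conversion before formatting.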

Re: Increase Physical Memory in Solr

2020-01-14 Thread rhys J
On Mon, Jan 13, 2020 at 3:42 PM Terry Steichen wrote: > Maybe solr isn't using enough of your available memory (a rough check is > produced by 'solr status'). Do you realize you can start solr with a > '-m xx' parameter? (for me, xx = 1g) > > Terry > > I changed the java_mem field in solr.in.sh,

Trouble adding a csv file - invalid date string error

2020-01-14 Thread rhys J
I am trying to add a csv file while indexing a core. curl command: sudo -u solr curl ' http://localhost:8983/solr/dbtraddr/update/csv?commit=true&escape=\&separator=%7C&stream.file=/tmp/csv/dbtrphon_0.csv ' The header of the csv file:

Re: Increase Physical Memory in Solr

2020-01-13 Thread rhys J
On Mon, Jan 13, 2020 at 3:11 PM Gael Jourdan-Weil < gael.jourdan-w...@kelkoogroup.com> wrote: > Hello, > > If you are talking about "physical memory" as the bar displayed in Solr > UI, that is the actual RAM your host have. > If you need more, you need more RAM, it's not related to Solr. > >

Increase Physical Memory in Solr

2020-01-13 Thread rhys J
I am trying to figure out how to increase the physical memory in Solr. I see how to increase the JVM size, and I've done that. But my physical memory is 97% out of 7.79G of physical memory, and I'm trying to index a lot more documents as I move this live. Is there any documentation that I've

Re: updating documents via csv

2019-12-17 Thread rhys J
On Mon, Dec 16, 2019 at 11:58 PM Paras Lehana wrote: > Hi Rhys, > > I use CDATA for XMLs: > > > > > There should be a similar solution for JSON though I couldn't find the > specific one on the internet. If you are okay to use XMLs for indexing, you > can use this. > > We are set on using

updating documents via csv

2019-12-16 Thread rhys J
Is there a way to update documents already stored in the solr cores via csv? The reason I am asking is because I am running into a problem with updating via script with single quotes embedded into the field itself. Example: curl http://localhost:8983/solr/dbtr/update?commit=true -d '[{ "id":
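The single-quote problem described above comes from building the `curl -d '...'` payload by string concatenation. One way around it, sketched here with a hypothetical field name, is to serialize the document with a JSON library and POST the result from a file, so shell quoting never sees the field values.

```python
import json

# A value containing a single quote would break a shell-quoted -d '...'
# payload; json.dumps handles it without any manual escaping.
doc = {"id": "601000", "comment": {"set": "debtor's attorney called"}}
payload = json.dumps([doc])
print(payload)  # safe to write to a file and send with curl --data-binary @file
```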

Re: backing up and restoring

2019-12-16 Thread rhys J
On Mon, Dec 16, 2019 at 1:42 AM Paras Lehana wrote: > Looks like a write lock. Did reloading the core fix that? I guess it would > have been fixed by now. I guess you had run the delete query few moments > after restoring, no? > > Restoring setting the name parameter only worked the once. This

Re: unable to update using empty strings or 'null' in value

2019-12-16 Thread rhys J
On Mon, Dec 16, 2019 at 2:51 AM Paras Lehana wrote: > Hey Rhys, > > > Short Answer: Try using "set": null and not "set": "null". > > Thank you, this worked! Rhys
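Paras's fix is easy to see side by side: a JSON null (Python `None`) clears the field in an atomic update, while the string `"null"` stores the literal four-letter word. The id and field name are taken from the question below.

```python
import json

# {"set": null} removes the field's value; {"set": "null"} stores "null".
clear = json.dumps([{"id": "601000", "agent": {"set": None}}])
literal = json.dumps([{"id": "601000", "agent": {"set": "null"}}])
print(clear)
print(literal)
```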

unable to update using empty strings or 'null' in value

2019-12-13 Thread rhys J
When I do the following update: curl http://localhost:8983/solr/dbtr/update?commit=true -d '[{ "id": "601000", "agent": {"set": "null"},"assign_id": {"set": "320"},"client_group": {"set": "null"},"credit_class": {"set": "null"},"credit_hold": {"set": "null"},"credit_hold_date": {"set":

Re: Updates via curl and json not showing in api

2019-12-13 Thread rhys J
On Fri, Dec 13, 2019 at 11:51 AM Shawn Heisey wrote: > > Is there a step I'm missing? > > It appears that you have not executed a commit that opens a new searcher. > > Thanks for explaining this. I turned on commit=true, and everything works as expected. Thanks again, Rhys
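Shawn's point, sketched: an update without a commit is durable but invisible to searches until a new searcher opens. Appending `commit=true` to the update URL (or configuring autoSoftCommit in solrconfig.xml) is what makes the change show up. The host and core below are stand-ins.

```python
from urllib.parse import urlencode

# Without commit=true (or an autoSoftCommit), /get sees the update
# but /select does not, which matches the symptom in this thread.
base = "http://localhost:8983/solr/debt/update"
url = f"{base}?{urlencode({'commit': 'true'})}"
print(url)  # http://localhost:8983/solr/debt/update?commit=true
```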

Updates via curl and json not showing in api

2019-12-13 Thread rhys J
When I do the following update: curl http://localhost:8983/solr/debt/update -d '[ {"id": "393291-18625", "orig_int_amt":{ "set" : "2.5"}, }]' and then: curl http://localhost:8983/solr/debt/get?id=393291-18625 I see the document is updated via the command line. It shows the following: {

Re: backing up and restoring

2019-12-12 Thread rhys J
I was able to successfully restore a backup by specifying name and location in the restore command. But now when I try to run: sudo -u solr curl http://localhost:8983/solr/debt/update -H "Content-type: text/xml" --data-binary '<delete><query>*:*</query></delete>' I get the following error: no segments* file found in

Re: backing up and restoring

2019-12-12 Thread rhys J
This page seems to indicate that I should copy the files from the backup directory back into the index? Is this accurate? https://codice.atlassian.net/wiki/spaces/DDF22/pages/2785407/Solr+Standalone+Server+Backup Thanks, Rhys

Re: backing up and restoring

2019-12-12 Thread rhys J
On Thu, Dec 12, 2019 at 3:49 AM sudhir kumar wrote: > once you backup index with some location, you have to specify the same > location to restore. > > ie in your case /tmp/solr is the location indexed was backed up , use same > location for restore. > > you did not provide name so latest

Re: user solr created by install not working with default password

2019-12-11 Thread rhys J
> That page talks about setting up authentication for HTTP access to the > Solr API. It has nothing at all to do with the OS user created by the > service install script. > > When the service install creates the OS user for the service, it is > created in such a way that its password is disabled.

backing up and restoring

2019-12-11 Thread rhys J
I made backups with the following command: sudo -u solr curl ' http://localhost:8983/solr/debt/replication?command=backup&location=/tmp/solrbackups/debt/' I double checked that I had made the backup, and I had a backup. To

user solr created by install not working with default password

2019-12-11 Thread rhys J
I installed Solr following the directions on this site: https://lucene.apache.org/solr/guide/6_6/installing-solr.html I am running standalone Solr with no authentication added because it is all in-house with no access to outside requests. When I try to su solr, using the password mentioned

Re: Search returning unexpected matches at the top

2019-12-10 Thread rhys J
28 = n, number of documents containing term
2894478 = N, total number of documents with field
0.4322195 = tf, computed as freq / (freq + k1 * (1 - b + b * dl / avgdl)) from:
1.0 = freq, occurrences of term within document
1.2 = k1, term saturation parameter
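The tf term from that explain output can be recomputed directly from the formula it shows. The snippet does not include b, dl, or avgdl, so the values below are assumptions for illustration, which is why the result differs from the 0.4322195 in the explain.

```python
# BM25 term-frequency saturation, per the explain formula above.
freq, k1, b = 1.0, 1.2, 0.75        # freq and k1 from the explain; b assumed
dl, avgdl = 10.0, 10.0              # assumed: doc length equals the average
tf = freq / (freq + k1 * (1 - b + b * dl / avgdl))
print(round(tf, 4))  # 0.4545 under these assumptions
```

With dl = avgdl the length-normalization factor collapses to 1, so tf is simply 1 / (1 + k1); shorter-than-average documents push tf higher.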

Re: Search returning unexpected matches at the top

2019-12-10 Thread rhys J
On Tue, Dec 10, 2019 at 12:35 AM Paras Lehana wrote: > That's great. > > But I also wanted to know why the concerned document was scored lower in > the original query. Anyways, glad that the issue is resolved. :) > > That I need to look into. If I find an answer, I will let you know. Thanks,

Re: Search returning unexpected matches at the top

2019-12-09 Thread rhys J
On Mon, Dec 9, 2019 at 12:06 AM Paras Lehana wrote: > Hi Rhys, > > Use Solr Query Debugger > < > https://chrome.google.com/webstore/detail/solr-query-debugger/gmpkeiamnmccifccnbfljffkcnacmmdl?hl=en > > > Chrome > Extension to see what's making up the score for both of them. I guess > fieldNorm

Re: Search returning unexpected matches at the top

2019-12-06 Thread rhys J
On Fri, Dec 6, 2019 at 11:21 AM David Hastings wrote: > whats the field type for: > clt_ref_no > It is a text_general field because it can have numbers or alphanumeric characters. *_no isnt a default dynamic character, and owl-2924-8 usually translates > into > owl 2924 8 > > So it's matching

Search returning unexpected matches at the top

2019-12-06 Thread rhys J
I have a search box that is just searching every possible core, and every possible field. When I enter 'owl-2924-8', I expect the clt_ref_no of OWL-2924-8 to float to the top, however it is the third result in my list. Here is the code from the search: on_data({ "responseHeader":{

Re: Using an & in an indexed field and then querying for it.

2019-11-25 Thread rhys J
On Mon, Nov 25, 2019 at 2:36 PM David Hastings wrote: > its breaking on the & because its in the url and you are most likely > sending a get request to solr. you should send it as post or as %26 > > The package I am using doesn't have a postJSON function available, so I'm using their getJSON
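David's explanation in miniature: in a GET URL a raw `&` terminates the query parameter, so a value like "Reliable Van & Storage" has to be percent-encoded (`&` becomes `%26`) before it goes on the URL, or the whole request sent as a POST body.

```python
from urllib.parse import quote_plus

# Percent-encode the query value so & survives as data, not a delimiter.
q = 'name:"Reliable Van & Storage"'
encoded = quote_plus(q)
print(encoded)
```

Most HTTP client libraries do this automatically when given parameters as a dict rather than a hand-built URL string, which is the simpler fix.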

Using an & in an indexed field and then querying for it.

2019-11-25 Thread rhys J
I have some fields that have text like so: Reliable Van & Storage. They indexed fine when I used curl and csv files to read them into the core. Now when I try to query for them, I get errors. If I try escaping it like so \&, I get the following error: on_data({ "responseHeader":{

Re: How to tell which core was used based on Json or XML response from Solr

2019-11-25 Thread rhys J
On Mon, Nov 25, 2019 at 10:43 AM David Hastings < hastings.recurs...@gmail.com> wrote: > you missed the part about adding &echoParams=all to the query: > &echoParams=all&core=mega > > returns for me: > > "responseHeader":{ > "status":0, > "QTime":0, > "params":{ > "q":"*:*", > "core":"mega", >

Re: How to tell which core was used based on Json or XML response from Solr

2019-11-25 Thread rhys J
On Mon, Nov 25, 2019 at 1:10 AM Paras Lehana wrote: > Hey rhys, > > What David suggested is what we do for querying Solr. You can figure out > our frontend implementation of Auto-Suggest by seeing the AJAX requests > fired when you type in the search box on www.indiamart.com. > That is pretty

Re: How to tell which core was used based on Json or XML response from Solr

2019-11-25 Thread rhys J
> if you are taking the PHP route for the mentioned server part then I would > suggest > using a client library, not plain curl. There is solarium, for instance: > > https://solarium.readthedocs.io/en/stable/ > https://github.com/solariumphp/solarium > > It can use curl under the hood but you can

Re: How to tell which core was used based on Json or XML response from Solr

2019-11-25 Thread rhys J
On Mon, Nov 25, 2019 at 2:10 AM Erik Hatcher wrote: > add &echoParams=all and the parameter will be in the response > header. > > Erik > Thanks. I just tried this, and all I got was this response: http://localhost:8983/solr/dbtr/select?q=debtor_id%3A%20393291&echoParams=all

Re: How to tell which core was used based on Json or XML response from Solr

2019-11-22 Thread rhys J
On Fri, Nov 22, 2019 at 1:39 PM David Hastings wrote: > 2 things (maybe 3): > 1. dont have this code facing a client thats not you, otherwise anyone > could view the source and see where the solr server is, which means they > can destroy your index or anything they want. put at the very least

How to tell which core was used based on Json or XML response from Solr

2019-11-22 Thread rhys J
I'm implementing an autocomplete search box for Solr. I'm using JSON as my response style, and this is the jquery code. var url='http://10.40.10.14:8983/solr/'+core+'/select/?q='+queryField + query+'=2.2=true=0=50=on=json=?=on_data'; jQuery_3_4_1.getJSON(url); ___ on_data(data) { var

Re: Highlighting on typing in search box

2019-11-21 Thread rhys J
r of Solr. > > For the visualization part: Angular has a suggestion box that can ingest > the results from Solr. > > > Am 21.11.2019 um 16:42 schrieb rhys J : > > > > Are there any recommended APIs or code examples of using Solr and then > > highlighting results below th

Highlighting on typing in search box

2019-11-21 Thread rhys J
Are there any recommended APIs or code examples of using Solr and then highlighting results below the search box? I'm trying to implement a search box that will search solr as the user types, if that makes sense? Thanks, Rhys

Re: exact matches on a join

2019-11-21 Thread rhys J
On Thu, Nov 21, 2019 at 8:04 AM Jason Gerlowski wrote: > Are these fields "string" or "text" fields? > > Text fields receive analysis that splits them into a series of terms. > That's why the query "Freeman" matches the document "A-1 Freeman". > "A-1 Freeman" gets split up into multiple terms,

exact matches on a join

2019-11-19 Thread rhys J
I am trying to do a join, which I have working properly on 2 cores. One core has report_as, and the other core has debt_id. If I enter 'report_as: "Freeman", I expect to get 272 results. But I get 557. When I do a database search on the matched fields, it shows me that report_as: "Freeman" is
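For reference, Solr's cross-core join is expressed with the `{!join}` local-params syntax. The sketch below uses the field names from this message; the core name in `fromIndex` is a guess at this setup, and the exact-match question in the thread (557 vs 272 hits) hinges on report_as being a string field rather than tokenized text.

```python
from urllib.parse import urlencode

# Join from the core holding debt_id onto the core being queried,
# then filter on report_as. fromIndex value is hypothetical.
q = '{!join from=debt_id to=debtor_id fromIndex=debt}report_as:"Freeman"'
params = urlencode({"q": q, "rows": 0})
print(params)
```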

Attempting to do a join with 3 cores

2019-11-18 Thread rhys J
I was hoping to be able to do a join with 3 cores. I found this page that seemed to indicate it's possible? https://stackoverflow.com/questions/52380302/solr-how-to-join-three-cores Here's my query: http://localhost:8983/solr/dbtrphon/select?indent=on&rows=1000&sort=score desc, id desc&q=(phone:*Meredith*

Re: using scoring to find exact matches while using a cursormark

2019-11-18 Thread rhys J
> ...so w/o a score param you're getting the default sort: score "desc" > (descending)... > > > https://lucene.apache.org/solr/guide/8_3/common-query-parameters.html#CommonQueryParameters-ThesortParameter > > "If the sort parameter is omitted, sorting is performed as though > the >

using scoring to find exact matches while using a cursormark

2019-11-18 Thread rhys J
I am trying to use scoring to get the expected results at the top of the stack when doing a Solr query. I am looking up clt_ref_no: OWL-2924-8^2 OR contract_number: OWL-2924-8^2 If I use the following query in the browser, I get the expected results at the top of the returned values from Solr.

Re: attempting to get an exact match on a textField

2019-11-16 Thread rhys J
I figured it out. It was a combination of problems. 1. Not fully indexing the data; that made the result set smaller than expected. 2. Using the join statement without adding a field at the end of it to search the other core on. On Fri, Nov 15, 2019 at 1:39 PM rhys J wrote: > >

attempting to get an exact match on a textField

2019-11-15 Thread rhys J
I am trying to use the API to get an exact match on clt_ref_no. At one point, I was using ""s to enclose the text such as: clt_ref_no: "OWL-2924-8", and I was getting 5 results. Which is accurate. Now when I use it, I only get one match. If I try to build the url in perl, and then post the

using NOT or - to exclude results with a textField type

2019-11-15 Thread rhys J
I'm trying to exclude results based on the documentation about the boolean NOT symbol, but I keep getting errors. I've tried: http://localhost:8983/solr/debt/select?q=clt_ref_no:-”owl-2924-8” and http://localhost:8983/solr/debt/select?q=clt_ref_no:NOT”owl-2924-8” I have tried with and without
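Two exclusion forms that Solr's parser accepts, for comparison: a leading `-` on the clause, or `NOT` with a space before the quoted term. Both require straight ASCII quotes; the curly `”` characters visible in the URLs above are themselves enough to cause parse errors.

```python
from urllib.parse import urlencode

# Either form excludes the matching documents; note straight quotes.
exclude_prefix = '-clt_ref_no:"owl-2924-8"'
exclude_not = '*:* NOT clt_ref_no:"owl-2924-8"'
print(urlencode({"q": exclude_prefix}))
print(urlencode({"q": exclude_not}))
```

A purely negative query works at the top level because Solr rewrites it against `*:*` internally, but inside larger boolean expressions the explicit `*:* NOT ...` form is safer.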

Re: using gt and lt in a query

2019-11-14 Thread rhys J
On Thu, Nov 14, 2019 at 1:28 PM Erick Erickson wrote: > You might be able to make this work with function queries…. > > > I managed to decipher something along the lines of this: http://10.40.10.14:8983/solr/debt/select?q=orig_princ_amt: 0 TO

Re: using gt and lt in a query

2019-11-14 Thread rhys J
> Range queries are done with brackets and/or braces. A square bracket > indicates that the range should include the precise value mentioned, and > a curly brace indicates that the range should exclude the precise value > mentioned. > > >
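The bracket/brace rule above as concrete query strings, plus a sketch of the field-to-field comparison from the original SQL (orig_princ_amt > princ_paid), which plain range syntax cannot express; a function range query is one option, as Erick's reply in this thread suggests.

```python
# Square bracket includes the endpoint, curly brace excludes it; mixing is legal.
inclusive = "orig_princ_amt:[0 TO *]"   # matches 0 and above
exclusive = "orig_princ_amt:{0 TO *]"   # strictly greater than 0
# Field-vs-field comparison via a function range query (sketch):
frange = "{!frange l=0 incl=false}sub(orig_princ_amt,princ_paid)"
print(inclusive)
print(exclusive)
print(frange)
```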

using gt and lt in a query

2019-11-14 Thread rhys J
I am trying to duplicate this line from a db query: (debt.orig_princ_amt > 0 AND debt.princ_paid > 0 AND debt.orig_princ_amt > debt.princ_paid) I have the following, but it returns no results: http://localhost:8983/solr/debt/select?q=orig_princ_amt

Re: Query More Than One Core

2019-11-13 Thread rhys J
On Wed, Nov 13, 2019 at 3:16 PM Jörn Franke wrote: > You can use nested indexing and Index both types of documents in one core. > > https://lucene.apache.org/solr/guide/8_1/indexing-nested-documents.html I had read that, but it doesn't really fit our needs right now. I figured out how to do a

Query More Than One Core

2019-11-13 Thread rhys J
I have more than one core. Each core represents one database table. They are coordinated by debt_id/debtor_id, so we can do join statements on them with Sybase/SQL. Is there a way to query more than one core at a time, or do I need to do separate queries per core, and then somehow with perl

Re: date fields and invalid date string errors

2019-11-13 Thread rhys J
> You could do it that way ... but instead, I'd create a new fieldType, > not change an existing one. The existing name is "pdate" which implies > "point date". I would probably go with "daterange" or "rdate" as the > name, but that is completely up to you. > > I did that, deleted docs, stopped,

Re: date fields and invalid date string errors

2019-11-13 Thread rhys J
> If you use DateRangeField instead of DatePointField for your field's > class, then you can indeed use partial timestamps for both indexing and > querying. This only works with DateRangeField. > > I don't see that as an option in the API? Do I need to change what pdate's type is in the

date fields and invalid date string errors

2019-11-13 Thread rhys J
I have date fields in my documents that are just YYYY-MM-DD. I set them as a pdate field in the schema as such: and When I use the API to do a search and try: 2018-01-01 [2018-01-01 TO NOW] I get 'Invalid Date String'. Did I type my data wrong in the schema? Is there something I'm

Re: different results in numFound vs using the cursor

2019-11-12 Thread rhys J
> : I am going to adjust my schema, re-index, and try again. See if that > : doesn't fix this problem. I didn't know that having the uniqueKey be a > : textField was a bad idea. > > > https://lucene.apache.org/solr/guide/8_3/other-schema-elements.html#OtherSchemaElements-UniqueKey > > "The

Re: different results in numFound vs using the cursor

2019-11-12 Thread rhys J
On Tue, Nov 12, 2019 at 12:18 PM Chris Hostetter wrote: > > : > a) What is the fieldType of the uniqueKey field in use? > : > > : > : It is a textField > > whoa... that's not normal .. what *exactly* does the fieldType declaration > (with all analyzers) look like, and what does the declaration

Re: using fq means no results

2019-11-12 Thread rhys J
On Tue, Nov 12, 2019 at 11:57 AM Erik Hatcher wrote: > fq is a filter query, and thus narrows the result set provided by the q > down to what also matches all specified fq's. > > So this can be used instead of scoring? Or alongside scoring? > You gave it a query, "cat_ref_no", which literally

using fq means no results

2019-11-12 Thread rhys J
If I do this query in the browser: http://10.40.10.14:8983/solr/debt/select?q=(clt_ref_no:+owl-2924-8)^=1.0+clt_ref_no:owl-2924-8 I get 84662 results. If I do this query: http://10.40.10.14:8983/solr/debt/select?q=(clt_ref_no:+owl-2924-8)^=1.0+clt_ref_no:owl-2924-8&fq=clt_ref_no I get 0 results.

Re: different results in numFound vs using the cursor

2019-11-12 Thread rhys J
On Mon, Nov 11, 2019 at 8:32 PM Chris Hostetter wrote: > > Based on the info provided, it's hard to be certain, but reading between > the lines here are the assumptions I'm making... > > 1) your core name is "dbtr" > 2) the uniqueId field for the "dbtr" core is "debtor_id" > > ..are those

different results in numFound vs using the cursor

2019-11-11 Thread rhys J
I am using this logic in perl: my $decoded = decode_json( $solrResponse->{_content} ); my $numFound = $decoded->{response}{numFound}; $cursor = "*"; $prevCursor = ''; while ( $prevCursor ne $cursor ) { my $solrURI = "\"http://[SOLR URL]:8983/solr/"; $solrURI .= $fdat{core}; $solrSort = (
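The Perl loop above can be sketched in Python: page with cursorMark until the cursor stops changing. Note the related thread's requirement that the sort include the uniqueKey as a tiebreaker. The `fetch_page` function is a stand-in for the real HTTP call and serves canned responses here so the control flow is runnable.

```python
# Mock of Solr cursorMark paging: "*" starts, and a repeated
# nextCursorMark signals the end of the result set.
def fetch_page(cursor, pages={"*": ("AAA", 2), "AAA": ("AAA", 0)}):
    next_cursor, ndocs = pages[cursor]
    return {"nextCursorMark": next_cursor, "response": {"docs": [{}] * ndocs}}

cursor, prev = "*", None
total = 0
while cursor != prev:
    resp = fetch_page(cursor)
    total += len(resp["response"]["docs"])
    prev, cursor = cursor, resp["nextCursorMark"]
print(total)  # 2
```

If documents are added or deleted mid-walk, the count collected this way can legitimately differ from the numFound reported on the first page, which is one explanation for the mismatch in this thread's subject.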

Using solr API to return csv results

2019-11-07 Thread rhys J
If I am using the Solr API to query the core, is there a way to tell how many documents are found if i use wt=CSV? Thanks, Rhys
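A CSV response carries no numFound, so there are two practical options: count the data rows in the CSV body (lines minus the header), or issue a second request with rows=0 and wt=json and read response.numFound. The first option, on a canned body:

```python
# Count documents in a wt=csv response: total lines minus the header row.
csv_body = "id,name\n1,Art\n2,Amy\n"
num_docs = len(csv_body.strip().splitlines()) - 1
print(num_docs)  # 2
```

The rows=0 JSON request is cheaper when only the count is needed, since Solr returns no document bodies at all.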

Re: creating a core with a custom managed-schema

2019-11-04 Thread rhys J
On Mon, Nov 4, 2019 at 1:36 PM Erick Erickson wrote: > Well, just what it says. -schema isn’t a recognized parameter, where did > you get it? Did you try bin/solr create -help and follow the instructions > there? > > I am confused. This page:

creating a core with a custom managed-schema

2019-11-04 Thread rhys J
I have created a tmp directory where I want to have reside custom managed-schemas to use when creating cores. /tmp/solr_schema/CORENAME/managed-schema Based on this page: https://lucene.apache.org/solr/guide/7_0/coreadmin-api.html#coreadmin-create , I am running the following command: sudo -u

Re: Parts of the Json response to a curl query are arrays, and parts are hashes

2019-10-28 Thread rhys J
I forgot to include the fields created through the API: Thanks, Rhys On Mon, Oct 28, 2019 at 11:30 AM rhys J wrote: > > >> Did you reload the core/collection or restart Solr so the new schema >> would take effect? If it's SolrClo

Re: Parts of the Json response to a curl query are arrays, and parts are hashes

2019-10-28 Thread rhys J
> Did you reload the core/collection or restart Solr so the new schema > would take effect? If it's SolrCloud, did you upload the changes to > zookeeper and then reload the collection? SolrCloud does not use config > files on disk. > So I have not done this part yet, but I noticed some things in

Re: Parts of the Json response to a curl query are arrays, and parts are hashes

2019-10-25 Thread rhys J
> > > > "dl2":["Great Plains"], > > "do_not_call":false, > > There are no hashes inside the document. If there were, they would be > surrounded by {} characters. The whole document is a hash, which is why > it has {} characters. Referring to the snippet that I included above,

Parts of the Json response to a curl query are arrays, and parts are hashes

2019-10-25 Thread rhys J
Is there some reason that text_general fields are returned as arrays, and other fields are returned as hashes in the json response from a curl query? Here's my curl query: curl "http://10.40.10.14:8983/solr/dbtr/select?indent=on&q=debtor_id:393291" Here's the response:

Re: using the df parameter to set a default to search all fields

2019-10-22 Thread rhys J
> Solr does not have a way to ask for all fields on a search. If you use > the edismax query parser, you can specify multiple fields with the qf > parameter, but there is nothing you can put in that parameter as a > shortcut for "all fields." Using qf with multiple fields is the > cleanest way
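Shawn's suggestion above can be sketched as a parameter build: switch to the edismax parser and enumerate every searchable field in qf, rather than hunting for an "all fields" df value. The field names below are hypothetical stand-ins drawn from other threads in this archive.

```python
from urllib.parse import urlencode

# edismax searches each field in qf; per-field boosts like clt_ref_no^2
# could be appended to individual entries if some fields should rank higher.
params = urlencode({
    "defType": "edismax",
    "q": "owl-2924-8",
    "qf": "clt_ref_no contract_number report_as name",
})
print(params)
```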

using the df parameter to set a default to search all fields

2019-10-22 Thread rhys J
How do I make Solr search on all fields in a document? I read the documentation about the df field, and added the following to my solrconfig.xml: explicit 10 _text_ in my managed-schema file I have the following: I have deleted the documents, and re-indexed the

Re: Importing a csv file encapsulated by " creates a large copyField field of all fields combined.

2019-10-21 Thread rhys J
Thank you, that worked perfectly. I can't believe I didn't notice the separator was a tab.
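The fix in this thread, sketched: the CSV update handler assumes a comma separator, so a tab-separated file needs the separator parameter set to an encoded tab on the update URL.

```python
from urllib.parse import urlencode

# A literal tab percent-encodes to %09; without this the whole row is
# parsed as one field, producing the giant combined value described above.
params = urlencode({"commit": "true", "separator": "\t"})
print(params)  # commit=true&separator=%09
```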

Re: Importing a csv file encapsulated by " creates a large copyField field of all fields combined.

2019-10-21 Thread rhys J
films/README.txt#L39 > Also reference documentation: > > https://lucene.apache.org/solr/guide/8_1/uploading-data-with-index-handlers.html > > Regards, > Alex. > > On Mon, 21 Oct 2019 at 13:04, rhys J wrote: > > > > I am trying to import a csv file to my solr core

Importing a csv file encapsulated by " creates a large copyField field of all fields combined.

2019-10-21 Thread rhys J
I am trying to import a csv file to my solr core. It looks like this: "user_id","name","email","client","classification","default_client","disabled","dm_password","manager" "A2M","Art Morse","amo...@morsemoving.com","Morse Moving","Morse","","X","blue0show","" "ABW","Amy