Re: Help for importing large data (approx. 8GB) from old solr version to new solr version
Hello Erick,

Thanks for your reply! You mean we should follow the steps below, right? The data directory path is: solr/solr-8.2.0/server/solr/product/item_core/data

STEPS:
1. Stop the old solr-8.2.0 server.
2. Copy the data directory from the old Solr version to the new one: copy solr/solr-8.2.1/server/solr/product/item_core/data to solr/solr-8.3.1/server/solr/product/item_core/data
3. Start the new Solr version, solr-8.3.1.

Is this the correct way to copy just the index from the old Solr version to the new one? Will any data be lost, or will anything break in the new Solr version?

Thanks in advance!
-Ken
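Step 2 above can be sketched as a small shell script. This is a minimal illustration only: throwaway temp directories stand in for the real Solr installs so it is safe to run anywhere, and the `write.lock` cleanup is a common precaution, not something from the thread. Substitute your actual solr-8.2.0 and solr-8.3.1 paths.

```shell
#!/bin/sh
# Sketch of step 2: copy the core's data directory from the old install
# to the new one. Temp dirs simulate the two installs for illustration.
set -e
BASE=$(mktemp -d)
OLD="$BASE/solr-8.2.0/server/solr/product/item_core/data"
NEW="$BASE/solr-8.3.1/server/solr/product/item_core/data"

# Stand-in for an existing index (a real one holds Lucene segment files).
mkdir -p "$OLD/index"
echo "segment" > "$OLD/index/_0.cfs"

# With the old server stopped: copy the whole data directory, then
# remove any stale write lock left behind by an unclean shutdown.
mkdir -p "$(dirname "$NEW")"
cp -r "$OLD" "$(dirname "$NEW")"
rm -f "$NEW/index/write.lock"

ls "$NEW/index"
```

The key point is that the entire `data` directory is copied as a unit while both servers are down, so the new version sees exactly the segments the old one wrote.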
Re: does copyFields increase index size?
On 12/24/2019 5:11 PM, Nicolas Paris wrote:
> Do you mean "copy fields" is only an action of changing the schema? I was
> thinking it was adding a new field and eventually a new index to the
> collection.

The copy that copyField does happens at index time. Reindexing is required after changing the schema, or nothing happens.

If you are redoing the indexing after changing the schema and reloading/restarting, then you can ignore me.

Thanks,
Shawn
Re: does copyFields increase index size?
> The action of changing the schema makes zero changes in the index. It
> merely changes how Solr interacts with the index.

Do you mean "copy fields" is only an action of changing the schema? I was thinking it was adding a new field and eventually a new index to the collection.

On Tue, Dec 24, 2019 at 10:59:03AM -0700, Shawn Heisey wrote:
> On 12/24/2019 10:45 AM, Nicolas Paris wrote:
> > From my understanding, copy fields create new indexes from the
> > copied fields. From my tests, I copied 1k textual fields into _text_
> > with copyFields. As a result there is no increase in the size of the
> > collection. All the source fields are indexed and stored. The _text_
> > field is indexed but not stored.
> >
> > This is a great surprise, but is this behavior expected?
>
> The action of changing the schema makes zero changes in the index. It
> merely changes how Solr interacts with the index.
>
> If you want the index to change when the schema is changed, you need to
> restart or reload and then re-do the indexing after the change is saved.
>
> https://cwiki.apache.org/confluence/display/solr/HowToReindex
>
> Thanks,
> Shawn

--
nicolas
Cache fails to warm after Replication Recovery in solr cloud
Hi!

I have a custom cache set up in solrconfig.xml for a SolrCloud cluster in Kubernetes. Each node has Kubernetes persistence set up. After I execute a “delete pod” command to restart a node, it goes into replication recovery successfully, but my custom cache’s warm() method never gets called. Is this expected behavior? The events I observed are:
1. Cache init() method called
2. Searcher created and registered
3. Replication recovery

Thanks!
Li
Re: does copyFields increase index size?
On 12/24/2019 10:45 AM, Nicolas Paris wrote:
> From my understanding, copy fields create new indexes from the copied
> fields. From my tests, I copied 1k textual fields into _text_ with
> copyFields. As a result there is no increase in the size of the
> collection. All the source fields are indexed and stored. The _text_
> field is indexed but not stored.
>
> This is a great surprise, but is this behavior expected?

The action of changing the schema makes zero changes in the index. It merely changes how Solr interacts with the index.

If you want the index to change when the schema is changed, you need to restart or reload and then re-do the indexing after the change is saved.

https://cwiki.apache.org/confluence/display/solr/HowToReindex

Thanks,
Shawn
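For reference, a copyField is just a declaration in the schema like the sketch below (the `title` field name is hypothetical, standing in for the thread's 1k source fields). As Shawn says, saving this changes nothing in the index itself; the copy only happens when documents are indexed after the schema is reloaded.

```xml
<!-- Hypothetical source field; the thread has ~1k of these. -->
<field name="title" type="text_general" indexed="true" stored="true"/>

<!-- Catch-all destination: indexed but not stored, so it adds postings
     for searching without duplicating the stored content. -->
<field name="_text_" type="text_general" indexed="true" stored="false"
       multiValued="true"/>

<!-- The copy happens at index time, not when the schema is saved. -->
<copyField source="title" dest="_text_"/>
```

Because `_text_` is stored="false", only the inverted-index structures grow, which is consistent with the small size increase Nicolas observed.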
does copyFields increase index size?
Hi,

From my understanding, copy fields create new indexes from the copied fields. From my tests, I copied 1k textual fields into _text_ with copyFields. As a result there is no increase in the size of the collection. All the source fields are indexed and stored. The _text_ field is indexed but not stored.

This is a great surprise, but is this behavior expected?

--
nicolas
Re: Help for importing large data (approx. 8GB) from old solr version to new solr version
Here’s the very simplest way:

1> shut down your 8.2 Solr instance
2> install your 8.3.1 instance on the same machine
3> when you start your 8.3.1 instance, specify the environment variable SOLR_HOME to point to the same one you used in 8.2

If you don’t know what SOLR_HOME used to point to, bring up your 8.2 instance first and look at the admin UI; your environment variables will point there.

NOTE: If you do it this way, you may _NOT_ have both 8.2 and 8.3.1 running at the same time.

Best,
Erick

> On Dec 24, 2019, at 5:54 AM, Ken Walker wrote:
>
> Hello Jörn,
>
> Thanks for your reply!
>
> As per Shawn in a previous mail comment: "Why not just copy the index
> and use it directly rather than importing it? Solr 8.x can directly
> use indexes built by versions back to 7.0.0."
>
> Is it possible, and how can we do that?
>
> Thanks in advance
> - Ken
>
> On Tue, Dec 24, 2019 at 3:26 PM Jörn Franke wrote:
>>
>> It seems that you got this handed over with little documentation. You have
>> to explore what the import handler does. This is a custom configuration
>> that you need to check how it works.
>>
>> Then, as already said, you can simply install another version of Solr if
>> you are within a Solr major version. 8.x in Linux is simply a symbolic
>> link pointing from one Solr version to the other. In this way you can
>> easily switch back as well.
>>
>> Finally, check your memory consumption. Normally heap is significantly
>> smaller than the total available memory, as the non-heap memory is used
>> by Solr for caching.
>>
>> If you have 8 GB of heap I would expect that the total amount of memory
>> available is more than 32 GB. As always it depends, but maybe you can
>> give more details on number of cores, heap memory, total memory, and
>> whether other processes than Solr run on the machine.
>>
>>> Am 24.12.2019 um 05:59 schrieb Ken Walker:
>>>
>>> Hello,
>>>
>>> We are using Solr version 8.2.0 on our production server.
>>>
>>> We are upgrading from Solr 8.2.0 to Solr 8.3.1, but we faced an
>>> out-of-memory error while importing data. We then extended the memory
>>> on our server and started the importing process again, but it worked
>>> too slowly for 8GB of data (it took more than 2 days to import the
>>> data from Solr 8.2.0 to Solr 8.3.1).
>>>
>>> Could you please help us import the 8GB of data from the old Solr
>>> version to the new one faster?
>>>
>>> We are using the command below for importing data from one Solr
>>> version to another:
>>> $ curl
>>> 'http://IP-ADDRESS:8983/solr/items/dataimport?command=full-import=true=false=json=true=false=false=false'
>>>
>>> Thanks in advance!
>>> - Ken
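Erick's three steps can be written down as a dry-run shell sketch. The install locations and the SOLR_HOME value below are assumptions for illustration, not taken from the thread; check the admin UI for your real SOLR_HOME and replace the echo lines with the real commands once the paths match your machine.

```shell
#!/bin/sh
# Dry run of the upgrade-in-place steps. All paths are assumptions.
OLD_SOLR=/opt/solr-8.2.0
NEW_SOLR=/opt/solr-8.3.1
SOLR_HOME=/var/solr/data    # must be the same home the 8.2 instance used

echo "step 1: $OLD_SOLR/bin/solr stop -all"
echo "step 2: install $NEW_SOLR alongside the old version"
echo "step 3: SOLR_HOME=$SOLR_HOME $NEW_SOLR/bin/solr start"
# As Erick notes: never run both versions against the same SOLR_HOME
# at the same time.
```

Keeping the old install directory untouched means you can switch back by starting the 8.2 binary against the same SOLR_HOME if anything goes wrong.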
Re: Help for importing large data (approx. 8GB) from old solr version to new solr version
Hello Jörn,

Thanks for your reply!

As per Shawn in a previous mail comment: "Why not just copy the index and use it directly rather than importing it? Solr 8.x can directly use indexes built by versions back to 7.0.0."

Is it possible, and how can we do that?

Thanks in advance
- Ken

On Tue, Dec 24, 2019 at 3:26 PM Jörn Franke wrote:
>
> It seems that you got this handed over with little documentation. You have
> to explore what the import handler does. This is a custom configuration
> that you need to check how it works.
>
> Then, as already said, you can simply install another version of Solr if
> you are within a Solr major version. 8.x in Linux is simply a symbolic
> link pointing from one Solr version to the other. In this way you can
> easily switch back as well.
>
> Finally, check your memory consumption. Normally heap is significantly
> smaller than the total available memory, as the non-heap memory is used
> by Solr for caching.
>
> If you have 8 GB of heap I would expect that the total amount of memory
> available is more than 32 GB. As always it depends, but maybe you can
> give more details on number of cores, heap memory, total memory, and
> whether other processes than Solr run on the machine.
>
> > Am 24.12.2019 um 05:59 schrieb Ken Walker:
> >
> > Hello,
> >
> > We are using Solr version 8.2.0 on our production server.
> >
> > We are upgrading from Solr 8.2.0 to Solr 8.3.1, but we faced an
> > out-of-memory error while importing data. We then extended the memory
> > on our server and started the importing process again, but it worked
> > too slowly for 8GB of data (it took more than 2 days to import the
> > data from Solr 8.2.0 to Solr 8.3.1).
> >
> > Could you please help us import the 8GB of data from the old Solr
> > version to the new one faster?
> >
> > We are using the command below for importing data from one Solr
> > version to another:
> > $ curl
> > 'http://IP-ADDRESS:8983/solr/items/dataimport?command=full-import=true=false=json=true=false=false=false'
> >
> > Thanks in advance!
> > - Ken
Re: Help for importing large data (approx. 8GB) from old solr version to new solr version
It seems that you got this handed over with little documentation. You have to explore what the import handler does. This is a custom configuration that you need to check how it works.

Then, as already said, you can simply install another version of Solr if you are within a Solr major version. 8.x in Linux is simply a symbolic link pointing from one Solr version to the other. In this way you can easily switch back as well.

Finally, check your memory consumption. Normally heap is significantly smaller than the total available memory, as the non-heap memory is used by Solr for caching.

If you have 8 GB of heap I would expect that the total amount of memory available is more than 32 GB. As always it depends, but maybe you can give more details on number of cores, heap memory, total memory, and whether other processes than Solr run on the machine.

> Am 24.12.2019 um 05:59 schrieb Ken Walker:
>
> Hello,
>
> We are using Solr version 8.2.0 on our production server.
>
> We are upgrading from Solr 8.2.0 to Solr 8.3.1, but we faced an
> out-of-memory error while importing data. We then extended the memory
> on our server and started the importing process again, but it worked
> too slowly for 8GB of data (it took more than 2 days to import the
> data from Solr 8.2.0 to Solr 8.3.1).
>
> Could you please help us import the 8GB of data from the old Solr
> version to the new one faster?
>
> We are using the command below for importing data from one Solr
> version to another:
> $ curl
> 'http://IP-ADDRESS:8983/solr/items/dataimport?command=full-import=true=false=json=true=false=false=false'
>
> Thanks in advance!
> - Ken
can you help me?
With highlighting enabled, results are displayed when querying by ID, but after changing the query to the title field, the highlighting return value contains only the ID information and the content is empty.
Re: Help for importing large data (approx. 8GB) from old solr version to new solr version
Hello Shawn,

Thanks for your reply! Actually, we don't know how that works (just copying the index), so could you please give us some reference URLs or steps for it?

Thanks in advance
- Ken

On Tue, Dec 24, 2019 at 11:56 AM Shawn Heisey wrote:
>
> On 12/23/2019 9:58 PM, Ken Walker wrote:
> > We are upgrading from Solr 8.2.0 to Solr 8.3.1, but we faced an
> > out-of-memory error while importing data. We then extended the memory
> > on our server and started the importing process again, but it worked
> > too slowly for 8GB of data (it took more than 2 days to import the
> > data from Solr 8.2.0 to Solr 8.3.1).
> >
> > Could you please help us import the 8GB of data from the old Solr
> > version to the new one faster?
>
> Why not just copy the index and use it directly rather than importing
> it? Solr 8.x can directly use indexes built by versions back to 7.0.0.
>
> Thanks,
> Shawn
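Before pointing the new version at a copied index, it can be reassuring to verify that the index is readable. Lucene ships a CheckIndex tool for this; the sketch below only prints the command it would run, because the jar location varies by install and the paths here are assumptions. Run the real command only while Solr is stopped.

```shell
#!/bin/sh
# Dry-run sketch: verify a copied index with Lucene's CheckIndex before
# starting the new Solr against it. Paths are hypothetical.
SOLR_DIST=/opt/solr-8.3.1
INDEX_DIR=/opt/solr-8.3.1/server/solr/product/item_core/data/index

echo "java -cp $SOLR_DIST/server/solr-webapp/webapp/WEB-INF/lib/lucene-core-*.jar \\"
echo "  org.apache.lucene.index.CheckIndex $INDEX_DIR"
```

CheckIndex reports the Lucene version that wrote each segment, which is a quick way to confirm the index falls inside the 7.0.0+ range Shawn mentions.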