Shawn's way will work, of course; just be sure you haven't indexed any
data before editing all the core.properties files.

There's another way to set the dataDir per core, though, that has the
advantage of not requiring any downtime or hand-editing of files:
1. Create the collection with createNodeSet=EMPTY. No replicas are created.
2. Add each replica with ADDREPLICA. In addition to all the regular params,
   use property.dataDir=place_you_want_the_index.
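
Roughly, the two calls look like this (collection name, config name,
node name, and path are all placeholders, adjust them for your setup):

http://localhost:8983/solr/admin/collections?action=CREATE&name=mycoll&numShards=2&collection.configName=myconf&createNodeSet=EMPTY

http://localhost:8983/solr/admin/collections?action=ADDREPLICA&collection=mycoll&shard=shard1&node=host1:8983_solr&property.dataDir=/indexes/mycoll/shard1_replica1

Repeat the ADDREPLICA call for every replica, pointing each one's
property.dataDir wherever you want that core's index to live.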

The ADDREPLICA property.<name> param is intended exactly for this:
setting arbitrary properties in core.properties for the replica.

Best,
Erick

On Wed, Jul 12, 2017 at 6:24 AM, Shawn Heisey <apa...@elyograg.org> wrote:
> On 7/12/2017 12:38 AM, Zheng Lin Edwin Yeo wrote:
>> I found that we can set the path under <dataDir> in solrconfig.xml
>>
>> However, this seems to work only if there is one replica. How do we set it
>> if we have 2 or more replicas?
>
> Setting dataDir in solrconfig.xml is something that really only works in
> standalone Solr.  For SolrCloud, this method has issues that are
> difficult to get around.
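>
> (For reference, in standalone mode that looks something like
> <dataDir>/var/solr/data/mycore</dataDir> in solrconfig.xml; the path
> there is just an example.)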
>
> Another option that works in ANY Solr mode is changing dataDir in the
> core.properties file that every core uses.  Create the collection,
> allowing Solr to create the cores in the default way.  Shut down Solr
> and edit the core.properties file for each core that you want to have
> the data in a different location.  Add a dataDir property pointing at
> the new location for that core's data.  If the core actually has any
> contents, you can move the data to that location, but if not, you can
> simply let Solr create the data itself when it starts back up.
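>
> For example (the names and path here are made up), an edited
> core.properties might end up with lines like this:
>
>   name=mycollection_shard1_replica1
>   dataDir=/indexes/mycollection/shard1_replica1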
>
> The core.properties file is in Java properties format, which is well
> documented in multiple places around the Internet.
>
> https://www.google.com/search?q=java+properties+format&ie=utf-8&oe=utf-8
>
> If the dataDir location is not an absolute path, then it will be
> relative to the instanceDir -- the place where core.properties is.  The
> dataDir value defaults to a simple value of "data".
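>
> For example, a relative setting like dataDir=alternate/data would
> resolve to <instanceDir>/alternate/data on disk.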
>
> Thanks,
> Shawn
>
