On 03/12/12 16:39, Erick Erickson wrote:
There's no real need to do what you ask.

First thing is that you should always be prepared, in the worst-case
scenario, to regenerate your entire index.

That said, perhaps the easiest way to back up Solr is just to use
master/slave replication. Consider having a machine that's a slave to the
master (but not necessarily searched against) and periodically poll your
master (say daily or whatever your interval is). You can configure Solr to
keep N copies of the index as extra insurance. These will be fairly static
so if you _really_ wanted to you could just copy the <solrhome>/data
directory somewhere, but I don't know if that's necessary.

See: http://wiki.apache.org/solr/SolrReplication
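The setup Erick describes is configured through Solr's ReplicationHandler in solrconfig.xml on each node. A minimal sketch follows; the hostname, port, and confFiles list are placeholders, and the daily poll interval (pollInterval uses HH:mm:ss) matches the "say daily" suggestion above:

```xml
<!-- master solrconfig.xml: publish the index after each commit -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="master">
    <str name="replicateAfter">commit</str>
    <str name="confFiles">schema.xml,stopwords.txt</str>
  </lst>
</requestHandler>

<!-- slave solrconfig.xml: poll the master once a day -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="slave">
    <str name="masterUrl">http://master-host:8983/solr/replication</str>
    <str name="pollInterval">24:00:00</str>
  </lst>
</requestHandler>
```

The slave used purely as a backup target need not receive any query traffic; it just pulls a consistent copy of the index on each poll.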

Best
Erick
Hi Erick,

Thanks for that, I'll take a look.

However, wouldn't re-creating the index on a large dataset take an inordinate amount of time? The system I will be backing up is likely to undergo rapid development, and thus schema changes, so I need some kind of insurance against corruption if we need to roll back after a change.

How should I go about creating multiple backup versions I can put aside (e.g. on tape) to hedge against the downtime that would be required to regenerate the indexes from scratch?
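For point-in-time copies that can go to tape, Solr's ReplicationHandler also exposes an HTTP backup command that writes a snapshot directory under <solrhome>/data without blocking indexing. A minimal sketch, assuming a master at localhost:8983 (host, port, and the numberToKeep value are placeholders; numberToKeep may not be available on older Solr releases):

```shell
# Build the ReplicationHandler backup URL; hitting it snapshots the index.
SOLR_URL="http://localhost:8983/solr"
BACKUP_URL="${SOLR_URL}/replication?command=backup&numberToKeep=3"
echo "$BACKUP_URL"
# curl "$BACKUP_URL"   # creates snapshot.<timestamp> under <solrhome>/data;
                       # archive that directory to tape as one backup version
```

Running this from cron before the tape job would give a dated, self-contained copy per run, independent of the live index files.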

Regards,
-Andy

--
Andy D'Arcy Jewell

SysMicro Limited
Linux Support
E:  andy.jew...@sysmicro.co.uk
W:  www.sysmicro.co.uk
