There's no real need to do what you ask.

The first thing to note is that you should always be prepared, in the
worst case, to regenerate your entire index from the system of record.

That said, perhaps the easiest way to back up Solr is just to use
master/slave replication. Consider setting up a machine that's a slave to
the master (but not necessarily searched against) and have it poll the
master periodically (daily, or whatever your interval is). You can also
configure Solr to keep N backup copies of the index as extra insurance.
Those copies will be fairly static, so if you _really_ wanted to you could
just copy the <solrhome>/data directory somewhere, but I don't know that
that's necessary.
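For what it's worth, a minimal sketch of what that looks like in
solrconfig.xml. The host name, port, core name, and poll interval below
are placeholders you'd adapt to your setup; numberToKeep on the backup
command is what handles the "keep N copies" part:

```xml
<!-- On the master: advertise the index for replication after each commit. -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="master">
    <str name="replicateAfter">commit</str>
    <str name="confFiles">schema.xml,stopwords.txt</str>
  </lst>
</requestHandler>

<!-- On the backup slave: poll the master once a day (placeholder URL). -->
<requestHandler name="/replication" class="solr.ReplicationHandler">
  <lst name="slave">
    <str name="masterUrl">http://master-host:8983/solr/collection1/replication</str>
    <str name="pollInterval">24:00:00</str>
  </lst>
</requestHandler>
```

You can then trigger an on-disk snapshot on either machine with the
backup command, e.g.
http://localhost:8983/solr/replication?command=backup&numberToKeep=3
which keeps the three most recent snapshots under the data directory.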

See: http://wiki.apache.org/solr/SolrReplication

Best
Erick


On Mon, Dec 3, 2012 at 6:07 AM, Andy D'Arcy Jewell <
andy.jew...@sysmicro.co.uk> wrote:

> Hi all.
>
> I'm new to Solr, and I have recently had to set up a Solr server running
> 4.0.
>
> I've been searching for info on backing it up, but all I've managed to
> come up with is "it'll be different", "you'll be able to do push
> replication", or using HTTP with the command=backup parameter, which
> doesn't sound like it will be effective for a production setup (unless
> I've got that wrong)...
>
>
> I was wondering if I can just stop or suspend the Solr server, do an
> LVM snapshot of the data store, and then bring it back online, but I'm
> not sure if that will cut it. I gather merely rsyncing the data files
> won't do...
>
> Can anyone give me a pointer to that "easy-to-find" document I have so far
> failed to find? Or failing that, maybe some sound advice on how to proceed?
>
> Regards,
> -Andy
>
> --
> Andy D'Arcy Jewell
>
> SysMicro Limited
> Linux Support
> E:  andy.jew...@sysmicro.co.uk
> W:  www.sysmicro.co.uk
