How about you do a SQL dump from a hosted MediaWiki (or your wiki of
choice), and regularly restore it onto multiple local laptops at your
site(s)?  You can cron/script/procmail pretty much all of this, I think.

I like laptops because they are portable (you can carry them into a
datacenter or out to a backup site) and have built-in UPS :-)

At the top of the hour or whenever you like, the hosted wiki goes into
its "safe mode", or stops.
Back up the SQL DB into a file.
Restart the wiki.
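
Something like this on the hosted side, assuming a MySQL-backed
MediaWiki and shell + cron access on the host (every name and path
here is invented):

    # wiki-backup.sh -- run hourly from cron on the hosted wiki box.
    # Pointing $wgReadOnlyFile in LocalSettings.php at the flag file
    # makes the wiki read-only for as long as the file exists.
    touch /var/www/wiki/readonly.flag
    mysqldump --single-transaction -u wikiuser -p'secret' wikidb \
        | gzip > /var/backups/wikidb.sql.gz
    rm /var/www/wiki/readonly.flag

(With --single-transaction on InnoDB tables the dump is consistent
even without the flag, but the flag keeps the "safe mode" semantics.)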

At hour+45mins, or upon an email(?), the laptops download the SQL
backup onto local storage. You can either just hope that the initial
DB backup took less than X minutes, or have the main wiki restart
script send an email to your backup site(s). Procmail or equivalent
can then trigger a download of the SQL DB backup file. Or you could
even do outbound SCP from the main site to the backup laptops from
the wiki restart script.
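
The "just hope" version on the laptop side is a single crontab entry,
e.g. (host and paths invented; the \% escaping is cron's, and scp
assumes key-based auth is set up):

    # Fetch at :45 past each hour; %H keeps 24 rotating hourly copies
    45 * * * * scp backup@wiki-host:/var/backups/wikidb.sql.gz /srv/wiki-backups/wikidb-$(date +\%H).sql.gz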

At this point you can restore the SQL files and (re)start the wiki(s)
on the local laptops, or hold off and only do the restore and wiki
start when you really need it.
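
The restore itself is one line, assuming a local MySQL with an empty
wikidb already created and a local MediaWiki install pointed at it
(names invented again):

    # Load the most recent dump into the laptop's local MySQL
    gunzip -c "$(ls -t /srv/wiki-backups/*.sql.gz | head -1)" \
        | mysql -u wikiuser -p'secret' wikidb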

Hackery, for sure.

This way you get truly local, resilient wiki copies, and people can
actually use the link/search functions when things have Gone To Hell.

--tep



On Mon, Dec 23, 2013 at 3:58 PM, Phil Pennock
<[email protected]> wrote:
> Folks, seeking recommendations,
>
> I'm looking for a decent wiki, as a *hosted* solution, for use for
> operational runbooks, logbooks, process controls, post-mortem archives,
> etc for my area, plus more besides for other areas if we can manage to
> get to one commonly acceptable wiki solution.  Because of the runbooks
> use-case, in particular, it's a deal-breaking requirement that we enable
> (some or all) users to get a dump of the contents in a readily readable
> form, so that when Things Are Broken, the content can be read locally.
> Ideally, with history so that people can debug a broken runbook entry by
> looking and seeing that "step 4" keeps changing and is likely the place
> to debug further.
>
> Extra kudos if the dump is sane, with git history or the like.  Being
> hosted by someone else (which we pay for) is a hard requirement.
>
> We're outgrowing GitHub's wikis and their limitations.  The git model is
> great, but the markdown-only format is limiting adoption by
> non-engineers, especially as we're moving to Jira for ticketing because
> of the more severe limitations there.
>
> Atlassian's Confluence is nice, with their OnDemand offering, but the
> only way to get contents for offline access is via a wiki administrator
> account, to take a backup, and then parse apart some XML which is very
> clearly geared for backup/restore as the use-case.  I _can_ script to
> parse apart the XML tree, but support fora suggest there have been
> format changes, so I'd be playing a game of catchup every so often, with
> our ability to take local copies of our runbooks broken in the meantime,
> which is unacceptable fragility.
>
> Really, looking for something which is "close enough to Confluence in
> features" but has "sane exports of entire spaces, not just individual
> pages", ideally via DVCS.  We have some budget to pay, and we're
> currently a small company (less than 20 people).  My model is to have
> process docs but to keep things as simple and streamlined as possible,
> so that bureaucracy becomes simple checklists people control for
> themselves, rather than unpleasant mandates.
>
> Suggestions appreciated,
> -Phil
_______________________________________________
Discuss mailing list
[email protected]
https://lists.lopsa.org/cgi-bin/mailman/listinfo/discuss
This list provided by the League of Professional System Administrators
 http://lopsa.org/
