Design/implementation for backups, based on thread in allura-dev:

Add a bulk_export() method to Application which would be responsible for
generating json for all the artifacts in the tool.  The format should match the
API format for artifacts so that we're consistent.  Thus any tool that
implements bulk_export() would typically loop through all the artifacts for this
instance (matching app_config_id) and convert to json the same way the API json
is generated (e.g. call the `__json__` method or RestController method; some
refactoring might be needed).  Multiple types of artifacts/objects could be
listed out in groups, e.g. Tracker app could have a list of tickets, list of
saved search bins, list of milestones, and the tracker config data.  Discussion
threads would need to be included too, ideally inline with the artifact they go
with.  No permission checks would be done since this export would only be
available to admins (makes it faster & simpler).
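As a rough sketch of what a tool's bulk_export() might look like (the class and model names here are hypothetical illustrations, not Allura's actual API — for brevity this version returns a string rather than streaming to a file):

```python
import json

class ForumApp:
    """Hypothetical tool app implementing bulk_export()."""

    def __init__(self, config, topics):
        self.config = config    # the tool's config data
        self.topics = topics    # stand-in for artifacts matching app_config_id

    def bulk_export(self):
        # No permission checks: the export is admin-only, so dump everything.
        # Artifact types are listed out in groups, reusing the API json shape.
        return json.dumps({
            'config': self.config,
            'topics': [t.__json__() for t in self.topics],
        })

class Topic:
    """Hypothetical artifact; __json__ mirrors what the REST API would emit,
    with the discussion thread inline."""
    def __init__(self, title, posts):
        self.title, self.posts = title, posts

    def __json__(self):
        return {'title': self.title, 'posts': self.posts}
```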

Provide a page on the Admin sidebar to generate a bulk export.  Project admins
could choose individual tool instances, or all tools in the project (that
support it).  That form would kick off a background task which goes through the
selected tools and runs their bulk_export() methods.  Make sure we don't have 
multiple backups running for the same project (I think code snapshot has 
similar protection too).  Save each tool's data as mount_point.json and zip 
them all together.
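A minimal sketch of the background task, assuming a set is available to track in-flight exports (in practice the duplicate-run guard would likely live in mongo or the task queue, not in-process):

```python
import zipfile

def bulk_export_task(project_name, apps, zip_path, running_exports):
    """Hypothetical task body: refuse to run twice for one project, then
    write each selected tool's export as <mount_point>.json in one zip."""
    if project_name in running_exports:     # crude duplicate-run guard
        raise RuntimeError('export already running for %s' % project_name)
    running_exports.add(project_name)
    try:
        with zipfile.ZipFile(zip_path, 'w') as zf:
            for mount_point, app in apps.items():
                zf.writestr('%s.json' % mount_point, app.bulk_export())
    finally:
        running_exports.discard(project_name)
```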

Store the zip files in a configurable directory.  Should have an ini setting
for this that is a tiny template (similar to 'short_url.url_pattern') so that
project name and stuff can be interpolated.
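For example, assuming a str.format-style template (the setting name and placeholders below are made up, not an existing Allura option):

```python
# Hypothetical ini setting, in the spirit of short_url.url_pattern:
#   bulk_export_path = /var/backups/{nbhd}/{project}/
config = {'bulk_export_path': '/var/backups/{nbhd}/{project}/'}

def export_dir(config, nbhd, project):
    # Interpolate neighborhood and project name into the template.
    return config['bulk_export_path'].format(nbhd=nbhd, project=project)
```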

When the task is complete, notify the user via email.  The basic content of the 
email (telling them that it's done, timestamp, etc) can be standard. Below 
that, we need configurable instructions to go into the email (telling users how 
to SFTP or whatever to get to their file).  I guess an ini setting will be 
best, but the value will be quite long.  This should also support project name 
interpolation.
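Something along these lines, again with a made-up setting name, could assemble the email (standard completion text on top, site-configurable retrieval instructions below, both supporting project-name interpolation):

```python
# Hypothetical ini key; the value would be long and site-specific.
config = {
    'bulk_export_email_instructions':
        'SFTP to backups.example.com and fetch /backups/{project}.zip',
}

def export_done_email(config, project, finished_at):
    # Standard part: tell the user it's done, with a timestamp.
    body = 'Your bulk export for %s completed at %s.\n\n' % (project, finished_at)
    # Configurable part: how to actually retrieve the file.
    body += config['bulk_export_email_instructions'].format(project=project)
    return body
```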

So that a giant json string doesn't have to be held in memory for each tool, the
export task should open a file handle for mount_point.json and call
bulk_export() with that open file handle, and each App can append to its file
incrementally.
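The file-handle variant might look like this (again with a hypothetical tool class; the point is that json.dump is called per artifact rather than building one big string):

```python
import json

def run_export(app, path):
    # The task owns the file handle; the app streams json into it so the
    # whole export never sits in memory as one giant string.
    with open(path, 'w') as f:
        app.bulk_export(f)

class TicketApp:
    """Hypothetical tool whose bulk_export() appends incrementally."""
    def __init__(self, tickets):
        self.tickets = tickets

    def bulk_export(self, f):
        f.write('{"tickets": [')
        for i, ticket in enumerate(self.tickets):
            if i:
                f.write(', ')
            json.dump(ticket, f)    # one artifact at a time
        f.write(']}')
```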

If mongo performance is slow, some refactoring may be needed to avoid lots of
individual mongo calls and be more batch oriented.  We can see how it goes.
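One likely shape of such a refactoring is replacing per-artifact lookups with chunked $in queries; a sketch with an in-memory stand-in for a mongo collection (the FakeCollection here is purely illustrative):

```python
class FakeCollection:
    """In-memory stand-in for a mongo collection, supporting only the
    {'artifact_id': {'$in': [...]}} query used below."""
    def __init__(self, docs):
        self.docs = docs
        self.queries = 0    # count round-trips for illustration

    def find(self, query):
        self.queries += 1
        wanted = set(query['artifact_id']['$in'])
        return [d for d in self.docs if d['artifact_id'] in wanted]

def fetch_batched(coll, artifact_ids, batch_size=100):
    # One $in query per chunk of ids instead of one query per artifact.
    out = []
    for i in range(0, len(artifact_ids), batch_size):
        chunk = artifact_ids[i:i + batch_size]
        out.extend(coll.find({'artifact_id': {'$in': chunk}}))
    return out
```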

For now, this will NOT include:

* attachments in the zip
* an API for backups
* performance optimizations, like parallelization of bulk_export() across all 
tools


---

**[tickets:#3154] Users need a way to backup allura data [idt 824]**

**Status:** open
**Labels:** support feature-parity p2 
**Created:** Wed Nov 02, 2011 06:41 PM UTC by Chris Tsai
**Last Updated:** Fri Apr 12, 2013 06:07 AM UTC
**Owner:** nobody

https://sourceforge.net/apps/ideatorrent/sourceforge/ideatorrent/idea/824/

>SF clearly says that the backup of the project data is left to project admins:

>https://sourceforge.net/apps/trac/sourceforge/wiki/Backup%20your%20data

>Fine.

>However, there is currently no way to backup the SF2.0 tools, for instance the 
>data from the ticket tool. 

Backups would be the first step, then a way to restore (though we never offered 
that part in SF classic to my knowledge)?


---
