On 02/06/10 19:36, mehma sarja wrote:
> WHY POST THIS?
> I am posting this message for two reasons:
>
> a) Bacula and other enterprise backup tools do not particularly like
> unreliable bandwidth connections, and a BitTorrent-like technology fills
> the gap. The gap is that Bacula and other tools are making disk backups
> convenient. As people move away from tape, the disk-based systems are
> increasingly at risk from natural disasters and from wear and tear over
> time. This increases risk as compared to tape backups, although tapes
> too are susceptible to wear and tear. Do you consider some sort of
> off-site copy a natural cost of doing on-disk backups?
>
> b) If someone is using, or thinking about using, BitTorrent
> technologies, I'd like to hear about your experiences (and config
> files) and what hardware you use.
My first thoughts on this:

(1) With a comparatively small number of hosts and clients, all with reasonably fast business-grade network connections, there is no advantage to using BitTorrent. BitTorrent is fast because it distributes the server load across a large number of peers serving a large number of clients, which themselves re-seed to pick up the server load. You do not have that situation. BitTorrent is a good technology for software distribution, but it's not really applicable to the problem of network backup.

(2) The problem of unreliable network connections can be trivially solved using rsync. Server A rsyncs the changes in its data set every hour, say, to mirrors at sites B and C, which sites B and C then back up every night. Likewise, site B rsyncs *its* server data pool to mirrors at sites A and C, which they then back up. And this is the worst case, assuming sites A, B and C have different data sets.

If the three sites are hosting the same data, then they simply maintain nightly local backups of their own data, and rsync the changes between themselves as needed to keep themselves in sync. For that matter, if all three sites A, B and C have the same data, you need not necessarily back them up at all; you could have a master repository at a hardened facility somewhere geographically remote from all three, which holds a master copy of the data and simply pushes all changes out to the slave sites.

In short, your question isn't really a backup-software question so much as it is a distributed site architecture and data consistency question.

--
Phil Stracchino, CDK#2 DoD#299792458 ICBM: 43.5607, -71.355
ala...@caerllewys.net ala...@metrocast.net p...@co.ordinate.org
Renaissance Man, Unix ronin, Perl hacker, Free Stater
It's not the years, it's the mileage.
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users