Hi

I think the idea Kyle mentions, having your CMS upload the files to both
the server and the backup, is a very good one. Depending on how you
implement it, you can easily make multiple replicas and distribute the
load across your system.

With the rate at which your system is growing, you should think about
segmenting it at some point, and making replicas of each segment in
order to support more concurrent access to your site.
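
A very loose sketch of that idea, just to make it concrete; the split by
first letter, the host names and the paths are only placeholders:

    # push each content segment to its assigned replica hosts;
    # segments, hosts and paths here are only examples
    for dir in /srv/www/files/*; do
        case "$(basename "$dir")" in
            [a-m]*) hosts="seg1-a seg1-b" ;;   # segment 1 and its replicas
            *)      hosts="seg2-a seg2-b" ;;   # segment 2 and its replicas
        esac
        for host in $hosts; do
            rsync -av "$dir" "$host":/srv/www/files/
        done
    done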

Hope this helps

On 4/29/06, Kyle Lutze <[EMAIL PROTECTED]> wrote:

one idea, for at least getting a diff of what has changed in the system:
run 'ls -rl * > list' from the base dir of all of the files, and then
compare that to what's been saved already. from there use some form of
copying: rsync, ftp, etc.
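
a rough sketch of that, with all paths made up and 'ls -lR' used for a
recursive listing:

    cd /var/www/files                          # base dir of all the files
    ls -lR > /tmp/list.new                     # snapshot of the current state
    diff /var/backups/list.old /tmp/list.new   # shows what changed since last run
    rsync -av /var/www/files/ backuphost:/var/backups/files/   # rsync only copies what differs
    mv /tmp/list.new /var/backups/list.old     # keep the snapshot for next time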

another idea - have apache/squid/whatever download to a different folder
structure of your main repo, then from 0 to 8 copy that into the
backups and then move the files into the main folder structure
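
for example, reading "0 to 8" as an overnight window; the cron time,
paths and script name are all made up:

    # crontab entry: run the promotion script at 01:00 every night
    0 1 * * *  /usr/local/bin/promote-staging.sh

    # promote-staging.sh
    #!/bin/sh
    rsync -av /srv/staging/ backuphost:/var/backups/files/   # copy into the backups first
    cp -a /srv/staging/. /srv/www/files/                     # then into the main folder structure
    rm -rf /srv/staging/*                                    # empty the staging area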

yet another idea - have your CMS (I'm just guessing, doesn't matter)
upload the files to both the main section and the backup servers
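
e.g. something along these lines, where the host names and paths are
only placeholders:

    # push the CMS export to every web and backup server
    for host in www1 www2 backup1; do
        rsync -av /srv/cms-export/ "$host":/srv/www/files/ || echo "push to $host failed" >&2
    done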

food for thought. don't know if any of them have been mentioned but I
didn't have time to read through all new posts. :/

Kyle

--
play tetris http://pepone.on-rez.com/tetris
run gentoo http://gentoo-notes.blogspot.com/
--
[email protected] mailing list
