Just ran into a problem with my backup system. My server is co-located at
an ISP, so I have to back up over the internet. I have an exact copy of
the website on my local server, which is backed up to CDs. I am the only
person who writes code for this server, so I know I have all of the source
code and graphic files backed up already and never have to back them up
from the server.

The only thing that changes on the webserver is the database, stored in SQL
Server. I have an automated task in SQL Server back up the data every night
to a file, then a CF scheduled task uses CF_ZIP to zip the backup file.
Every night before I go to sleep, I log in with pcAnywhere, download it to
my home computer, and back it up over my home network, and once a week to
CDs.
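For what it's worth, the zip step in that pipeline could also be done outside CF. A minimal sketch in Python (the file names are made up for illustration), using the stdlib `zipfile` module with ZIP64 extensions enabled so archive members over 2 GB don't break it:

```python
import zipfile

def zip_backup(src_path: str, dest_path: str) -> None:
    """Compress a nightly SQL Server backup file into a zip archive.

    allowZip64=True enables ZIP64 extensions, so the archive can hold
    members larger than the classic 32-bit zip size limits.
    """
    with zipfile.ZipFile(dest_path, "w",
                         compression=zipfile.ZIP_DEFLATED,
                         allowZip64=True) as zf:
        # Store the backup under a fixed archive name (hypothetical).
        zf.write(src_path, arcname="nightly.bak")

# Example usage (hypothetical paths):
# zip_backup(r"D:\backups\nightly.bak", r"D:\backups\nightly.zip")
```

This could be run as a Windows scheduled task in place of the CF scheduled task, if CF_ZIP turns out to be the bottleneck.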

Worked fine until today. It seems the data file got too large to zip. Does
CF_ZIP have a size limit? The file is 2.1 gigs unzipped (200 megs zipped -
goes very fast with my cable modem :). I have over 20 gigs of free hard
disk space on the server. Is there any workaround, or do I have to break
the database up into sections and do the parts individually? How do you
handle backup from remote servers?
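One possible workaround for the "break it into sections" route, sketched in Python (file names and the 500 MB chunk size are arbitrary choices): split the backup file itself into fixed-size pieces before zipping, so no single piece hits whatever limit the zip step has, and rejoin the pieces by simple concatenation on the other end.

```python
import os

CHUNK_SIZE = 500 * 1024 * 1024  # 500 MB pieces; adjust to taste

def split_file(src_path: str, chunk_size: int = CHUNK_SIZE) -> list[str]:
    """Split src_path into numbered .partNNN files; return their names."""
    parts = []
    with open(src_path, "rb") as src:
        i = 0
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            part_name = f"{src_path}.part{i:03d}"
            with open(part_name, "wb") as out:
                out.write(chunk)
            parts.append(part_name)
            i += 1
    return parts

def join_files(parts: list[str], dest_path: str) -> None:
    """Reassemble the pieces (in order) into dest_path."""
    with open(dest_path, "wb") as dest:
        for part_name in parts:
            with open(part_name, "rb") as src:
                dest.write(src.read())
```

Each piece could then be zipped and downloaded individually; since the pieces are plain byte ranges of the original file, concatenating them in order reproduces the backup exactly.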

Al 


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Structure your ColdFusion code with Fusebox. Get the official book at 
http://www.fusionauthority.com/bkinfo.cfm
FAQ: http://www.thenetprofits.co.uk/coldfusion/faq
Archives: http://www.mail-archive.com/[email protected]/
Unsubscribe: http://www.houseoffusion.com/index.cfm?sidebar=lists
