Dan,
Thanks, I will probably use your code and do something like
this. I appreciate the help.
At 03:58 PM 6/16/2008, you wrote:
I run this file every hour on my server. I run it on the server because it
only takes a couple of seconds to run there, versus pushing the traffic over
the network. It places each table's data in its own file because the file
sizes get rather large. I do not back up the structure hourly because it does
not change. If I need to restore, I pull the previous night's backup and then
restore the data for whichever tables changed during the day.
CONNECT dbname
SET VAR vtablename TEXT, vfilename TEXT
-- delete the previous backup
delete c:\backups\*.dat
-- gets only the table data
DECLARE cursor1 CURSOR FOR SELECT sys_table_name FROM sys_tables +
   WHERE sys_table_type = 'table'
OPEN cursor1
FETCH cursor1 INTO vtablename INDICATOR vi1
WHILE SQLCODE <> 100 THEN
   SET VAR vfilename = ('C:\BACKUPS\' + .vtablename + '.dat')
   OUTPUT &vfilename
   UNLOAD DATA FOR &vtablename
   OUTPUT SCREEN
   FETCH cursor1 INTO vtablename INDICATOR vi1
ENDWHILE
DROP CURSOR cursor1
CLEAR VAR v%
EXIT
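For anyone who wants to prototype the same per-table unload pattern outside
R:BASE, here is a rough Python sketch against SQLite. This is purely an
analogy, not Dan's script: the sqlite3/csv modules, the function name, and the
use of sqlite_master in place of sys_tables are all my assumptions.

```python
# Hypothetical analogue of the R:BASE loop above: enumerate the tables
# in a database catalog and dump each table's rows to its own .dat file.
import csv
import sqlite3

def dump_tables(db_path, backup_dir):
    conn = sqlite3.connect(db_path)
    # sqlite_master plays the role of sys_tables in the R:BASE script
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        # one file per table, mirroring OUTPUT &vfilename / UNLOAD DATA
        with open(f"{backup_dir}/{table}.dat", "w", newline="") as f:
            writer = csv.writer(f)
            # table names come from the catalog, not from user input
            for row in conn.execute(f"SELECT * FROM {table}"):
                writer.writerow(row)
    conn.close()
    return tables
```

As in the R:BASE version, dumping data only (no structure) keeps the hourly
run fast; the schema would come from the nightly full backup.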
Dan Goldberg
________________________________
From: [email protected] [mailto:[EMAIL PROTECTED] On Behalf Of Dan
Sent: Monday, June 16, 2008 12:18 PM
To: RBASE-L Mailing List
Subject: [RBASE-L] - Hourly Backups
After losing a day last week and having to revert to yesterday's
data (only for a while, thank God),
I was wondering if anyone has ideas on hourly backup routines.
I ran an import which went awry, and R:base allowed itself to grow past the
2 GB limit, making the whole database unusable.
Is anyone using a backup procedure that would let you recover up to the
hour?
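One cheap safeguard against the runaway-growth scenario described above is to
check the database file's size before a large import and refuse to proceed
while there is still headroom under the 2 GB limit. A minimal sketch in
Python (the function name and the 100 MB safety margin are my assumptions,
not anything from this thread):

```python
import os

TWO_GB = 2 * 1024**3
MARGIN = 100 * 1024**2  # arbitrary: stop while 100 MB of headroom remains

def safe_to_import(db_file):
    """Return True only if the database file is still comfortably
    under the 2 GB limit, so a runaway import cannot cross it."""
    return os.path.getsize(db_file) < TWO_GB - MARGIN
```

A pre-import check like this would not repair a bad import, but it would have
left the database usable instead of letting it blow past the limit.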
Thanks in advance.
Dan Champion
Service Department Manager
Vredevoogd Heating & Cooling
Grandville, MI. 49418
616-534-8271 x 14