Ferro Backup System <http://www.ferrobackup.com/> is doing the trick for me.

Greetings,
Piotr


2013/5/12 Gerard van Loenhout <[email protected]>

> Bob, why not use a NAS? I have a QNAP in the office with two 3 TB drives.
> Seven DP users in the office and two on VPN / RDP outside the office.
> Every input is mirrored on the second drive.
>
> Regards,
> Gerard van Loenhout
>
>
> 2013/5/12 Bob DeRosier <[email protected]>
>
>>
>>
>> This approach has me concerned about database coherence, since it takes
>> some time to run. I have reports which export data to another application,
>> and those take at least 12 minutes to run when the machine is not doing
>> anything else. Even running a T-log export of the whole thing takes several
>> minutes. I would trust a transaction log file more, though that adds some
>> more complexity to the mix. If users are making changes to the db, when
>> does the transaction log file stop? Also, if the server is caching writes
>> to the transaction log and gets clobbered before closing it, are the files
>> still usable, especially if it dies partway through, before all the tables
>> have been exported?
>>
>> I admit I am not sure how to dynamically assign file names or manually
>> create a transaction log from within a report.
>>
>> At this point, I am leaning towards the Acronis solution. For either
>> solution, I want to write to an external disk (can't have a power spike
>> kill both the primary and backup hardware) and possibly put it on a
>> network-attached drive. Even better would be to mirror that to the cloud
>> in some manner to have it completely offsite. That raises other security
>> and access issues but does address the data loss problem.
>>
>> Bob
>>
>>
>> At 03:16 PM 05/10/2013, Tim Rude wrote:
>>
>> What about writing a report that exports the data to file(s) that could
>> then be imported to rebuild the database in case of a disaster? Depending
>> on how ambitious you feel like being, the data could be exported to
>> multiple files (one for each panel), or better yet, to a synthetic
>> transaction log file with all of the data in one big file. You could have
>> the report dynamically assign the filename(s) based on the current date
>> and time. Then you could have a scheduled task on one of the machines run
>> the report periodically via the command line.
>>
>> That should suffice to keep everyone from having to exit the database
>> every couple of hours, and provide some protection for the data being
>> entered.
>>
>> The downside is the time it takes to create the report to export all of
>> the data, and the need to update the report if the database structure
>> changes. It would be much easier if there were a command-line option to
>> export the entire database to a T-log file (doing the same thing as
>> Shift-F9, A), but there isn't one that I know of.
>>
>> Tim Rude
>>  ----- Original Message -----
>> From: Bob DeRosier <[email protected]>
>> To: [email protected]
>> Sent: Friday, May 10, 2013 12:49 PM
>> Subject: [Dataperf] How to do multiple backups during work day ?
>>
>> Hi All;
>>
>> I have an application whose users would like to change their workflow.
>> One of the ideas would require multiple backups during the day, just to
>> avoid losing work. When I back up DP, I usually have everybody off the
>> system, then copy the files to the backup location and work from there.
>> Obviously, this would be a bit cumbersome with people working on it all
>> day, especially if I want to do this multiple times a day. Any thoughts?
>>
>> If this were running on a VM or a storage area network with a SQL server,
>> I would just schedule a job to quiesce the database, take a snapshot of
>> the files at that time, and go from there. I don't know of a similar
>> command for DP; that is, if there are pending writes or the database is
>> in an incoherent state where the records are out of sync, I don't know
>> how to force them into sync without just throwing everybody out of the
>> system. As it stands, it is running on XP in a low-budget environment:
>> no SAN, no NAS, no Windows server with VSS, no fancy backup appliance,
>> just a non-profit with a small database and a need.
>>
>> thanks for any advice.
>>
>> Bob
>>
>> ------------------------------
>> _______________________________________________
>> Dataperf mailing list
>> [email protected]
>>  http://lists.dataperfect.nl/mailman/listinfo/dataperf
>>
>


-- 
Regards,
Piotr
