No, that is way out of line. A rough estimate for a full backup would be
something like:

  (worst-case access time, ~20 ms) * (number of files)
  + (data size) / (throughput / 2)

(you are transferring from disk A back to disk A, so effective throughput
is halved). With 10,000 files at ~20 ms each plus 10 GB of data, that works
out to roughly 15 minutes for the full backup; 20 GB and twice as many files
would be about 30 minutes. I apply this math to my own file system stats and
get results in that range. The same math also works to find out how long an
incremental backup will take: 10,000 files takes about 3 minutes for my
incrementals.
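The rule of thumb above can be sketched as a few lines of Python. The 20 ms
per-file seek comes from the post; the 30 MB/s sequential throughput figure is
my own assumption, picked because it roughly reproduces the poster's 15- and
30-minute estimates:

```python
# Rough backup-time estimator: per-file seek overhead plus raw transfer time.
SEEK_S = 0.020      # worst-case per-file access time (20 ms, from the post)
DISK_MB_S = 30.0    # assumed sequential throughput in MB/s (not from the post)

def estimate_minutes(num_files, total_gb, same_disk=True):
    """Estimated full-backup time in minutes."""
    # Copying from disk A to disk A halves the effective throughput.
    throughput = DISK_MB_S / 2 if same_disk else DISK_MB_S
    seek_time = num_files * SEEK_S               # seconds spent opening files
    transfer_time = (total_gb * 1024) / throughput  # seconds moving the data
    return (seek_time + transfer_time) / 60

print(round(estimate_minutes(10_000, 10), 1))  # ~15 min for 10,000 files / 10 GB
print(round(estimate_minutes(20_000, 20), 1))  # ~30 min for twice the data
```

This is only a ballpark; real backups also pay for compression, metadata
lookups, and filesystem fragmentation, so treat it as an order-of-magnitude
check rather than a prediction.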
Hope that info helps. Since three days is far beyond any reasonable estimate,
I would also check the top processes and see if something else is eating up
processor time.
On 10/9/07, [EMAIL PROTECTED] <[EMAIL PROTECTED]> wrote:
>
> Hi, when backing up my local host using tar it would sometimes take 3 days
> or more.
> I only know this because of disk usage: the HD LED is constantly on. Is
> this normal?
>
>
> -------------------------------------------------------------------------
> This SF.net email is sponsored by: Splunk Inc.
> Still grepping through log files to find problems? Stop.
> Now Search log events and configuration files using AJAX and a browser.
> Download your FREE copy of Splunk now >> http://get.splunk.com/
> _______________________________________________
> BackupPC-users mailing list
> [email protected]
> https://lists.sourceforge.net/lists/listinfo/backuppc-users
> http://backuppc.sourceforge.net/
>