On Saturday, 15 January 2011 at 15:27:48, Bart Swedrowski wrote:
> On 15 January 2011 14:12, Eric Bollengier wrote:
> > It sounds like a bug: when the FileDaemon is computing the checksum of
> > the file, it updates the Bytes Written counter when it shouldn't.
> >
> > Looks trivial to fix, but I need some time to test the patch
On 15 January 2011 14:12, Eric Bollengier wrote:
> It sounds like a bug: when the FileDaemon is computing the checksum of the
> file, it updates the Bytes Written counter when it shouldn't.
>
> Looks trivial to fix, but I need some time to test the patch
That's interesting. Would you like me t
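If it helps to visualise the bug being described: a minimal Python sketch (hypothetical accounting, not Bacula's actual source) in which the "Bytes Written" counter is also bumped during the checksum read pass, so the FD figure balloons while the bytes that actually reach storage stay small:

```python
import hashlib

def run_backup(files, changed, *, buggy):
    """Toy model: checksum every file, but only send the changed
    ones to storage.  Names and structure are illustrative only."""
    fd_bytes_written = 0   # what the FD would report
    sd_bytes_written = 0   # what actually reaches the SD
    for name, data in files.items():
        hashlib.md5(data).hexdigest()        # checksum pass reads every byte
        if buggy:
            fd_bytes_written += len(data)    # bug: checksum reads counted as writes
        if name in changed:
            sd_bytes_written += len(data)
            if not buggy:
                fd_bytes_written += len(data)  # correct: count only what is sent
    return fd_bytes_written, sd_bytes_written

files = {f"mail{i}": b"x" * 1000 for i in range(100)}
changed = {"mail0"}                             # only one file modified
print(run_backup(files, changed, buggy=True))   # (100000, 1000)
print(run_backup(files, changed, buggy=False))  # (1000, 1000)
```

With 100 checksummed files and only one actually sent, the buggy counter reports 100x the real traffic, the same shape as the 40 GB vs 256 MB figures in this thread.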
Hello,
> Now the bit that is particularly interesting to me is:
>
>   FD Bytes Written: 40,119,463,364 (40.11 GB)
>   SD Bytes Written: 256,785,265 (256.7 MB)
>
> Nothing has been written to the FD. FD was being read during the backup
> time only. And the amount shown as "SD B
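For scale, a quick check of the ratio between the two counters quoted above (plain arithmetic, nothing Bacula-specific):

```python
fd_bytes = 40_119_463_364   # FD Bytes Written (40.11 GB)
sd_bytes = 256_785_265      # SD Bytes Written (256.7 MB)
ratio = fd_bytes / sd_bytes
print(round(ratio))         # the counters are roughly 156x apart
```

A factor of roughly 156 is hard to explain by compression of small mail files alone.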
On 14 January 2011 20:18, Martin Simmons wrote:
> It sounds like you have some large files which compress a lot.
>
Nah, I don't think that is the case. I know what those files are, and they
are mainly small, tiny files like emails and small log files.
Have a look at the output below.
14-Jan 02:38
> On Fri, 14 Jan 2011 09:23:37 +, Bart Swedrowski said:
>
> 2011/1/13 Mark :
> > Have you done a 'list files jobid=' for one of your incrementals?
> > Maybe you have a few really large files that are getting changed every day,
> > and therefore getting backed up each day.
>
> Yeah, I tried that, too.
On 14 January 2011 09:23, Bart Swedrowski wrote:
> Also, it's Bacula 5.0.3-2 re-compiled from sources provided on www.bacula.org.
Sorry - that is Bacula 5.0.3-1 re-compiled from sources on www.bacula.org.
2011/1/13 Mark :
> Have you done a 'list files jobid=' for one of your incrementals?
> Maybe you have a few really large files that are getting changed every day,
> and therefore getting backed up each day.
Yeah, I tried that, too. It's only listing files that got changed/are
new and should be backed up.
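For anyone following along, that check is run from bconsole; the jobid here is the full-backup job quoted elsewhere in the thread, substitute your own:

```
*list files jobid=1089
```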
Another solution... though not quite the best: create a pool and jobs
specific to the PST file, and set the retention on the pool to, say, 7
days. That way you can back up/restore the PST file separately and not
affect the backup of the rest of your system.
Obviously you still have to send over
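A sketch of that layout as bacula-dir.conf fragments; all names and the path are placeholders, and the surrounding Client/Storage/Schedule resources are assumed to exist already:

```
Pool {
  Name = PSTPool
  Pool Type = Backup
  Volume Retention = 7 days   # volumes become recyclable after a week
  Recycle = yes
  AutoPrune = yes
}

FileSet {
  Name = PSTOnly
  Include {
    Options { signature = MD5; compression = GZIP }
    File = "/path/to/outlook.pst"   # placeholder path
  }
}

Job {
  Name = PSTBackup
  Type = Backup
  Level = Full
  Client = client-fd              # placeholder
  FileSet = PSTOnly
  Pool = PSTPool
  Storage = File                  # placeholder
  Messages = Standard
}
```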
On Jan 13, 2011, at 3:44 PM, Lawrence Strydom wrote:
> I understand that something is adding data and logically the backup should
> grow. What I don't understand is why the entire file has to be backed up if
> only a few bytes of data have changed. It is mainly outlook.pst files and
> MSSQL data
Thanks for the clear answer Paul.
Seems like I will have to enable Accurate and buy more disks.
On 13 January 2011 23:27, Paul Mather wrote:
> On Jan 13, 2011, at 3:44 PM, Lawrence Strydom wrote:
>
> > I understand that something is adding data and logically the backup
> should grow. What I do
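For reference, enabling it is one directive in the existing Job resource (the job name below is a placeholder); note that in Accurate mode the Director sends the FD the catalog's file list, which costs memory on large filesets:

```
Job {
  Name = "BackupClient"   # placeholder for the existing job
  Accurate = yes          # compare against the catalog, not just timestamps
  ...
}
```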
2011/1/13 Lawrence Strydom
> Hi And thanks for all the replies so far.
>
> I'm running Bacula 5.0.3 on OpenSuSE 11.3. Self compiled with the following
> configure options:
>
> * --enable-smartalloc --sbindir=/usr/local/bacula/bin
> --sysconfdir=/usr/local/bacula/bin -with-mysql -with-openssl -ena
Sorry, Bacula is not that clever... indeed, it's just checking for files
which changed. It's not able to determine how the file changed, or to back
up just those bits which changed.
---Guy
Sent from my iPad
On 13 Jan 2011, at 20:44, Lawrence Strydom wrote:
> Hi And thanks for all the replies so fa
Hi And thanks for all the replies so far.
I'm running Bacula 5.0.3 on OpenSuSE 11.3. Self compiled with the following
configure options:
* --enable-smartalloc --sbindir=/usr/local/bacula/bin
--sysconfdir=/usr/local/bacula/bin -with-mysql -with-openssl -enable-bat
-sysconfdir=/etc/bacula -enable-t
On Thu, Jan 13, 2011 at 4:42 AM, Bart Swedrowski wrote:
>
> I think what Lawrence meant was that say full backup takes 33GB, as
> the one below.
>
> | 1,089 | tic FS | 2011-01-08 02:05:03 | B | F | 464,798 | 33,390,404,320 | T |
>
> Now, if you do Incremental backup, it'
> First there's something adding data everyday, so that's why there's more and
> more data.
>
I hope you put a limit on the file size or usage duration so that this
volume does not grow until it fills up the disk. Remember, retention
does not work until the volume is marked Full or Used, and for
On 01/13/2011 11:42 AM, Bart Swedrowski wrote:
> 2011/1/12 Kleber Leal
>> Yes. The entire file is backed up again when it gets modified.
>> Incremental backups include all modified files since the last backup (Full,
>> Incremental or Differential). Incremental and differential are file based.
>> If
2011/1/12 Lawrence Strydom :
> This leads me
> to believe that the entire file is being backed up instead of only the
> changed data which is my understanding of a differential backup.
The only program I know that works in that way is rdiff-backup.
It's very efficient in saving space but you do now
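For comparison, a typical rdiff-backup session (paths are placeholders): the first run makes a full mirror, later runs store only reverse deltas, and `-r` restores the state as of a given time:

```
rdiff-backup /home/mail /backup/mail          # mirror + reverse increments
rdiff-backup -r 7D /backup/mail /tmp/restore  # restore as of 7 days ago
```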
2011/1/12 Kleber Leal
> Yes. The entire file is backed up again when it gets modified.
> Incremental backups include all modified files since the last backup (Full,
> Incremental or Differential). Incremental and differential are file based.
> If you have a 100GB file and it was modified, it will
Yes. The entire file is backed up again when it gets modified.
Incremental backups include all modified files since the last backup (Full,
Incremental or Differential). Incremental and differential are file based.
If you have a 100GB file and it was modified, it will be backed up and
will use this s
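The file-based selection described here can be sketched in a few lines of Python (a simplification of what any mtime-driven incremental does, not Bacula's actual code): a file is either newer than the last backup and copied whole, or skipped entirely.

```python
import os
import tempfile

def select_for_incremental(paths, last_backup_time):
    """Return the files an mtime-based incremental would pick up:
    everything modified since the last backup, each one in full."""
    return [p for p in paths if os.path.getmtime(p) > last_backup_time]

# Demo: two files, one "modified" after the last backup.
d = tempfile.mkdtemp()
a = os.path.join(d, "outlook.pst")
b = os.path.join(d, "mail.log")
for p in (a, b):
    with open(p, "wb") as f:
        f.write(b"data")
last_backup = 1_000_000_000           # pretend epoch of the last backup
os.utime(a, (last_backup + 60,) * 2)  # touched a minute after the backup
os.utime(b, (last_backup - 60,) * 2)  # untouched since before it
print(select_for_incremental([a, b], last_backup))  # only outlook.pst
```

Even a 1-byte change to the PST would put the whole file on the "send" list.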
Hi list.
My understanding of an incremental backup is that only changed data is
backed up. I use Bacula for backups to a disk array and configured it to do
full backups once a month and daily incremental backups.
I have been noticing that the incremental backups seem to be fairly large
though - no