Hello Eddie,
On Thu, Mar 24, 2016 at 7:13 PM, Eddie Appel wrote:
> Hello List,
>
> Forgive the long post, but I have a situation here that requires some
> help...
>
> My current setup...
> -- NetApp filer containing ~35TB of data to be backed up to tape
> -- Quantum i40 w/
Yes, her client compilation confirmed compression for both gzip and LZO, so now I
would think there is an issue here. I also have one client out of around 40
that is not compressing. Dir and FD were compiled from source, version 7.4.0.
On March 25, 2016 5:29:20 AM GMT+07:00, "Ana Emília M. Arruda" wrote:
Hello List,
Forgive the long post, but I have a situation here that requires some
help...
My current setup...
-- NetApp filer containing ~35TB of data to be backed up to tape
-- Quantum i40 w/ 2 LTO-5 drives
-- Quantum SL3 w/ 1 LTO-5 drive
-- HP DL-380G5 with 16GB RAM and 3 x spool hdd
-- CentOS
Hello Ankush,
As Heitor said in a previous post, zlib development libraries must also be
installed on clients. Have you checked this? Compression is working on your
bacula server (and client) for your catalog backup because the zlib
development libraries are installed on the bacula director host.
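A quick way to act on this advice is to check, on the affected client, whether the bacula-fd binary was actually built against zlib. This is a hedged sketch, not from the thread; the binary name `bacula-fd` and its presence in `PATH` are assumptions, so adjust the path for your install:

```shell
# Sketch: check whether this host's bacula-fd is linked against zlib.
# If it is not, software compression silently does nothing on that client.
FD_BIN=$(command -v bacula-fd || true)
if [ -z "$FD_BIN" ]; then
    ZLIB_STATUS="bacula-fd not found in PATH"
elif ldd "$FD_BIN" | grep -q 'libz\.so'; then
    ZLIB_STATUS="zlib linked: software compression should work"
else
    ZLIB_STATUS="zlib NOT linked: install zlib-devel and rebuild the client"
fi
echo "$ZLIB_STATUS"
```

If zlib is missing, installing the zlib development package (zlib-devel on CentOS) and rebuilding the FD from source is the usual fix.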
Hello Randy,
The Bacula server (7.x) is installed from the Linux RPM package.
The catalog backup on the Bacula server works with the compression = GZIP option.
Job log:
Software Compression: 84.9% 6.6:1
But the clients do not compress.
Job log:
Software Compression: None
Thank you,
Ankush More
From: Randy Katz
How do I recover from this though? There are still several other TB that
need to be saved to tape from the client. On disk I had 25GB volumes
going all the way up to 900-something.
Dan
On 03/24/2016 09:02 AM, Wanderlei Huttel wrote:
Hello Daniel
It looks like you have an error in your volume
Hello,
Complete FileSet:
# List of files to be backed up
FileSet {
  Name = "chrome02 Set"
  Include {
    Options {
      signature = MD5
      compression = GZIP
    }
    # Exclude Options must come before the File directive they apply to
    Options {
      exclude = yes
      wildfile = "*.data*"
    }
    File = /rman
  }
}
#
# If you backup the root directory, the
Hello Daniel
It looks you have an error in your volume File0586.
14-Mar 12:52 bacula-sd JobId 90: Error: block_util.c:352 Volume data error
at 0:257145043!
Block checksum mismatch in block=29547158 len=64512: calc=dad7face
blk=34eff4a7
Best Regards
*Wanderlei Hüttel*
http://www.huttel.com.br
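Daniel's question about how to recover is not answered directly in the thread. One possible path, sketched here as an assumption rather than advice from the list: mark the damaged volume so Bacula stops selecting it, rerun the failed jobs onto a fresh volume, and use bscan to rebuild catalog entries from whatever blocks on the bad volume remain readable (the device name "FileStorage" and the config path below are assumptions; use your own):

```
# In bconsole: take the damaged volume out of rotation
update volume=File0586 volstatus=Error

# From the shell on the SD host: salvage readable records
# from the volume back into the catalog
bscan -v -s -V File0586 -c /etc/bacula/bacula-sd.conf FileStorage
```

Anything past the corrupt block may still be unrecoverable, so re-running the backup of the remaining TB from the client is likely unavoidable.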
Hi Randy,
The files being backed up are database files, which have a very high compression ratio.
I have checked the same files with the Linux command 'gzip' on the client, and it
works fine with a good compression ratio.
But Bacula does not compress them.
Thank you,
Ankush More
From: Randy Katz [mailto:rk...@simplicityhosting.com]
Sent:
Any insights on this? I would love to try and move to Bacula, but I
cannot until this is sorted out. It is acting as if the migration job
completed successfully, but large amounts of the data have not been
transferred to tape. In addition, new migration jobs or backup jobs from
the client do
> Hello Alex
> It's not possible.
> bacula-dir and bacula-sd must be at a higher version than bacula-fd.
Hello Wanderlei & Alex: this is what is recommended: [dir version == sd version >=
fd version]
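Before upgrading, the [dir == sd >= fd] rule can be checked from bconsole: `version` prints the Director's version, and the status commands report the versions the SD and FD are running. A sketch (the storage and client names below are hypothetical; use the names from your own configuration):

```
*version
*status storage=File
*status client=chrome02-fd
```

If the FD reports a newer version than the Director or SD, downgrade the client or upgrade the server side first.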
> Best Regards
> Wanderlei Hüttel
> Sent from Motorola Moto X2
> On Mar 24, 2016 12:47 AM,
Hello Alex
It's not possible.
bacula-dir and bacula-sd must be at a higher version than bacula-fd.
Best Regards
Wanderlei Hüttel
Sent from Motorola Moto X2
On Mar 24, 2016 12:47 AM, "Alex Brandt"
wrote:
> Hey,
>
> I'm assuming the answer will be to stop mixing
Hi Ankush, are you sure the files you are backing up can be compressed?
You might want to try a separate backup on the same client of something like
/usr/local /etc just to test; these should definitely compress. Sometimes
video, audio, and other dense files do not compress.
On 3/23/2016 11:44 AM,
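Randy's suggestion can also be sanity-checked outside Bacula. A minimal sketch (file paths are arbitrary): compare how gzip handles highly repetitive data versus already-dense random data, which is the same distinction that makes some backups show "Software Compression: None" legitimately.

```shell
# Compressible input: repetitive text
yes "abcdefghij" | head -c 1000000 > /tmp/compressible.dat
# Incompressible input: random bytes
head -c 1000000 /dev/urandom > /tmp/dense.dat

for f in /tmp/compressible.dat /tmp/dense.dat; do
    orig=$(wc -c < "$f")
    comp=$(gzip -c "$f" | wc -c)
    echo "$f: $orig -> $comp bytes"
done
```

The repetitive file should shrink dramatically, while the random file stays roughly the same size (or grows slightly); database files, as in Ankush's case, normally fall on the compressible side.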