Hello,

Thanks! But I already have that, and it's exactly what I had in mind:
compress with pigz before the backup and copy the file to several sites
(Bareos, another server...).
My problem is that I also want to back up the raw file in Bareos, to be
able to recover it directly without importing a dump: the dump takes too
long to restore, while the raw file takes less than 30 minutes.
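
In case it helps anyone reading along, the pigz step might look something
like this (a sketch only; the paths are invented here, and pigz's --keep
flag is what leaves the raw file in place for the direct Bareos backup):

```shell
#!/bin/sh
set -e
# Sketch of a pre-backup compression step. The dump path is made up for
# illustration; a real ClientRunBeforeJob would point at the actual dump.
DUMP=$(mktemp -d)/dump.sql
printf 'stand-in for the 300GB dump\n' > "$DUMP"
# pigz is a parallel drop-in for gzip; fall back to gzip if it is missing.
if command -v pigz >/dev/null 2>&1; then GZ=pigz; else GZ=gzip; fi
"$GZ" --keep --force "$DUMP"    # writes dump.sql.gz, keeps dump.sql
# The raw file stays on disk, so Bareos can still back it up directly for
# a fast restore, while the .gz copy can be shipped to the other sites.
ls "$DUMP" "$DUMP.gz"
```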

Greetings!

On Fri, Sep 21, 2018 at 20:51, Eric Browning <
[email protected]> wrote:

> Why not use a task to back up the database to the local drive first, then
> a script to compress it before backing it up with Bareos, and then delete
> the backup on the local drive?
>
> On Fri, Sep 21, 2018 at 11:02 AM Daniel Carrasco <[email protected]>
> wrote:
>
>> Thanks for your response.
>>
>> Yeah, that's what I thought for the SQL backup file on that machine, but
>> I have an MSSQL data file that is backed up using VSS because it is in
>> use, and that option is not valid there.
>>
>> I also think that not using the full power is good, so as not to slow
>> down the other services, but having an option to select how many cores to
>> use would also be a good idea. For example, this server is below 50% CPU
>> most of the time, and at night it just sits at under 5%, so it could use
>> 40% during the day and 90% at night without problems, and would run at
>> roughly 10x and 22x the single-core speed.
>>
>> Greetings.
>>
>> On Fri, Sep 21, 2018 at 17:06, Douglas K. Rand <[email protected]> wrote:
>>
>>> On 09/21/18 06:20, [email protected] wrote:
>>> > Is there any compression method with multithreading support?
>>> >
>>> > I want to back up an SQL file of 300+ GB, and the problem is that
>>> > GZIP and LZ4HC, at least, are single-threaded.
>>>
>>> You can hack it up by having a ClientRunBeforeJob script that uses
>>> pbzip2 (or another tool) to compress the file before the backup. You may
>>> want a ClientRunAfterJob script that then deletes the file after the
>>> backup.
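
For what it's worth, such hooks could be wired into the Job resource
roughly like this (a sketch only; the job name and script paths are
invented):

```
Job {
  Name = "sql-dump-compressed"
  # (Client, FileSet, Schedule, Storage, Pool as usual)
  # Compress the dump with pbzip2/pigz before the files are read
  ClientRunBeforeJob = "/usr/local/sbin/compress-dump.sh"
  # Remove the compressed copy once the backup has finished
  ClientRunAfterJob = "/usr/local/sbin/cleanup-dump.sh"
}
```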
>>>
>>> We do something kind of like this, and we built a specific backup job
>>> just for the thing we needed to back up. So we have the general backup
>>> for that system, which specifically ignores the directory, and then
>>> another job comes along and does just that one directory.
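
That split could look roughly like this in the FileSet resources (the
names and paths are invented for illustration):

```
# General system backup that skips the dump directory
FileSet {
  Name = "system-no-dumps"
  Include {
    Options { Signature = MD5 }
    File = "/"
  }
  Exclude {
    File = "/var/backups/db"
  }
}

# Dedicated FileSet used by the job that handles only that directory
FileSet {
  Name = "dumps-only"
  Include {
    Options { Signature = MD5 }
    File = "/var/backups/db"
  }
}
```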
>>>
>>> Check out the Catalog backup in the Bareos docs; it has a lot of the
>>> same features.
>>>
>>> I personally like that the Bareos FD only uses one core to do
>>> compression.
>>> Yes, it is slower, but it controls the impact that backups have on
>>> systems.
>>>
>>> --
>>> You received this message because you are subscribed to a topic in the
>>> Google Groups "bareos-users" group.
>>> To unsubscribe from this topic, visit
>>> https://groups.google.com/d/topic/bareos-users/ihBUmKx5eAE/unsubscribe.
>>> To unsubscribe from this group and all its topics, send an email to
>>> [email protected].
>>> To post to this group, send email to [email protected].
>>> For more options, visit https://groups.google.com/d/optout.
>>>
>>
>>
>
>
> --
> Eric Browning
> Systems Administrator
> 801-984-7623
>
> Skaggs Catholic Center
> Juan Diego Catholic High School
> Saint John the Baptist Middle
> Saint John the Baptist Elementary
>
> Twitter: @SCCMrB
>