Re: [BackupPC-users] Multiple CPU usage for compression?

2019-10-02 Thread Adam Goryachev



On 3/10/19 5:26 am, Daniel Berteaud wrote:

> On 2 Oct 19, at 18:51, p2k-...@roosoft.ltd.uk wrote:
>
>> On 02/10/2019 15:46, Daniel Berteaud wrote:
>>> On 1 Oct 19, at 10:51, p2k-...@roosoft.ltd.uk wrote:
>>>
>>>> Hmmm, I am not so sure about that, because it appears the time it
>>>> takes to compress files also slows down their transfer. I was getting
>>>> around 6Mb/s from a server on the same switch as the backup machine.
>>>> One CPU out of 16 was pegged at 100% under compression.
>>>
>>> How do you know compression is the bottleneck?
>>
>> I happened to be watching htop at the time. I was surprised to see only
>> one core pegged at 100%.
>
> That doesn't mean this process is busy only doing compression (it might
> be, but it could be doing something else).
>
>>>> Surely it would be trivial to replace gzip with pigz and bzip2 with
>>>> pbzip2?
>>>
>>> BackupPC does not use an external binary to compress data, so no, it
>>> wouldn't be as trivial as s/gzip/pigz/.
>>
>> Oh? Then why is there a config variable for the gzip path? What is it
>> used for if not compression?
>>
>> Curious.
>
> It's for compression of archives, not pooled files.
>
> ++



You could always disable compression, and then see if it solves your CPU 
issue
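
For reference, that's a single setting in config.pl (a minimal sketch;
note that already-pooled files stay compressed until they expire):

    # 0 disables compression for new backups; 1..9 are zlib levels
    # (3 is the commonly recommended value).
    $Conf{CompressLevel} = 0;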


Regards,
Adam

-- 
Adam Goryachev
Website Managers
www.websitemanagers.com.au




Re: [BackupPC-users] Multiple CPU usage for compression?

2019-10-02 Thread Daniel Berteaud
On 2 Oct 19, at 18:51, p2k-...@roosoft.ltd.uk wrote:

> On 02/10/2019 15:46, Daniel Berteaud wrote:
>> On 1 Oct 19, at 10:51, p2k-...@roosoft.ltd.uk wrote:
>>
>>> Hmmm, I am not so sure about that, because it appears the time it
>>> takes to compress files also slows down their transfer. I was getting
>>> around 6Mb/s from a server on the same switch as the backup machine.
>>> One CPU out of 16 was pegged at 100% under compression.
>>
>> How do you know compression is the bottleneck?
>
> I happened to be watching htop at the time. I was surprised to see only
> one core pegged at 100%.

That doesn't mean this process is busy only doing compression (it might
be, but it could be doing something else).

>>> Surely it would be trivial to replace gzip with pigz and bzip2 with
>>> pbzip2?
>>
>> BackupPC does not use an external binary to compress data, so no, it
>> wouldn't be as trivial as s/gzip/pigz/.
>
> Oh? Then why is there a config variable for the gzip path? What is it
> used for if not compression?
>
> Curious.

It's for compression of archives, not pooled files.
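
If you do want a parallel compressor somewhere, the archive path is the
one place an external binary is configurable. A hedged sketch, assuming
pigz is installed at /usr/bin/pigz:

    # Archive-host settings in config.pl. Archives are written through
    # an external compressor, so pointing the gzip path at pigz should
    # parallelise that step (untested sketch; pool files are unaffected).
    $Conf{ArchiveComp} = 'gzip';
    $Conf{GzipPath}    = '/usr/bin/pigz';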

++

-- 
Daniel Berteaud
FIREWALL-SERVICES SAS, La sécurité des réseaux
Société de Services en Logiciels Libres
Tél : +33.5 56 64 15 32
Matrix: @dani:fws.fr
https://www.firewall-services.com





Re: [BackupPC-users] Multiple CPU usage for compression?

2019-10-02 Thread Daniel Berteaud

On 1 Oct 19, at 10:51, p2k-...@roosoft.ltd.uk wrote:

> Hmmm, I am not so sure about that, because it appears the time it takes
> to compress files also slows down their transfer. I was getting around
> 6Mb/s from a server on the same switch as the backup machine. One CPU
> out of 16 was pegged at 100% under compression.

How do you know compression is the bottleneck?

> Surely it would be trivial to replace gzip with pigz and bzip2 with
> pbzip2?

BackupPC does not use an external binary to compress data, so no, it
wouldn't be as trivial as s/gzip/pigz/.
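
(Pool files are compressed in-process through Perl's zlib bindings,
roughly along these lines. A conceptual sketch only, not BackupPC's
actual code:)

    use strict;
    use warnings;
    use Compress::Zlib;

    # Deflate a buffer in-process, roughly as BackupPC's file layer does
    # via Compress::Zlib. No gzip/pigz subprocess is involved, which is
    # why swapping binaries can't parallelise pool compression.
    my ($d, $initStatus) = deflateInit(-Level => 3);
    die "deflateInit failed\n" unless $d;

    my ($out, $st1) = $d->deflate("file contents here");
    die "deflate failed\n" unless $st1 == Z_OK;
    my ($tail, $st2) = $d->flush();
    my $compressed = $out . $tail;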

-- 
Daniel Berteaud
FIREWALL-SERVICES SAS, La sécurité des réseaux
Société de Services en Logiciels Libres
Tél : +33.5 56 64 15 32
Matrix: @dani:fws.fr
https://www.firewall-services.com





Re: [BackupPC-users] Multiple CPU usage for compression?

2019-10-01 Thread p2k-dev
Hmmm, I am not so sure about that, because it appears the time it takes
to compress files also slows down their transfer. I was getting around
6Mb/s from a server on the same switch as the backup machine. One CPU
out of 16 was pegged at 100% under compression. Surely it would be
trivial to replace gzip with pigz and bzip2 with pbzip2? On systems with
multiple cores you would see an immediate benefit, and single-core
instances would be unaffected.

Either way, I was surprised to see one CPU pegged for so long, and I feel
that not optimising for available resources is arguably a bug.
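
To illustrate the claimed benefit, a rough timing sketch (assuming both
binaries are installed and /tmp/sample.dat is a suitably large
hypothetical test file):

    use strict;
    use warnings;
    use Time::HiRes qw(time);

    # Compare single-threaded gzip against parallel pigz on one file;
    # pigz should win on a multi-core box. -k keeps the input, -f
    # overwrites any previous .gz output.
    for my $cmd ('gzip -kf /tmp/sample.dat', 'pigz -kf /tmp/sample.dat') {
        my $t0 = time;
        system($cmd) == 0 or die "failed: $cmd\n";
        printf "%-30s %.2fs\n", $cmd, time - $t0;
    }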


Thanks.


On 01/10/2019 04:58, Craig Barratt via BackupPC-users wrote:
> Each rsync backup has two processes on the backup server, but only one
> will be doing compression.  So, yes, compression for a single backup
> is single-threaded (as rsync is).
>
> However, the backup server usually runs multiple backups
> (configurable), and, in the steady state, the amount of compression
> required isn't very large: only new files not already in the pool need
> to be compressed.  It's not likely compression is a bottleneck.
>
> Craig
>
> On Mon, Sep 30, 2019 at 5:33 PM wrote:
>
> Hey guys,
>
>
> Is there such a setting, or is BackupPC genuinely single-threaded when
> it comes to compression? I ask because it seems a little silly these
> days, when pretty much all CPUs are multicore and all servers are
> multicore and multi-CPU.
>
>
> So is there a setting I am missing? I scanned the documentation and it
> does not really talk about how to improve compression speeds. That
> seems to be a bottleneck to me.
>
>
> Thanks.
>
>
> -- 
> ==
>
>
> Don Alexander.
>
>
>
>






Re: [BackupPC-users] Multiple CPU usage for compression?

2019-09-30 Thread Craig Barratt via BackupPC-users
Each rsync backup has two processes on the backup server, but only one will
be doing compression.  So, yes, compression for a single backup is
single-threaded (as rsync is).

However, the backup server usually runs multiple backups (configurable),
and, in the steady state, the amount of compression required isn't very
large: only new files not already in the pool need to be compressed.  It's
not likely compression is a bottleneck.
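
(The pool check happens before any compression: a file whose content
digest already exists in the pool is simply linked. A conceptual sketch,
not BackupPC's actual code; $pool_dir is a hypothetical pool root:)

    use strict;
    use warnings;
    use Digest::MD5 qw(md5_hex);

    # Return the pool path for some content, compressing and writing it
    # only when no pool entry with the same digest exists yet.
    sub store_in_pool {
        my ($pool_dir, $contents) = @_;
        my $path = "$pool_dir/" . md5_hex($contents);
        return $path if -e $path;   # already pooled: link, don't compress
        # ...only genuinely new content would be compressed here...
        return $path;
    }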

Craig

On Mon, Sep 30, 2019 at 5:33 PM  wrote:

> Hey guys,
>
>
> Is there such a setting, or is BackupPC genuinely single-threaded when
> it comes to compression? I ask because it seems a little silly these
> days, when pretty much all CPUs are multicore and all servers are
> multicore and multi-CPU.
>
>
> So is there a setting I am missing? I scanned the documentation and it
> does not really talk about how to improve compression speeds. That
> seems to be a bottleneck to me.
>
>
> Thanks.
>
>
> --
> ==
>
>
> Don Alexander.
>
>
>
>
___
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
List:    https://lists.sourceforge.net/lists/listinfo/backuppc-users
Wiki:    http://backuppc.wiki.sourceforge.net
Project: http://backuppc.sourceforge.net/