Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2012-01-12 Thread Jean Spirat
Cutting the backup into 4 parts did the trick; I also moved the part that took the longest over to tar. Curiously, this is not the part with the most files but the part with the most directories that takes so long to back up :) Anyway, the 8 million files are backed up now. Thanks for your help. Regards,
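PS: for anyone hitting the same wall, this is roughly what the 4-way split looks like in the per-host config (the paths are invented; the real split depends on your application's layout):

# hypothetical sketch: the single huge share becomes four smaller
# ones, so each dump walks a shallower tree and the slowest subtree
# can be isolated
cat >> /etc/backuppc/webserver.pl <<'EOF'
$Conf{XferMethod}   = 'tar';
$Conf{TarShareName} = [ '/var/www/game/maps',
                        '/var/www/game/tiles',
                        '/var/www/game/cache',
                        '/var/www/game/misc' ];
EOF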

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-19 Thread gagablubber
You could transfer the tars to the BackupPC host, not into the pool but into a temp directory, and unpack them there, all via a pre-backup script. Then BackupPC steps in and creates a local backup of these temporary files, so you still get the pooling. In the post-backup script you remove the temp files.
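A rough sketch of that staging idea, wired to BackupPC's real DumpPreUserCmd/DumpPostUserCmd hooks (the hooks exist; host name and paths here are made up):

#!/bin/sh
# pre-backup hook: pull one tar stream from the web server and
# unpack it into a local staging tree; BackupPC then dumps the
# staging tree as an ordinary local share, so every file still
# lands in the pool individually
STAGING=/var/tmp/backuppc-staging
mkdir -p "$STAGING"
ssh backup@webserver 'tar -cf - -C /var/www/game .' \
    | tar -xf - -C "$STAGING"

The matching post-backup hook would just rm -rf the staging tree.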

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-19 Thread Les Mikesell
On Mon, Dec 19, 2011 at 2:27 AM, gagablub...@vollbio.de wrote: You could transfer the tars to the BackupPC host, not into the pool but into a temp directory, and unpack them there, all via a pre-backup script. Then BackupPC steps in and creates a local backup of these temporary files, so you get

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-19 Thread Tim Fletcher
On Mon, 2011-12-19 at 12:32 -0600, Les Mikesell wrote: On Mon, Dec 19, 2011 at 12:04 PM, Jean Spirat jean.spi...@squirk.org wrote: I directly mount the NFS share on the BackupPC server, so no need for rsyncd here; this is like a local backup, with the NFS overhead of course. The whole point

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-19 Thread Pedro M. S. Oliveira
Sorry to take so long to reply. Yes, it saves me a lot of time; let me explain. Although I have a fast SAN and servers, the time for fetching lots of small files is high: the maximum bandwidth I could get was about 5MB/s. Increasing concurrency, I can get about 20-40MB/s depending on what I'm backing up at
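One way to get that concurrency out of BackupPC is several pseudo-hosts that all alias the same real server, each covering a slice of the store; $Conf{ClientNameAlias} and $Conf{MaxBackups} are real options, while the host names and paths below are invented:

# pseudo-host "mail-a" backs up one slice of the mail store
cat >> /etc/backuppc/mail-a.pl <<'EOF'
$Conf{ClientNameAlias} = 'mailserver';
$Conf{XferMethod}      = 'rsync';
$Conf{RsyncShareName}  = [ '/var/vmail/domains/a-m' ];
EOF
# ...repeat for mail-b etc., then raise the parallel dump cap
cat >> /etc/backuppc/config.pl <<'EOF'
$Conf{MaxBackups} = 4;
EOF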

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-18 Thread gagablub...@vollbio.de
Why don't you ask the developers to write a script that creates one or a few tar files out of this massive number of files? The execution of that script could be triggered via HTTP request (with authentication). On the BackupPC side you could call this script via the pre-backup command before
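The BackupPC end of that could be as small as an authenticated curl call in the pre-backup hook; the URL, credentials, and the make-tars endpoint are pure invention here:

#!/bin/sh
# DumpPreUserCmd sketch: ask the web app to bundle its small files
# into a few tars before the dump starts; curl -f makes HTTP errors
# fatal, so a failed bundling step aborts the backup
curl -fsS -u backup:SECRET https://webserver/cgi-bin/make-tars \
    || exit 1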

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-18 Thread Pedro M. S. Oliveira
You may try to use rsyncd directly on the server; this may speed things up. Another option is to split the large backup into several smaller ones. I have an email cluster with 8TB and millions of small files (I'm using dovecot); there's also a SAN involved. In order to use all the bandwidth
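A minimal rsyncd export on the web server might look like this (module name and path assumed); BackupPC then talks to it with XferMethod 'rsyncd' instead of going through the filesystem:

# on the web server: export the data read-only and start the daemon
cat > /etc/rsyncd.conf <<'EOF'
[gamedata]
    path = /var/www/game
    read only = yes
EOF
rsync --daemon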

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-18 Thread Timothy J Massey
I'd rather deal with a few tar files, too, but you'll lose pooling... unless the script that makes the tar files is intelligent, in which case BackupPC is somewhat overkill. Basically, your choices are poor no matter what. Garbage in, garbage out, and all that... Timothy J. Massey Out of the Box

[BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Jean Spirat
Hi, I use BackupPC to back up a webserver. The issue is that the application on it creates thousands of little files, used by a game to build maps and various things. The issue is that we are now at 100GB of data and 8.030.000 files, so the backups take 48 hours and more (it does not help that the files

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Tim Fletcher
On Fri, 2011-12-16 at 10:42 +0100, Jean Spirat wrote: Hi, I use BackupPC to back up a webserver. The issue is that the application on it creates thousands of little files, used by a game to build maps and various things. The issue is that we are now at 100GB of data and 8.030.000

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Jean Spirat
I would suggest you try the following: move to tar over ssh on the remote webserver. The first full backup might well take a long time, but the following ones should be faster. tar+ssh backups, however, use more bandwidth, but as you are already using NFS I am assuming you are on a local
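To see whether the switch is worth it before touching the BackupPC config, you can time a raw tar-over-ssh stream by hand (host and path assumed):

# roughly what BackupPC's tar XferMethod does: one sequential tar
# stream over ssh instead of millions of per-file NFS round trips;
# the wall-clock time here approximates a full dump
time ssh root@webserver 'tar -cf - -C /var/www/game .' >/dev/null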

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Tim Fletcher
On Fri, 2011-12-16 at 11:49 +0100, Jean Spirat wrote: I would suggest you try the following: tar+ssh backups, however, use more bandwidth, but as you are already using NFS I am assuming you are on a local network of some sort. To my understanding rsync has always seemed to be the most

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Les Mikesell
On Fri, Dec 16, 2011 at 4:49 AM, Jean Spirat jean.spi...@squirk.org wrote: Hmm, I cannot directly use the FS; I have no access to the NFS server, which is on the hosting company's side. I just have access to the webserver that uses the NFS partition to store its content. Right now I also mount the

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Steve
On Fri, Dec 16, 2011 at 4:42 AM, Jean Spirat jean.spi...@squirk.org wrote: The issue is that we are now at 100GB of data and 8.030.000 files, so the backups take 48 hours and more (it does not help that the files are on an NFS share). I think I have come to the point where file backup is at its limit. What about a

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Tim Fletcher
On Fri, 2011-12-16 at 07:33 -0600, Les Mikesell wrote: On Fri, Dec 16, 2011 at 4:49 AM, Jean Spirat jean.spi...@squirk.org wrote: To my understanding rsync has always seemed to be the most efficient of the two, but I never challenged this fact ;p Rsync working natively is very efficient,

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Arnold Krille
Hi, On Friday 16 December 2011 10:42:00 Jean Spirat wrote: I use BackupPC to back up a webserver. The issue is that the application on it creates thousands of little files, used by a game to build maps and various things. The issue is that we are now at 100GB of data and 8.030.000 files

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Jean Spirat
Excuse my off-topic-ness, but with that many small files I kind of expect a filesystem to reach certain limits. Why is that webapp written to use many little files? Why not a database where all that stuff is in blobs? That would be easier to maintain and easier to back up. Have fun,
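For what it's worth, the blob idea boils down to something like the sketch below (schema entirely invented): one SQLite file holds the map fragments and backs up as a single large file instead of millions of tiny ones:

# one database file instead of millions of tile files
sqlite3 /var/www/game/maps.db '
  CREATE TABLE IF NOT EXISTS tile (
    x    INTEGER,
    y    INTEGER,
    data BLOB,
    PRIMARY KEY (x, y)
  );'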

Re: [BackupPC-users] 8.030.000, Too much files to backup ?

2011-12-16 Thread Les Mikesell
On Fri, Dec 16, 2011 at 9:00 AM, Jean Spirat jean.spi...@squirk.org wrote: Excuse my off-topic-ness, but with that many small files I kind of expect a filesystem to reach certain limits. Why is that webapp written to use many little files? Why not a database where all that stuff is in