* On Sunday 25 March 2007 15:18, Krsnendu dasa [EMAIL PROTECTED] wrote:
It is hard to find files in the backups by browsing. If there were a
search feature that allowed you to search one or more computers'
backups, that would be great.
I know this isn't a real solution per se, but if you're in a
I am using backuppc but it is extremely slow. I narrowed it down to a disk
bottleneck (ad2 being the backup disk). I also checked the archives of
the mailing list, and it is mentioned that this happens because of
too many hard links.
Disks    ad0     ad2
KB/t     4.00    25.50
tps      175
MB/s
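The iostat columns above relate as throughput = KB/t × tps. A quick sanity check on ad0's figures (ad2's tps and both MB/s values are cut off in the output, so only ad0 can be checked) shows why this workload feels slow:

```python
# Relationship between the iostat columns: MB/s = KB/t * tps / 1024.
# ad0's numbers are taken from the output above; ad2's tps was cut off.
kb_per_transfer = 4.00   # KB/t for ad0
tps = 175                # transfers per second for ad0

mb_per_sec = kb_per_transfer * tps / 1024
print(f"ad0 throughput ~ {mb_per_sec:.2f} MB/s")  # ~0.68 MB/s
```

Tiny transfers at a modest transfer rate mean the disk spends its time seeking, which is exactly the pattern a hard-link-heavy pool produces.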
On 3/26/07, Evren Yurtesen [EMAIL PROTECTED] wrote:
John Pettitt wrote:
The basic problem is that backuppc is using the file system as a database -
specifically, using the hard link capability to store multiple references
to an object and the link count to manage garbage collection. Many
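The scheme John describes can be sketched like this. This is a minimal illustration of the idea, not BackupPC's actual code; BackupPC's real pool layout, hashing, and compression differ, and the file names here are hypothetical:

```python
import hashlib
import os

def store(pool_dir, backup_path, data):
    """Write `data` at backup_path, pooling identical content via hard links."""
    digest = hashlib.md5(data).hexdigest()   # pool entries are keyed by content hash
    pool_file = os.path.join(pool_dir, digest)
    if not os.path.exists(pool_file):
        with open(pool_file, "wb") as f:     # first copy: create the pool entry
            f.write(data)
    os.link(pool_file, backup_path)          # every backup is just another hard link

def collect_garbage(pool_dir):
    """Remove pool entries no backup references: link count has dropped to 1."""
    for name in os.listdir(pool_dir):
        path = os.path.join(pool_dir, name)
        if os.stat(path).st_nlink == 1:      # only the pool itself still links to it
            os.remove(path)
```

Note that garbage collection means a stat() on every pool entry, and a restore or nightly pass touches millions of inodes. That per-inode seek cost is the slowness being complained about in this thread.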
Evren Yurtesen wrote:
I know that the bottleneck is the disk. I am using a single IDE disk to
take the backups, with only 4 machines and 2 backups running at a time (if
I am remembering correctly).
I see that it is possible to use RAID to solve this problem to some
extent, but the real solution
Original Message
Subject: Re:[BackupPC-users] very slow backup speed
From: Evren Yurtesen [EMAIL PROTECTED]
To: David Rees [EMAIL PROTECTED]
Date: 26.03.2007 23:37
David Rees wrote:
It is true that BackupPC is great, however backuppc is slow because it
is trying to make
Hello
I've just installed BackupPC and love it. It's really great, and
great to see an open source application that competes with
similar enterprise-level products.
I only need to back up Linux servers with rsync over SSH, and have set
up a test deployment of BackupPC as described in the docs.
Evren Yurtesen wrote:
If your filesystem isn't a good place to store files, there is not much
an application can do about it. Perhaps it would help if you mentioned
what kind of scale you are attempting with what server hardware. I know
there are some people on the list handling what I
John Pettitt wrote:
Changing backuppc would be decidedly non-trivial - eyeballing it, hacking
in a real database to store the relationship between pool and individual
files would touch just about every part of the system.
And there's not much reason to think that a database could do
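For contrast, the "real database" being debated would replace link counts with explicit reference rows. Here is a purely hypothetical sketch in SQLite, not anything BackupPC ships or that the posters proposed concretely; table and column names are invented:

```python
import sqlite3

# Hypothetical schema: one row per pooled object, one row per file in a backup.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE pool (hash TEXT PRIMARY KEY, size INTEGER);
    CREATE TABLE backup_files (
        host TEXT, backup_num INTEGER, path TEXT,
        hash TEXT REFERENCES pool(hash)
    );
""")

# Garbage collection becomes a query instead of a filesystem walk:
conn.execute("INSERT INTO pool VALUES ('abc123', 42)")
orphans = conn.execute("""
    SELECT hash FROM pool
    WHERE hash NOT IN (SELECT hash FROM backup_files)
""").fetchall()
print(orphans)  # [('abc123',)]
```

The trade-off is the one named above: the query is cheap, but every place the current code relies on hard links and link counts would have to be rewritten to maintain these rows.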
On 3/26/07, Bernhard Ott [EMAIL PROTECTED] wrote:
It is true that BackupPC is great; however, backuppc is slow because it
is trying to store a single instance of each file to save
space. Now we are wasting (perhaps even more?) space to make it fast
when we do RAID1.
You can't be
Let's start at the beginning:
On 3/26/07, Evren Yurtesen [EMAIL PROTECTED] wrote:
On 3/20/07, Henrik Genssen [EMAIL PROTECTED] wrote:
are there any issues upgrading from 2.1.2.pl1?
None that I know of. The upgrade process is pretty smooth (though I
opted to convert to the new configuration file layout at the same time,
which does take a bit of tweaking).
is 3.0 yet apt-getable?
Don't know, I always install from source.
-Dave
Yes, it is. It is only in unstable though, so you'll need to specify that
apt-get use the unstable repositories to get version 3.0.
Peace,
Jim
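If you go that route, the usual approach (assuming a Debian system; the mirror URL and pin priority below are illustrative, not from this thread) is to add unstable to your sources but pin it low, so only explicitly requested packages come from it:

```
# /etc/apt/sources.list -- add the unstable repository
deb http://ftp.debian.org/debian unstable main

# /etc/apt/preferences -- keep unstable at low priority, so that only
# packages requested with "apt-get -t unstable install backuppc"
# are pulled from it
Package: *
Pin: release a=unstable
Pin-Priority: 100
```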
I had been running BackupPC on an Ubuntu computer for several months to
back up the computer to a spare hard drive without problems. About the
time I added a new host (a Windows XP computer using Samba), I started
getting the following behavior:
BackupPC backs both hosts properly onto the spare hard
Jason Hughes wrote:
Evren Yurtesen wrote:
And, you could consider buying a faster drive, or one with a larger
buffer. Some IDE drives have pathetically small buffers and slow
rotation rates. That makes for a greater need for seeking, and worse
seek performance.
Well this is a seagate
I've got one customer whose server has taken 3600 minutes to
back up: 77 gigs of data, 1,972,859 small files. Would tar be
better or make this faster? It's directly connected via 100 Mbit to
the backup box.
--
Jesse Proudman, Blue Box Group, LLC
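A back-of-the-envelope check on those numbers (assuming, pessimistically, that the full 77 GB crosses the wire, which rsync normally avoids) shows the network link is not the limit:

```python
# Ideal transfer time for 77 GB over 100 Mbit/s, ignoring protocol overhead
data_bits = 77e9 * 8
wire_minutes = data_bits / 100e6 / 60
print(f"wire-speed minimum: {wire_minutes:.0f} min")   # 103 min

# The backup actually took 3600 min for ~1.97 million files:
per_file = 3600 * 60 / 1_972_859
print(f"time per file: {per_file:.3f} s")              # 0.109 s
```

So the backup ran roughly 35x slower than the wire allows, and spent about a tenth of a second per file; per-file overhead (seeks, hard-link bookkeeping), not bandwidth, is what dominates here.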
Jason Hughes wrote:
Evren Yurtesen wrote:
Jason Hughes wrote:
That drive should be more than adequate. Mine is a 5400 rpm, 2 MB-buffer
clunker. Works fine.
Are you running anything else on the backup server, besides
BackupPC? What OS? What filesystem? How many files total?
Evren Yurtesen wrote:
There are 4 hosts that have been backed up, for a total of:
* 16 full backups of total size 72.16GB (prior to pooling and
compression),
* 24 incr backups of total size 13.45GB (prior to pooling and
compression).
# Pool is 17.08GB comprising 760528
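Those figures make the pooling trade-off discussed earlier concrete (arithmetic only, using the sizes from the message above):

```python
# Sizes reported above, in GB
raw_gb = 72.16 + 13.45   # full + incremental backups, before pooling/compression
pool_gb = 17.08          # actual pool size after pooling and compression

saving = 1 - pool_gb / raw_gb
print(f"pooling + compression save {saving:.0%} of the raw data")  # 80%
```

That 80% space saving is what the hard-link pool buys, at the cost of the seek-bound pool traversals this thread is about.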