On 10.03.2010 18:02, Carl Wilhelm Soderstrom wrote:
> On 03/09 11:15, Tomasz Chmielewski wrote:
>> Is it possible to make BackupPC send an email after a successful backup?
>>
>> Right now, I only see it will notify on backup failures.
>
> This has been asked several t
Is it possible to make BackupPC send an email after a successful backup?
Right now, I only see it will notify on backup failures.
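This has come up repeatedly; one common approach (a sketch of mine, not an official feature of the thread) is to hook a small script into $Conf{DumpPostUserCmd}, which BackupPC runs after every dump. The script path here is hypothetical, and the $xferOK substitution variable should be checked against your BackupPC version's documentation:

```perl
# In config.pl (or a per-host override): run a hook after each dump.
# $xferOK expands to 1 on success, 0 on failure (check your version).
$Conf{DumpPostUserCmd} = '/usr/local/bin/backuppc-notify $host $xferOK';
```

The backuppc-notify script can then send mail only when its second argument is 1, i.e. on success.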
--
Tomasz Chmielewski
http://wpkg.org
When I multiply Size/MB by 30% (that's what I have in the "Existing Files
Comp" ratio for the latest backup), it gives me more or less 9 GB - which is
pretty close to the "du" output.
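That back-of-the-envelope check can be written down explicitly (a sketch; it assumes the "Comp" column is read as the fraction of the original size that remains after compression):

```python
def estimated_disk_usage_mb(full_size_mb, comp_fraction):
    """Estimate compressed on-disk usage from the host's full backup
    size and the compression ratio shown on the host summary page."""
    return full_size_mb * comp_fraction

# ~30 GB of files at a 30% ratio comes out near the 9 GB that "du" reports.
print(estimated_disk_usage_mb(30000, 0.30))
```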
But it will all be useless as
your server
uses this amount of space in backups" (without taking non-technical
people through what pooling is, etc.).
It is a sort of "accounting" question.
--
Tomasz Chmielewski
http://wpkg.org
" values too - would be great.
I know that due to pooling, space measured this way for all hosts would
be more than the space used on the disk, but that's not my concern.
All I want to know is how much the files for a given host "weigh".
--
Tomasz Chmielewski
http://wpkg.org
p#, but it does not reveal total values for a given host.
--
Tomasz Chmielewski
http://wpkg.org
ble with
newer rsync protocol versions?
--
Tomasz Chmielewski
http://wpkg.org
--
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
- a block device equal to or bigger than your current array
- new storage accessible either locally or over iSCSI
- metadata for the RAID would be stored in a file (typically, it is stored
  on the array itself)
This way, you make no changes in your current RAID, but you can still
clone it.
--
Les Mikesell wrote:
> Tomasz Chmielewski wrote:
>> BackupPC distinguishes users by using a user which authenticated against
>> .htpasswd.
>
> Actually backuppc uses the REMOTE_USER setting established by apache -
> and apache has a large number of authentication mod
BackupPC distinguishes users by the user name they authenticated with
against .htpasswd.
I would like to allow users to change their (and only their) password
stored in .htpasswd, in a web interface.
Does anyone have any example code to achieve this?
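I don't have a ready-made web UI, but the core operation is small. Here is a minimal server-side sketch in Python (the function name and the choice of the {SHA} scheme are mine, not BackupPC's; a real CGI would wrap this with authentication so a user can only change their own entry):

```python
import base64
import hashlib

def set_htpasswd(path, user, new_password):
    """Create or replace `user`'s entry in an htpasswd file, using the
    portable {SHA} scheme that Apache's mod_authn_file understands."""
    digest = base64.b64encode(
        hashlib.sha1(new_password.encode("utf-8")).digest()
    ).decode("ascii")
    entry = "%s:{SHA}%s" % (user, digest)
    try:
        with open(path) as f:
            lines = [line.rstrip("\n") for line in f if line.strip()]
    except FileNotFoundError:
        lines = []
    # Drop any existing entry for this user, then append the new one.
    lines = [line for line in lines if line.split(":", 1)[0] != user]
    lines.append(entry)
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```

Apache must be built with SHA support for {SHA} entries to work; the same update logic applies if you generate crypt() or MD5 entries instead.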
--
Tomasz Chmielewski
http://wpkg.org
ot;? No, some Outlook files are exempted from the roaming profile
(some files are stored in the "Local Settings" directory, which is not
uploaded to the server).
2) write a script which copies local user files somewhere else - not in
use, you can copy them on Win
- snapshot your drives with shadow copy - this makes sure you don't have
  the "file in use" problem
- run the rsyncd service as the SYSTEM user
- use the latest cwRsync with the latest _snapshot_ cygwin1.dll from
  http://cygwin.com/snapshots/ - it allows Cygwin programs to open files
  which
e) versions of rsync/ssh for Windows that would
> work better?
Your Windows version of rsync is very outdated.
You may try the newest one from:
www.itefix.no/cwrsync
--
Tomasz Chmielewski
http://wpkg.org
CompressionLevel does not make any sense - SSH protocol 2 has its
compression level hardcoded to 6, and you can't change it.
And unless both machines (the BackupPC server and the other side) are *very*
loaded, changing the cipher specification will
eat.
Unfortunately, BackupPC does not support any sort of "transfer compression".
The best you can do is:
- do rsync transfers over SSH - SSH provides compression
- use a VPN with compression, e.g. OpenVPN
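For the rsync-over-SSH option, compression amounts to adding -C to the SSH command BackupPC invokes. A sketch based on the stock 3.x RsyncClientCmd (verify the default for your version; the root login is just the usual example):

```perl
# -C turns on SSH's transport compression for the whole rsync session.
$Conf{RsyncClientCmd} = '$sshPath -C -q -x -l root $host $rsyncPath $argList+';
```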
--
Tomasz Chmielewski
http://wpkg.org
Holger Parplies wrote:
> Hi,
>
> Tomasz Chmielewski wrote on 2008-10-06 10:47:52 +0200 [[BackupPC-users]
> corrupted files and never-ending backups]:
>> To date, for me, the most serious issue with BackupPC are never-ending
>> backups, caused by corrupted files.
>&g
BackupPC_zcat: can't uncompress stdin
# BackupPC_zcat < f2008-04-01-full.sql > uncompress
BackupPC_zcat: can't uncompress stdin
It's easiest to spot with big files.
The issue is serious, as it breaks backups for a given host until the
broken file is removed manually.
Is there a workaround to that?
Tomasz Chmielewski wrote:
(...)
> BackupPC fails on transferring a file which is just 768 MB big (the file
> does not change on the Windows side, as it is a read-only shadow copy
> snapshot):
>
> Unable to read 4751360 bytes from
> /srv/backuppc-data/pc/naveval-sql1/new/
into the "new" directory, the file is
already 5510 MB and growing!
Has anyone seen such a problem before?
--
Tomasz Chmielewski
http://wpkg.org
Tomasz Chmielewski wrote:
(...)
> 2. rsync server started on Windows can't open files with long path
> names (over 260 characters)[1].
(...)
> The above two problems can be solved: a recent snapshot of
> cygwin1.dll[2] can open both files with problematic DLLs and long
Tomasz Chmielewski wrote:
(...)
> The above two problems can be solved: a recent snapshot of
> cygwin1.dll[2] can open both files with problematic DLLs and long
> filenames, when used with a recent version of rsync (cwRsync[3]).
>
> I can verify that it really works by
[2] http://cygwin.com/snapshots
[3]
http://www.itefix.no/phpws/index.php?module=pagemaster&PAGE_user_op=view_page&PAGE_id=6
--
Tomasz Chmielewski
http://wpkg.org
with BackupPC,
don't use RAID-5 (but RAID-10, for example), use faster drives etc.
Of course, your mileage will vary, will depend on many other factors.
--
Tomasz Chmielewski
http://wpkg.org
or the whole array.
--
Tomasz Chmielewski
http://wpkg.org
mount option to ext3.
I found the best results with anticipatory IO scheduler
(Documentation/block/as-iosched.txt), which tries to reduce seeks.
--
Tomasz Chmielewski
http://wpkg.org
vere speed-up for bigger sites.
Unfortunately, BackupPC still uses rsync 2.6.x and therefore, can't
benefit from this new rsync feature.
--
Tomasz Chmielewski
http://wpkg.org
servers
> instead of one BIG one.
I guess he meant splitting one big backup job into several smaller ones
(i.e., instead of backing up 1x350 GB, back up 7x50 GB, all going to one
BackupPC server) - it is always a good idea for large
use Fcntl;

my $dir = "/mnt/iscsi_backup/test";
chdir $dir;
sysopen(TESTFILE, '0', O_CREAT, 0644);
close(TESTFILE);
for (my $i = 1; $i <= 1; $i++) {
    link '0', $i;
}
# cat mkdirs.sh
#!/bin/bash
DIR="/mnt/iscsi_backup/test"
cd "$DIR"
seq 1 1 | xargs tim
t you write, it sounds as if with ZFS everything was still
purely in RAM (large RAM usage and 50x faster).
It is fine when you do an operation like that once in a while, but with
BackupPC, ~random reads and writes are being made basically ~24h/day,
which means pressure on RAM sooner or later.
h, I forgot about RAID-5 bitmaps. I did a quick search and it
> appears that bitmaps can really kill performance. But it does prevent
> a full-resync after a crash. I don't think it's worth it in your case.
Technically, it is possible to place the bitmap
rors.
>>
>
> Yeah, I've been getting stuff like this:
> 2008-02-27 11:51:16 BackupPC_link got error -4 when calling
> MakeFileLink(
>
> I've changed the permissions... how soon should I expe
979,82 14712 9808
Device:  tps     Blk_read/s  Blk_wrtn/s  Blk_read  Blk_wrtn
sda      135,86  501,90      1353,05     5024      13544
Device:  tps     Blk_read/s  Blk_wrtn/s  Blk_read  Blk_wrtn
sda      132,87  401,20      1293,
David Rees wrote:
> On Tue, Feb 26, 2008 at 2:23 PM, Tomasz Chmielewski <[EMAIL PROTECTED]> wrote:
>> > Can you give us more details on your disk array? Controller, disks,
>> > RAID layout, ext3 fs creation options, etc...
>>
>> I said some of that al
Adam Goryachev wrote:
> -BEGIN PGP SIGNED MESSAGE-
> Hash: SHA1
>
> Tomasz Chmielewski wrote:
> | Les Mikesell wrote:
> |> Tomasz Chmielewski wrote:
> |>> Although - with IO::Dirent "wa" is now 100% almost all the time, and
> |>>
Les Mikesell wrote:
> Tomasz Chmielewski wrote:
>>
>> I indeed use iSCSI.
>> The storage is on the SAN, and it's accessible via iSCSI.
>> BackupPC itself runs as a Xen guest.
>
> Is there some point to splitting the box where backuppc runs and its
> sto
Les Mikesell wrote:
> Tomasz Chmielewski wrote:
>>
>> Although - with IO::Dirent "wa" is now 100% almost all the time, and
>> the system feels much slower. Hm. Let's hope it's coincidence, and
>> assume the system was just committing a big writ
small files at the expense of some
> data integrity in the case of a crash (which should not be a big deal
> since BackupPC does a good job of verifying the pool during backups).
Yeah, I was a bit afraid of data integrity. Using commit=60 sounds like
a good
Paul Archer wrote:
> 11:17am, Tomasz Chmielewski wrote:
>
>> Why should any filesystem perform seeks better (when writing) than any
>> other filesystem?
>>
>> I imagine it could be true only if:
>>
>> - kernel would cache a large amount of writes
>
Why should any filesystem perform seeks better (when writing) than any
other filesystem?
I imagine it could be true only if:
- kernel would cache a large amount of writes
- kernel would commit these writes not in a FIFO manner, but whenever it
sees that the blocks on the underlying device are close to each other
Can ZFS do it?
--
David Rees wrote:
> On Mon, Feb 25, 2008 at 1:23 AM, Tomasz Chmielewski <[EMAIL PROTECTED]> wrote:
>> Unfortunately, it doesn't scale very well in terms of performance - you
>> may see this thread on linux-fsdevel list for more info:
>> http://marc.info/?t=1
able.
After all, there are 16 * 16 * 16 = 4096 cpool subdirectories.
What changes would be needed in BackupPC to allow
$Conf{BackupPCNightlyPeriod} to be bigger than 16?
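For illustration only (this is not BackupPC's actual code): if the nightly pass walks the 256 second-level cpool directories, spreading them over an arbitrary period is just a matter of slicing the range, so a period above 16 is not a conceptual problem:

```python
def nightly_ranges(period, buckets=256):
    """Split pool dirs 0..buckets-1 into `period` contiguous nightly
    slices, each night covering roughly buckets/period directories."""
    edges = [round(i * buckets / period) for i in range(period + 1)]
    return [(edges[i], edges[i + 1] - 1) for i in range(period)]
```

With period=16 each night covers 16 directories (0-15, 16-31, ...); with period=32 each night covers 8, and uneven periods just get slightly uneven slices.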
--
Tomasz Chmielewski
http://wpkg.org
You're talking about rsync-specific excludes (options to rsync),
not BackupPC excludes ($Conf{BackupFilesExclude}), right?
--
Tomasz Chmielewski
http://wpkg.org
'/home/samba/profiles/Domain Users/contact/*/*/*/*/*/*.MP3',
But as you see, it doesn't scale very well, because you never know how
many subdirectories there really are.
How can I exclude backups of certain filetypes from one
directory/subdirectory only?
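One hedged workaround (an untested sketch): rsync's '**' wildcard matches across any number of directory levels, so with the rsync XferMethod an exclude like the following may replace the fixed-depth chains - check that the rsync actually doing the matching on the client supports it:

```perl
$Conf{BackupFilesExclude} = {
    'hDrive' => ['/home/samba/profiles/Domain Users/contact/**/*.MP3'],
};
```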
--
Tomasz C
time 0ms
rtt min/avg/max/mdev = 45.993/45.993/45.993/0.000 ms
CheckHostAlive: returning 45.993
Backup aborted (Child exited prematurely)
dump failed: Child exited prematurely
link 192.168.100.145
--
Tomasz Chmielewski
http://wpkg.org
Tomasz Chmielewski wrote:
> Normally, when we use BackupPC's "Archive" functionality, it creates one
> tar (gz|bz2) archive per host.
>
> This way, if the hosts are similar, we lose all the hardlinks that BackupPC
> used internally to save us precious space.
>
hardlinks) for all the hosts we want?
--
Tomasz Chmielewski
http://wpkg.org
o extract
I found only one reference on the group to a similar problem.
Ideas?
--
Tomasz Chmielewski
- at least I didn't find the problematic big file
partially transferred in the archive.
Am I correct? Is there a way to prevent it (i.e., to make BackupPC keep
partially transferred files and resume them, instead of removing them)?
--
Tomasz Chmielewski
http:
don't we?
For an internet link, and multi-gigabyte shares this doesn't look that
optimistic.
Is there a solution for that?
--
Tomasz Chmielewski
http://wpkg.org
" would solve the issue?
http://www.okisoft.co.jp/esc/utf8-cygwin/
I think I used it for some time, and didn't have these "file has
vanished" problems.
I wonder if it's better to use with BackupPC than
omplete well within this time)
>
> BackupZeroFilesisFatal = 0
Did you try to start the backup manually?
You will see the errors, should there be any:
backuppc$ /srv/backuppc/bin/BackupPC_dump -v -f destination_machine
--
Tomasz Chmielewski
http://wpkg.org
ve/c/WINDOWS/system32/u:/Freigaben/mig/Homes/user2/my/µet?f?as?
pt? a?a??t ?a?µ?a?-d?µ???.doc" (in uDrive)
--
Tomasz Chmielewski
upPC 3.0.0beta2, but the problem persists (at least for
the files that are about 2 GB big).
Any ideas what's still wrong?
--
Tomasz Chmielewski
http://wpkg.org
d way to upgrade from 3.0.0beta1 to 3.0.0beta2
(and eventually, to 3.0.0-stable)?
Is overwriting backuppc/bin/BackupPC* and backuppc/lib/* enough
(and probably fetching a new File::RsyncP)?
Or should I upgrade by sta
Byron Trimble wrote:
> Could someone show me an example of a config.pl file for an archive host?
$Conf{XferMethod} = 'archive';
$Conf{ArchiveDest} = '/srv/backuppc-data/pc/restore/restore';
--
Tomasz Chmi
Tomasz Chmielewski wrote:
(...)
> The log file looks like that - it shows that there was indeed backup
> made for sDrive, tDrive, uDrive - why were they wiped out then, and
> started from scratch?
I think it was about $Conf{PartialAgeMax} being too low. I set this to
30, hope
xfer (aborted by user
(signal=INT))
2006-11-12 16:36:35 Saved partial dump 0
2006-11-12 16:41:55 full backup started for directory sDrive
I'm running BackupPC 3.0.0beta1.
--
Tomasz Chmielewski
http://wpkg.org
e backup for the host
$BACKUPPCDIR/bin/BackupPC_archiveHost \
    $BACKUPPCDIR/bin/BackupPC_tarCreate /usr/bin/split /usr/bin/par2 \
    "$HOST" "$NUMBER" /usr/bin/gzip .gz 000 \
    $RESTOREDIR/$DATE 0 \*
fi
done
--
Tomasz Chmielewski
Tomasz Chmielewski wrote:
> I use rsyncd to retrieve files from Windows machines.
>
> I transfer such shares:
>
> $Conf{RsyncShareName} = ['hDrive', 'iDrive', 'jDrive', 'kDrive'];
>
>
> And I would like to skip transferring "
t?
I tried the following, but it doesn't work:
$Conf{BackupFilesExclude} = {
    'hDrive' => ['DFS/*',
                 '/DFS/*',
                 'ygdrive/c/WINDOWS/system32/s:/DFS/*',
There must be something wrong with that Windows machine, as I'm getting
this error only there.
If I want to reproduce it by using smbclient only - how should I do it?
For example, use smbclient to retrieve files from S$ drive/
e permissions first).
But that's a general problem with Windows IMO.
--
Tomasz Chmielewski
http://wpkg.org
Les Mikesell wrote:
> On Tue, 2006-11-07 at 07:39, Tomasz Chmielewski wrote:
>>>> It looks exactly the same in the web interface - one letter folder
>>>> names, empty inside.
>>>>
>>> I started the backup manually, and it produces the following er
Tomasz Chmielewski wrote:
> Tomasz Chmielewski wrote:
>
>> It looks exactly the same in the web interface - one letter folder
>> names, empty inside.
>>
>
> I started the backup manually, and it produces the following error:
>
>
> /srv/backuppc
Tomasz Chmielewski wrote:
> It looks exactly the same in the web interface - one letter folder
> names, empty inside.
>
I started the backup manually, and it produces the following error:
/srv/backuppc/bin/BackupPC_dump -v -f windows_server
(...)
98208 ( 3688.7 kb/s) \Fre
Nils Breunese (Lemonbit) wrote:
> Tomasz Chmielewski wrote:
>
>> Recently I found out that my BackupPC 2.1.2pl2 backups are not really
>> reliable - lots of files are missing, and I have mostly empty folders.
>>
>> Let me illustrate.
>>
>> A Windows serv
, and
moreover, these folders are empty.
Any ideas why it could happen?
The method I use is smbfs.
--
Tomasz Chmielewski
http://wpkg.org
Les Stott wrote:
> Tomasz Chmielewski wrote:
>> Les Stott wrote:
>>> Tomasz Chmielewski wrote:
>>>> What is the recommended way to remove older backups?
>>>>
>>>> Suppose, I have these backups:
>>>>
>>>>
>>>
Les Stott wrote:
> Tomasz Chmielewski wrote:
>> What is the recommended way to remove older backups?
>>
>> Suppose, I have these backups:
>>
>>
>> 95/ - full
>> 130/ - inc
>> 157/ - inc
>> 183/ - inc
>> 190/ - inc
>> 194/ -
Is it OK if I
just remove these directories? Or can it affect some future backups in a
bad way?
--
Tomasz Chmielewski
http://wpkg.org
Les Mikesell wrote:
> On Sun, 2006-10-29 at 04:13, Tomasz Chmielewski wrote:
>
>>> Out of curiosity how long did it take and how big was the entire data
>>> size on the first run? did it eat up all the cpu time when running?
>> As I remember, it took a couple of hou
copying from one HDD
to another.
It was about 80 GB or so, full of hardlinks.
The machine was pretty responsive, but I had to increase the swap size
several times and start from scratch again.
--
Tomasz Chmielewski
http://wpkg.org
needing to be rebooted for a software problem.
>>
>> I know of people who do it; but they're doing it on machines with more
>> memory and fewer files.
Just add lots of swap.
I was able to rsync an archive with several million files on a machine
with just 256 MB RAM; it ha
sounds complicated, perhaps the best way to do it is to use NFS,
CIFS (Samba) would be the next choice.
[1] http://iscsi-target.sf.net
[2] http://open-iscsi.org, use the latest svn
--
Tomasz Chmielewski
http://wpkg.org
http://www.okisoft.co.jp/esc/utf8-cygwin/index.html ?
From a couple of tests I made, it seems that one still needs a patched
version.
--
Tomasz Chmielewski
http://wpkg.org
"proper" tar.
So considering I use a plain tar over a full BackupPC archive - if I
untar it later, is there a way to get all the files reliably (with some
BackupPC magic)?
> and if you run it on an incremental you won't g
, I'm not sure how I can restore such a backup later on
(well, I could use BackupPC_zcat manually on each and every file, but
I'd rather avoid that).
--
Tomasz Chmielewski
http://wpkg.org
ure if I can restore such a backup
reliably later on.
Or should I do it some other way?
--
Tomasz Chmielewski
http://wpkg.org
from the user's point of view.
I know I can run "du -sh /backuppc/pc/.../directory" - but that will
calculate only the compressed size.
--
Tomasz Chmielewski
http://wpkg.org
> Tomasz Chmielewski writes:
>
>> I saw this problem on the list several times; there are many posts about
>> this since 2004, with no clear answer.
>>
>> I can't backup some hosts; shortly after I start the backup, I keep
>> getting "
ppc-data/pc/imap1/new//f%2f/ for empty output\n
create 0 / 0
And it loops like that all the time.
Is there a remedy for that?
--
Tomasz Chmielewski
http://wpkg.org
Tomasz Chmielewski wrote:
> I saw this problem on the list several times; there are many posts about
> this since 2004, with no clear answer.
>
> I can't backup some hosts; shortly after I start the backup, I keep
> getting "Can't open /backuppc-data/pc/com
ly from the command line?
Try decreasing your server's and/or your client's MTU; it sometimes
helps on not-so-standard network setups.
--
Tomasz Chmielewski
Software deployment with Samba
http://wpkg.org
Tomasz Chmielewski wrote:
> I noticed that BackupPC 3.0.0beta1 seems to have a serious bug in
> incremental backups - "exclude" options are not respected.
Using BackupPC excludes instead of rsync excludes seems to work - this
one doesn't exclude for incrementals:
re read. After reading various
> suggestions, I have tried putting them in:
This is the one you're looking for:
> /backuppc/data/pc/[host]/config.pl
Note that incremental backups seem to be broken with 3.0.0beta1 (exclude
lists are not respected).
--
'-v',
'--numeric-ids',
'--perms',
'--owner',
'--group',
'--devices',
'--links',
isplay us hosts from that group, something we
already know from the "host summary" page.
Is it possible to do such host grouping with BackupPC 3.x, or is such a feature
planned?
--
Tomasz Chmielewski
http://wpkg.org
ary
obstacle (I may only want to save a couple of directories, not everything).
Is there a way to solve this problem somehow?
--
Tomasz Chmielewski
Silent installs with Samba
http://wpkg.org
'--devices',
'--links',
'--times',
'--block-size=2048',
'--recursive',
'--exclude=/proc/*',
'--exclude=/mnt/*',
than
20 times the time to back up, and consuming about 30 times more bandwidth.
Any help will be appreciated.
What transport method are you using? If rsync, you can set (in config.pl):
'--bwlimit=8',
to limit the bandwidth to 8 kB/s.
--
Tomasz Chmielewski
Silent installers with
ks
at the top of this page (some sources and a binary dll).
I didn't test it much, but it works after a couple of simple tests.
--
Tomasz Chmielewski
Software deployment with Samba
http://wpkg.org
start cronjobs with:
su -l backuppc -c "/srv/backuppc/bin/BackupPC_dump -v -f old-backups"
&>/dev/null
--
Tomasz Chmielewski
Software deployment with Samba
http://wpkg.org
Is it possible to "merge" two BackupPC servers (running on separate
hosts), into one - so that in the end, there is only one BackupPC running?
I'm wondering if it's possible to do so while keeping the old archives/pool.
--
Tomasz Chmielewski
Software deployment with S
Guus Houtzager wrote:
> On Tuesday 11 April 2006 08:56, Tomasz Chmielewski wrote:
> [...]
>> Unfortunately, it doesn't explain why with $Conf{FullKeepCnt} = 1; I
>> have two full backups, not one as I should expect.
> That's probably because you have something like this for the host i
Nicholas Hall wrote:
> On 4/10/06, Tomasz Chmielewski <[EMAIL PROTECTED]> wrote:
>> I want to keep the backups as follows:
>> - 12 last monthly backups
>> - 4 last weekly backups
>> - 6 last daily backups
>> After reading the documentation, it seems to me that I'd have to do it
>> with $Conf{
lost, I just can't understand it.
How should I set up BackupPC so that it keeps the 12 last monthly
backups, 4 last weekly backups, and 6 last daily backups?
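A schedule close to that can be expressed with BackupPC's exponential full expiry; a sketch assuming weekly fulls and daily incrementals (double-check the FullKeepCnt semantics in the docs for your version):

```perl
$Conf{FullPeriod}  = 6.97;   # one full backup per week
$Conf{IncrPeriod}  = 0.97;   # incrementals on the days between
$Conf{IncrKeepCnt} = 6;      # keep the 6 most recent daily incrementals
# Keep 4 fulls at 1-week spacing, none at 2 weeks, and 12 at 4-week
# (roughly monthly) spacing - each FullKeepCnt entry doubles the interval.
$Conf{FullKeepCnt} = [4, 0, 12];
```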
--
Tomasz Chmielewski
Software deployment with Samba
http://wpkg.org
ule used by BackupPC,
right?
--
Tomasz Chmielewski
Software deployment with Samba
http://wpkg.org
nd why
I get this error (Only privileged users can Archive) when I try to
restore the backups.
In the config.pl file, I have only this one entry:
$Conf{XferMethod} = 'archive';
Any ideas what I'm doing wrong?
--
Tomasz Chm