Hi all,
I have a lot of files in my pc tree which are not correctly linked into my
pool. I'd like to verify them and link them correctly into my pool (not
cpool).
I've found some mailings regarding BackupPC_fixLinks.pl as well as hints to
the location
http://sourceforge.net/apps/mediawiki/backuppc
I've developed two enhancements to BackupPC 3.1.0 and have used them since 2010.
In June 2013 I upgraded to BackupPC V3.2.1 (Debian wheezy) and ported my
patch to that release.
In February 2019 I did the same for BackupPC V3.3.2-2 (Debian
buster/testing).
As requested by Jeffrey Kosowsky.
On Sunday, 07 April 2019, 17:11:47, backu...@kosowsky.org wrote:
> Sometimes you want to save a special backup that for example
> corresponds to a specific change (pre/post) on your system. The
> trouble is that with exponential deleting there is no way to
> guarantee that your specific designated
Today I noticed my firewall issue with ftp://www.backup4u.at.
It is solved now and you can access it again.
Maybe someone can put the patches into a standard place for backuppc patches?
br
Matthias
On Saturday, 16 February 2019, 14:13:40, Matthias Meyer wrote:
> I've
Hello,
I have been making my backups with backuppc/tar for a long time.
Now I want to switch to rsync.
I've installed rsync on the backup client and configured it to run in
daemon mode.
I also created the following:
root:/home/Meyer# cat /etc/rsyncd.conf
motd file = /etc/motd
max connections = 5
syslog facility = loca
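For comparison, a complete minimal rsyncd.conf exposing one module to BackupPC might look like this. The module name, path, account, and facility value are placeholder assumptions, not taken from the truncated file above:

```
# /etc/rsyncd.conf -- minimal sketch; all values are examples
motd file = /etc/motd
max connections = 5
syslog facility = daemon

[docs]                        # module name referenced in $Conf{RsyncShareName}
    path = /home/Meyer
    read only = yes           # backups only need read access
    auth users = backuppc     # hypothetical account
    secrets file = /etc/rsyncd.secrets
```

BackupPC would then connect to the `docs` module using the credentials from the secrets file.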
You can reuse my script "BackupPC_deleteBackup":
http://backuppc.wiki.sourceforge.net/How+to+delete+backups
It will delete a specified backup number and, if it is a full backup, all
incrementals that depend on it. Furthermore it will modify pc//backups.
Last but not least it can start BackupPC_nig
On Monday, 12 May 2008, Matthias Meyer wrote:
> Hello,
>
> I get my backups since a long time wit backuppc/tar.
> Now I want to switch to rsync.
> I've install rsync onto the backup-Client and configure it to run in
> daemon-mode.
> I create a also the:
> root:/hom
Hello,
I got my backups up and running with rsync; I used tar before.
But I see no real advantage of using rsync.
Should rsync be faster than tar, or should it need less disk space?
Thanks
Matthias
--
Don't Panic
On Sunday, 18 May 2008, Kurt Jasper wrote:
> Hi,
>
> some of my client-pcs are running win xp/2000.
> They get backed up using SSH and rsync and everything seems to work
> fine, except that every backup gives back some XferLOG Errors, because
> it can not copy some locked files:
>
> Remote[1]:
Hello,
I've renamed my "localhost" to "fileserver".
That means I renamed the directory "backuppc/pc/localhost"
to "backuppc/pc/fileserver" and renamed the entry in /etc/backuppc/hosts
too.
Of course I imported the new host key for fileserver for user backuppc.
Unfortunately I get errors in every b
On Tuesday, 20 May 2008, Les Stott wrote:
> Kurt jasper wrote:
> > ok, to exclude the follwing files that can be found in any
> > user-directory C:\Documents and Settings\USERNAME), I followed your
> > suggestion and put intomy CLIENT.PL:
> > [...]
> > $Conf{RsyncShareName} = '/';
> > $Conf{Backup
On Monday, 19 May 2008, Tino Schwarze wrote:
> On Mon, May 19, 2008 at 09:57:03PM +0200, Matthias Meyer wrote:
> > Hello,
> >
> > I've renamed my "localhost" to "fileserver".
> > Thats mean I renamed the directory "backuppc/pc/localhost&q
Hi,
I've installed cygwin and rsync on a Vista client and want to back it up to
backuppc 3.1.0-2 on my Linux server:
On client side (vista) I use rsync 3.0.4 (protocol version 30)
Capabilities:
64-bit files, 64-bit inums, 32-bit timestamps, 64-bit long ints,
socketpairs, hardlinks, symlinks,
Hi,
I've found the $Conf{Language} = 'en'; in the CGI configuration of the
backuppc-server. I assume that the language configuration will work for all
hosts within this backuppc-server.
Is it possible to configure different languages for different hosts?
Thanks
Matthias
--
Don't Panic
---
> >
> >
> > Matthias Meyer wrote:
> > > Hi,
> > >
> > > I've installed cygwin and rsync on a vista client and want backup it
to
> > > backuppc 3.1.0-2 on my Linux server:
> > >
> >
> > 2008/10/17 10:30:28 [1432] rsyn
Hi,
I am trying to back up Windows XP as well as Vista to a Linux server.
I use backuppc 3.1.0 as backup-server (Debian, ext3), cygwin as client
environment and rsyncd for the transport.
The backup seems to work well but the restore does not. The restored files
get "NT-Authority/System" as the new owner, indepen
Nick Smith wrote:
> I currently have several windows server backup clients that i use
> volume shadow to backup the data, i use a pre script that lauches the
> shadow and maps it to drive B on the windows box and then backuppc
> backups B over rsync then a post script to kill the volume shadow.
>
Matthias Meyer wrote:
Hi,
Don't know why, but today I have another point of view ;-)
I am trying to back up Windows XP as well as Vista to a Linux server.
I use backuppc 3.1.0 as backup server (Debian, ext3), cygwin as client
environment and rsyncd for the transport.
The backup seems to work well
Nick Smith wrote:
>>
>> I declare the exclude list within the GUI of backuppc.
>>
>> Declare the "RsyncShareName" in the same manner as they are declared in
>> your rsyncd.conf in your windows client.
>> Define BackupFilesExclude:
>> NewKey = "*" if it should applicable to all RsyncShareName or th
Nick Smith wrote:
>> Within the Key you can add the directories relativ to the RsyncShareName
>> in Linux syntax.
>> e.g.:
>> $Conf{RsyncShareName} = [
>> 'D',
>> 'C'
>> ];
>> $Conf{BackupFilesExclude} = {
>> 'C' => [
>>'/WINDOWS/Downloaded Program Files',
>>'/WINDOWS/Offline Web Pages'
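Filled out, that kind of per-host configuration might look like the following sketch. The share names and exclude paths here are illustrative examples, not taken from the truncated post; adapt them to the modules in your own rsyncd.conf:

```perl
# Hypothetical host config, e.g. /etc/backuppc/pc/myclient.pl
$Conf{RsyncShareName} = [
    'C',
    'D',
];

# Keys must match the share names above; a '*' key would apply to all shares.
# Paths are relative to the share root, in Linux syntax.
$Conf{BackupFilesExclude} = {
    'C' => [
        '/WINDOWS/Downloaded Program Files',
        '/WINDOWS/Offline Web Pages',
        '/pagefile.sys',
    ],
    'D' => [
        '/temp',
    ],
};
```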
Hi,
I am trying to back up my Windows clients with backuppc and cygwin/rsync.
Backup/restore with rsyncd on Windows and rsync on Linux (run as root)
works well. Only as a normal (Linux) user it doesn't work, because of
insufficient privileges for chmod and chgrp.
With backuppc (run as user backup
Aleksey Tsalolikhin wrote:
> BackupPC 3.1.0 on CentOS 5.2
>
> Backups of a single host are aborting after lots of errors like
>
> sys/block/hdc/capability: md4 doesn't match: will retry in phase 1; file
> removed Remote[1]: rsync: read errors mapping "/sys/block/hdc/capability":
> No data availa
Mark Adams wrote:
>
> I believe the $Conf{RsyncShareName} statement should involve '/' rather
> than '/etc'. The directory I want to backup is /etc which is a
> subdirectory of /. I also believe the /home/madams directory should be
> /home. Am I right?
>
$Conf{RsyncShareName} should contain
Matthias Meyer wrote:
> Hi,
> I try to backup my windows clients with backuppc as well as cygwin/rsync.
>
> The backup/restore with rsyncd at windows and rsync on linux (run as root)
> works well. Only as normal (linux) user it didn't work because of
> insufficient privile
Jean-Michel Beuken wrote:
> hello,
>
> I have tried to use "pre-xfer exec" with rsyncd / cygwin
>
> I have created a special module to restore in a unique folder (
> d:/BackupPC )
>
>
>
> $ cat /etc/rsyncd.conf
>
>
> [RESTORE]
> path = /cygdrive/d/BackupPC
Jean-Michel Beuken wrote:
> Matthias,
>
> thanks for the quick answer...
>
>> I run pre-xfer successfully under Win XP but has no luck with Vista.
>> Did you use XP or Vista?
>>
>
> XP SP3
>
>> Maybee "#! /bin/sh" is necessary in the first line of your
>> "/bin/pre-exec.sh".
>
> administra
Matthias Meyer wrote:
>
> I have a similiar problem in Vista with failure code (65280). I asked
> today for both failure codes in the rsync.general mailing list.
>
> br
> Matthias
>From rsync.general I get the answer that this are not error codes from rsync
and I cou
I use rsyncd to back up both Windows and Linux clients.
Is it possible to examine or calculate the progress of a currently running
backup?
One idea would be to calculate the duration of the last backups and assume
the current backup will have the same performance. But this can be wrong if
some
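The duration-based estimate described above can be sketched in a few lines. This is a hypothetical helper, not part of BackupPC; it only illustrates the idea, and as noted it is wrong whenever the current run differs from the historical average:

```python
def estimate_progress(past_durations, elapsed_seconds):
    """Estimate completion of a running backup from past run times.

    past_durations: durations (seconds) of previous backups of this host.
    elapsed_seconds: how long the current backup has been running.
    Returns a fraction in [0.0, 1.0], capped at 1.0 because the current
    run may simply be slower than the historical average.
    """
    if not past_durations:
        return 0.0  # no history, no estimate
    average = sum(past_durations) / len(past_durations)
    return min(elapsed_seconds / average, 1.0)

# Example: three past backups averaging ~666 s, current run at 333 s.
print(estimate_progress([600, 720, 680], 333))  # roughly 0.5
```

The past durations could be read from the start/end times recorded in the pc//backups file.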
Adam Goryachev wrote:
>
> I don't think it is even possible to guess how much rsync will transfer
> on a changed file without running through the whole comparison.
> rsync does have --progress which I find very useful, this at least
> displays how many files need to be looked at, and how many are l
If a user requests a restore I want to restore one extra file and handle it
by the RestorePostUserCmd.
Is it possible to request this additional restore with BackupPC_restore
during the RestorePreUserCmd or RestorePostUserCmd ?
Thanks
Matthias
--
Don't Panic
Craig Barratt wrote:
> Matthias writes:
>
>> If a user requests a restore I want to restore one extra file and handle
>> it by the RestorePostUserCmd.
>> Is it possible to request this additional restore with BackupPC_restore
>> during the RestorePreUserCmd or RestorePostUserCmd ?
>
> Yes, you c
After a system crash and tons of errors in my ext3 filesystem I had to run
e2fsck.
During this I lost some GB of data in /var/lib/backuppc.
For the time being I have disabled BackupPC_nightly by renaming it to
BackupPC_nightly.disabled ;-)
The rest of the backuppc system should run as well as pos
Johan Ehnberg wrote:
> Quoting Matthias Meyer :
>
>> After a system crash and tons of errors in my ext3 filesystem I have to
>> run e2fsck.
>> During this I lost some GB of data in /var/lib/backuppc.
>> For the time being I have disabled Back
> Matthias Meyer wrote:
>> Thanks for your sympathy :-)
>> I would believe the filesystem should be ok in the meantime. e2fsck needs
>> to run 3 or 4 times and need in total more than 2 days. After this
>> lost+found contains approximately 10% of my data :-( No chance to
Holger Parplies wrote:
> Hi,
>
> Matthias Meyer wrote on 2009-01-18 15:33:30 +0100 [Re: [BackupPC-users]
> errors in cpool after e2fsck corrections]:
>> Johan Ehnberg wrote:
>> > Quoting Matthias Meyer :
>> >
>> >> After a system crash and tons
Adam Goryachev wrote:
> Matthias Meyer wrote:
>> Johan Ehnberg wrote:
>>
>> 1) So you would recommend:
>> mv /var/lib/backuppc/cpool /var/lib/backuppc/cpool.sav
>> mkdir /var/lib/backuppc/cpool
>> I would believe that the hardlinks
>> from /var/lib/
Johan Ehnberg wrote:
> Matthias Meyer wrote:
>>> Matthias Meyer wrote:
>>>> Thanks for your sympathy :-)
>>>> I would believe the filesystem should be ok in the meantime. e2fsck
>>>> needs to run 3 or 4 times and need in total more than 2 days. Afte
I plan to back up (with rsync) some machines with nearly identical data
(operating systems). Because these machines are only reachable over the
internet, I want to avoid a lot of traffic for files which are already
on the backuppc server.
I assume rsync will only compare the data between the client and
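For background on why only changed data travels over the wire, rsync's delta transfer can be sketched as follows. This is a toy, block-aligned version with a made-up block size; real rsync additionally uses a cheap rolling checksum so it can match blocks at any byte offset, plus a strong hash to confirm matches:

```python
import hashlib

BLOCK = 4  # toy block size; real rsync uses much larger blocks

def block_signatures(old_data: bytes) -> dict:
    """Hash each fixed-size block of the receiver's existing copy."""
    return {
        hashlib.md5(old_data[i:i + BLOCK]).hexdigest(): i
        for i in range(0, len(old_data), BLOCK)
    }

def delta(new_data: bytes, signatures: dict) -> list:
    """Emit ('match', offset) for known blocks, ('literal', bytes) otherwise.

    Simplification: only block-aligned positions are tested; real rsync
    slides a window byte-by-byte, so it also finds blocks that moved to
    unaligned offsets.
    """
    out = []
    for i in range(0, len(new_data), BLOCK):
        chunk = new_data[i:i + BLOCK]
        if hashlib.md5(chunk).hexdigest() in signatures:
            out.append(("match", signatures[hashlib.md5(chunk).hexdigest()]))
        else:
            out.append(("literal", chunk))
    return out

sigs = block_signatures(b"aaaabbbbcccc")
print(delta(b"aaaaXXXXcccc", sigs))
# only the middle block needs to be sent as literal data
```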
dan wrote:
> just want to clarify here.
>
> you have x number of remote hosts, all running the same OS and presumably
> have almost identical files (size, permissions etc)? and you want back one
> up and then clone that backup for each of the other machines to avoid
> transfering the data numerou
Juergen Harms wrote:
> I agree, I would separate initial cloning (better use plain rsync) and
> backup operations (a case for backuppc). From what you wrote, I am not
> even sure whether - once you have done the initial cloning - in steady
> state you want to do "backup" (periodically take increme
dan wrote:
> You should run two tests that can pick up any CPU or memory error. On
> boot
> run memtest from grub. That will hammer on your RAM for as long as you
> want. Then run cpuburn to see if your CPU has some issue. I would think
> that your RAM is the more likely candidate here. The p
Simone Marzona wrote:
> Hi
>
> Is there a way to show failed backups only to users who administer the
> host that failed?
>
> I manage users and hosts as documented with the backuppc hosts file like
> this:
>
> serverA 0 backuppc userA
> serverb 0 backuppc userB
It is not necessary to hav
Tino Schwarze wrote:
> On Sun, Jan 18, 2009 at 05:29:21PM -0600, Les Mikesell wrote:
>
>> > you have x number of remote hosts, all running the same OS and
>> > presumably have almost identical files (size, permissions etc)? and you
>> > want back one up and then clone that backup for each of the
Matthias Meyer wrote:
> I plan to backup (with rsync) some machines with nearly identical data
> (operating system). Because this machines are only reachable through
> internet I want to avoid a lot of traffic for files which are already
> on the backuppc server.
> I assume
I do not fully understand how rsync works within backuppc.
In the documentation for BackupPC_dump I found:
As BackupPC_tarExtract extracts the files from smbclient or tar, or as
rsync runs, it checks each file in the backup to see if it is identical to
an existing file from any previous backup o
Is it possible to examine how much data a running BackupPC_dump has copied
from client to server so far?
Thanks
Matthias
--
Don't Panic
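The pool lookup described in the documentation excerpt above can be modeled very roughly like this. It is not BackupPC's actual hashing scheme (which uses a partial file MD5 digest plus collision chains); it only illustrates "store once, hardlink many times":

```python
import hashlib

pool = {}  # content digest -> pool entry (stands in for a hardlinked pool file)

def store(path: str, content: bytes) -> bool:
    """Return True if the content was already pooled (would become a hardlink)."""
    digest = hashlib.md5(content).hexdigest()
    if digest in pool:
        pool[digest]["links"].append(path)  # reuse the existing pool file
        return True
    pool[digest] = {"content": content, "links": [path]}
    return False

print(store("host1/0/etc/hosts", b"127.0.0.1 localhost\n"))  # False: new file
print(store("host2/0/etc/hosts", b"127.0.0.1 localhost\n"))  # True: pooled
```

With rsync the comparison additionally happens against the previous backup of the same host, so unchanged files are not even transferred before this pooling step.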
Matthias Meyer wrote:
> Johan Ehnberg wrote:
>
>> In 2) you should not delete anything - only when filesystem errors are
>> causing trouble. You need the nightly.
>>
>> Other than that - read the other posts too, they have good pointers to
>> actually dealing
Matthias Meyer wrote:
> As outlined in the documentation:
> su backuppc
> cd NEW_TOPDIR
> mkdir pc
> cd pc
> /usr/share/backuppc/bin/BackupPC_tarPCCopy /var/lib/backuppc/pc | tar xvPf
> - I should run:
> su backuppc
> cd /var/lib/backuppc/pc/host2
> mkdir 0
&g
Les Mikesell wrote:
>
> You get new directories and attrib files in each run. The only plain
> file in the list does have the same inode number - and a count of 4 links.
>
Yes, you are right :-)
cp -rl /var/lib/backuppc/pc/host1/0 /var/lib/backuppc/pc/host2/0
does what I want.
Also
cp -al /var/
Try something like this:
/usr/local/bin/mypreexec $host -f -q -x -w 10 -s $share -t $cmdType
Note: all "-?" parameters trigger actions in my script "mypreexec".
$host as well as $cmdType will be filled in by backuppc with the documented
values.
br
Matthias
--
Don't Panic
--
I use rsyncd to back up my (Windows) clients.
The first backup of each client takes a really long time.
Would it be faster to use smb or tar for the initial backup?
Is it possible to switch back to rsyncd after this initial backup?
Would I retain all the benefits of rsyncd starting with the 2
Tino Schwarze wrote:
> On Sat, Feb 07, 2009 at 01:56:33AM +0100, Matthias Meyer wrote:
>> I use rsyncd to backup my (Windows) clients.
>> The first backup of each client needs a really long time.
>>
>> Would it be faster if I use smb or tar for the initial backup?
>
Sometimes my backups abort with different failures.
Does anybody know the meaning of/reason for the following failures?
1) Aborting backup up after signal ALRM
2) Backup aborted (Child exited prematurely)
3) Aborting backup up after signal INT
4) Aborting backup up after signal PIPE
5) Backup aborted (l
John Rouillard wrote:
> On Thu, Feb 12, 2009 at 05:44:33PM +0100, Matthias Meyer wrote:
>> Sometimes my backups abort with different failures.
>> Anybody know the meaning of/reason for the following failures?
>> 1) Aborting backup up after signal ALRM
>
> The backup
I have some big files (*.pst ;-)) to back up over a small connection.
Could I improve the transfer by using --partial-dir?
My goal is that rsync doesn't transfer the whole file again, but only the
part which has not been transferred before.
Thanks
Matthias
--
Don't Panic
Tino Schwarze wrote:
> Hi Matthias,
>
> On Sat, Feb 14, 2009 at 12:39:21PM +0100, Matthias Meyer wrote:
>
>> I have some big files (*.pst ;-)) to backup over a small connetion.
>> Could I improve the transfer by using --partial-dir?
>> My goal is that rsync didn
Hi,
I back up a Windows client with rsyncd over ssh. I am pretty sure the ssh
connection was interrupted at 23:27.
In the /var/lib/backuppc/pc/st-ms-wv/XferLOG.0.z I found the error message:
create 770 4294967295/4294967295 240986 Help/Windows/de-DE/artcone.h1s
Read EOF:
Tried again: got
mohan infant wrote:
> hello sir i'm final year engineering student now i'm working a project
> BackupPC
> but i have lot of problem in this project
> the backup is normaly done...but the automatic incremental backup is not
> be done..
> i have seen the error "host Backup summary > * this PC
Dear all,
How scalable is backuppc?
Where are the limits, and what can produce performance bottlenecks?
I've heard that hardlinks can be a problem if there are millions of them.
Is that true?
I can imagine that backuppc has a lot of work to find identical files if
there are too many of them
Wayne L Andersen wrote:
> So it sound slike the best solution is to re-install.
>
> Can I do that without destroying my current backups?
:-O Your backups are in cpool (compressed) and/or pool (uncompressed).
Regarding your subject, you have lost your backups already!
>
> Is there a way to do tha
Mike Dresser wrote:
>
>
> Matthias Meyer wrote:
>> Dear all,
>>
>> How scalable is backuppc?
>> Where are the limits or what can produce performance bottlenecks?
>>
>> I've heard about hardlinks which can be a problem if theire are millions
&
Hi,
I have a problem with the initial backup of a Vista client, called
"myclient". This client is connected over the internet with 1 Mbit/s upload
bandwidth and has 100 GB of storage to back up.
Unfortunately the internet connection is interrupted each night for 5
minutes.
2009-03-31 23:59:19 B
Laurin d'Volts wrote:
> I'm trying to access the CGI-interface of backuppc using a Debian Sid
> distro. Backuppc doesn't want to display the CGI interface in the
> browser. I got the program to work on a different computer (w/ Debian
> Sid but not apt-get upgrade for a while), so I don't fully und
Odhiambo Washington wrote:
> ls -al BackupPC_Admin
try
ls -alh /usr/share/backuppc/cgi-bin/index.cgi
because that is the "BackupPC_Admin" installed by the debian package.
If it is not setuid, you have to set the setuid bit:
chmod u+s /usr/share/backuppc/cgi-bin/index.cgi
br
Matthias
--
Don't Panic
---
Hi,
I've installed rrdtool on my system and found the "BackupPC Pool Size" graph
on the server status page. I didn't know before that backuppc can display
measurements from rrdtool.
Are there any other rrdtool data which can be displayed in the backuppc GUI?
Is there documentation about this integration?
Nils Breunese (Lemonbit) wrote:
> Matthias Meyer wrote:
>
>> I've installed rrdtool on my system and found the "BackupPC Pool
>> Size" in
>> the server status page. I didn't know before that backuppc can display
>> measures from rrdtool.
&
dan wrote:
> http://linux.slashdot.org/article.pl?sid=09/04/05/39
>
> just an FYI for those that are fans of debian and of freebsd.
>
> Debian is now shipping with a freebsd kernel. This will give a debian
> userland to the freebsd kernel, and access to debian packages.
>
> I cant wait to
Boniforti Flavio wrote:
> Hello list!
>
> This is the log from my last attempt to sync a directory containing 2 big
> files:
>
> Executing DumpPreUserCmd: /var/lib/backuppc/tunnel.sh -fC
> administra...@mail.omvsa.ch -L 8874:127.0.0.1:873 sleep 60
> SSH started successfully.
> incr backup starte
Les Mikesell wrote:
>>> /bin/nice -n 10 /bin/autossh -R :localhost:22 -M
>>> 0 -N -C -i /etc/ssh/id_rsa -p 22 u...@server.domain.at or
>>> install it as a service via cygrunsrv.
>>> The ssh tunnel is open the whole day and my server can
>>> connect to the client whenever he want.
>>
>> Nice, but
Hi,
If I have a partial backup and a new full backup is running,
does backuppc store all files already saved in the partial backup in the
/new directory, or only the newly saved files?
Thanks
Matthias
--
Don't Panic
--
Stay on
Boniforti Flavio wrote:
> Hello list.
>
> As I'm checking backups this morning, I see:
>
> delete d 777 0/0 0 VARIE
> attribWrite(dir=fUfficio) ->
> /var/lib/backuppc/pc/mail.cometti.ch/new/fUfficio/attrib
> attribWrite(dir=) -> /var/lib/backuppc/pc/mail.cometti.ch/new//attrib
Boniforti Flavio wrote:
> Hello there.
>
> I read the chapter "How BackupPC Finds Hosts" and I found out that my
> hosts don't get "found" if using the suggested
>
> perl -e 'print(gethostbyname("myhost") ? "ok\n" : "not found\n");'
>
> Thus I'm asking: how does BackupPC ping hosts which actual
soume86 wrote:
> Hello to all,
> I have just installed backuppc on a machine ubuntu.
> The installation it is crossed(spent) well. My problem is the following
> one:
>
> I added a machine to protect goes the interface of administration.
> And when I click to Start the complete saving and when I r
Hi,
I am thinking about encrypted backups and found rsyncrypto.
Is there BackupPC_dump support for rsyncrypto?
Or any other way to use rsyncrypto with backuppc?
Thanks
Matthias
--
Don't Panic
Chris Robertson wrote:
> Matthias Meyer wrote:
>> Hi,
>>
>> I think about an encrypted backup and find rsyncrypto.
>> Is there a BackupPC_dump support for rsyncrypto?
>> Or any other way to use rsyncrypto with backuppc?
>>
>
> From the looks of
soume86 wrote:
>
> Hello,
>
> I think that I am not good explained.
> I want to save a machine Linux.
> When I click to save the machine,
> I have the following error:
> The saving failed (fileListReceive failed).
>
> And that I would want knowledge that it is necessary
> to install samba ( smb
Chris Robertson wrote:
>
> Rsyncrypto doesn't seem to send the files directly. From the tutorial
> (http://www.linux.com/feature/125322) linked from the home page
> (http://rsyncrypto.lingnu.com/index.php/Home_Page):
>
> "rsyncrypto is designed to be used as a presync option to rsync. That
> is,
Boniforti Flavio wrote:
> Hello list.
>
> Is it possible to extract the value of the *effectively transferred
> data* in a backup session?
>
> Thanks in advance,
> Flavio Boniforti
>
You will find it in $HOME/pc//backups.
See the backuppc online documentation for a description of this file.
br
Matt
Tim Cole wrote:
> I have a similar question. I'm using rsync for transfer and need to
> exclude a few windows directories such as system volume. I added the
> path to the directory as rsync sees it on the other end
> "/mnt/windows/system volume information" but it doesn't get
> excluded. I see
Tim Cole wrote:
> Sorry for the html - I forget thunderbird is set to do that by default.
> I am not using cygwin, so the windows shares are mounted via cifs on an
> embedded linux appliance at the remote end. The path to the mounted
> shares on the remote end is "/mnt/lpl/" I have actually been
Robert J. Phillips wrote:
> When I first started using BackupPC I was letting the backups happen
> with SMB. I have since found and figured out how to make it work the
> process to use vshadow and rsyncd.
>
>
>
> I think now that I understand the vshadow command that I could use
> vshadow and
Dean Lambourn wrote:
> All,
>
> Is there a limit on the folder depth for backups? I recently noticed
> that on at least one system not all files were being backed up despite
> the backup being listed as successful. On the system in question,
> backup files exist in this folder:
>
> fC$/fDocume
Boniforti Flavio wrote:
> Hello list,
>
> I'd like to forget about the archiving host and its function now that
> I've set it up. Is there any way to schedule it to run at a given time
> from the BackupPC interface? The only alternative solution I can think
> of is using a cron job... What are yo
Benedict simon wrote:
>
> Dear All,
>
> I am new to backupPC and do have a problem with backing up a Linux hosts
> here is my configuration
>
> i have a Backup PC server on Centos 5.3
> i have downloaded installed backuppc as per intructions form the site but
> from source rpm
> also installed
Christian Baus wrote:
> Hi Duds,
>
> It's my first time to send to this list! So I want to say thank you very
> much for
> this great software!
>
> We backup an network with 3 windows server and 6 linux servers, data
> about 1TB.
>
> The Backup works fine! But when BackupPC send an mail to noti
clint woodrow wrote:
>
>
> Jeffrey J. Kosowsky wrote:
>> Adam Goryachev wrote at about 13:42:41 +1000 on Wednesday, June 3, 2009:
>>
>> > clint woodrow wrote:
>> >
>> > >
>> > > Matthias Meyer wrote:
>> > &
Hi,
Is it possible to run Backuppc on a different server than the web interface?
thanks
Matthias
--
Don't Panic
Holger Parplies wrote:
> Hi,
>
> Matthias Meyer wrote on 2009-06-08 23:16:48 +0200 [[BackupPC-users] Is it
> possible to split backuppc and gui to two different servers]:
>>
>> Is it possible to run Backuppc on another server as the web interface?
>
> yes, but yo
Hi,
Unfortunately I am not a Perl programmer.
Hence my, maybe stupid, question ;-)
Is it possible to include a specification within the
$Conf{BackupFilesExclude}?
Something like:
$Conf{BackupFilesExclude} = {
'WINDOWS' => [
'/Downloaded Program Files',
'/Offline Web Pages',
'/
Jeffrey J. Kosowsky wrote:
> Matthias Meyer wrote at about 18:00:22 +0200 on Thursday, June 11, 2009:
> > Hi,
> >
> > Unfortunately I am not a perl programmer.
> > Therefore my, maybee stupid, question ;-)
> >
> > Is it possible to includ
Erik Hjertén wrote:
> Hi all
>
> I'm using Backuppc on an Ubuntu-server and Deltacopy on windows clients.
How do you call DeltaCopy from the Backuppc Server?
> The incremental backup runs inludes all files, but only every other day.
What do you mean by "but only every other day"?
> The back
:-) Yes, thats what I am looking for!
So if I want to include the Vista junctions from another file
I should create a file (e.g. /path/to/junction_definitions.pl)
This file should contain the statements:
# define the list of junctions for Vista in English:
@VistaJunctions_english = ( "/dir1", "/di
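Put together, the external file and its use in the host config could look like this sketch. The file path and junction names are illustrative examples (the original list is truncated), and the require-based include is an ordinary Perl idiom, not something specific to BackupPC:

```perl
# /path/to/junction_definitions.pl  (hypothetical path)
# define the list of junctions for Vista in English:
@VistaJunctions_english = (
    '/Users/All Users',
    '/Users/Default User',
);
1;  # a require'd file must end with a true value

# --- in the host's config file ---
require '/path/to/junction_definitions.pl';

$Conf{BackupFilesExclude} = {
    'C' => [@VistaJunctions_english, '/pagefile.sys'],
};
```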
Is it possible to integrate the Backuppc GUI into my own homepage and
reuse the login to my homepage as the login to backuppc (single sign-on)?
Are there any directions or hints which I could give to a web programmer?
Thanks
Matthias
--
Don't Panic
-
Erik Hjertén wrote:
> Thanks for your reply.
>
> Matthias Meyer skrev:
>> Erik Hjertén wrote:
>>
>>
>>> Hi all
>>>
>>> I'm using Backuppc on an Ubuntu-server and Deltacopy on windows clients.
>>>
>> How do y
Matthias Meyer wrote:
> Erik Hjertén wrote:
>
>> Thanks for your reply.
>>
>> Matthias Meyer skrev:
>>> Erik Hjertén wrote:
>>>
>>>
>>>> Hi all
>>>>
>>>> I'm using Backuppc on an Ubuntu-server and
Les Mikesell wrote:
> Matthias Meyer wrote:
>> Is it possible to integrate the Backuppc GUI within my own homepage and
>> reuse the login to my homepage as login to backuppc (single sign on)?
>> There are some directions or hints which I have to give to a web
>> progr
Erik Hjertén wrote:
>
>>> Please set $Conf{XferLogLevel} = 2;
>>> Than check the XferLOG after a backup.
>>> Look for:
>>> create = new for this backup
>>> pool= found a match in the pool
>>> same= file is identical to previous backup
>>> skip= file skipped in incremental because att
Erik Hjertén wrote:
> Matthias Meyer skrev:
>>
>>> -
>>> incr backup started back to 2009-06-13 10:59:37 (backup #341) for
>>> directory data Connected to eddie:873, remote version 29
>>> Negotiated protocol version 28
>>> Connected to mod
Erik Hjertén wrote:
> Matthias Meyer skrev:
>> I'm with you. But your incrementals will backup all files changed since
>> last full. Do you want that?
>>
> Not necessarily, but now it behaves as expected.
>> I make a full backup each 7 days and use $Conf{
vezza wrote:
>
> Could someone please explain me how to solve this problem?
>
> I have installed a new BackupPC server, mounted the __top-dir__ of the old
> server into the __top-dir__ of the new server but when I go to the CGI
> page that lists backups it output this error...
>
> what I am mis