lleeaaffss wrote:
>
>
> I downloaded a program called Recover My Files, but in order to save the
> files, I need a serial key. Is there any way to get this without purchasing
> the software? If not, are there any other programs or alternative solutions?
>
> Thanks in advance!
You have post
Hi everyone,
I'm new to this and I'm not sure if I'm posting in the right area; if I'm not,
sorry about that!
I recently had a problem with my laptop: the screen turned black after startup
and wouldn't do anything after that. I took it in to get it fixed and the
technician said he would back
Jeffrey J. Kosowsky wrote:
> This may be a naive question, but I was wondering what is the state of
> BackupPC development? (I couldn't find answers on the sourceforge
> site)
This is the users mailinglist. The development mailinglist may have
more info for you: https://lists.sourceforge.net/li
> Within the key you can add the directories relative to the RsyncShareName,
> in Linux syntax.
> e.g.:
> $Conf{RsyncShareName} = [
> 'D',
> 'C'
> ];
> $Conf{BackupFilesExclude} = {
> 'C' => [
> '/WINDOWS/Downloaded Program Files',
> '/WINDOWS/Offline Web Pages',
> '/WINDOWS/Temp',
> '
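Untruncated, that style of per-share exclude map looks roughly like the sketch below; the `D` share is left without excludes and the directory names are only illustrative:

```perl
# Sketch of a per-share exclude configuration (entries are examples, not
# the original poster's full list).
$Conf{RsyncShareName} = [ 'D', 'C' ];
$Conf{BackupFilesExclude} = {
    'C' => [
        '/WINDOWS/Downloaded Program Files',
        '/WINDOWS/Offline Web Pages',
        '/WINDOWS/Temp',
    ],
};
```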
Nick Smith wrote:
>>
>> I declare the exclude list within the GUI of backuppc.
>>
>> Declare the "RsyncShareName" in the same manner as they are declared in
>> your rsyncd.conf in your windows client.
>> Define BackupFilesExclude:
>> NewKey = "*" if it should be applicable to all RsyncShareName or th
Matthias Meyer wrote:
Hi,
Don't know why, but today I have another point of view ;-)
I try to back up Windows XP as well as Vista to a Linux server.
I use backuppc 3.1.0 as backup-server (Debian, ext3), cygwin as client
environment and rsyncd for the transport.
The backup seems to work well but t
>
> I declare the exclude list within the GUI of backuppc.
>
> Declare the "RsyncShareName" in the same manner as they are declared in your
> rsyncd.conf in your windows client.
> Define BackupFilesExclude:
> NewKey = "*" if it should be applicable to all RsyncShareName or the
> RsyncShareName to whic
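For reference, the `'*'` key in `$Conf{BackupFilesExclude}` applies one exclude list to every share; a minimal sketch (exclude paths illustrative):

```perl
# '*' acts as the default exclude list for all RsyncShareName entries.
$Conf{BackupFilesExclude} = {
    '*' => [ '/temp', '/pagefile.sys' ],
};
```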
On Fri, Dec 5, 2008 at 8:47 AM, Bowie Bailey <[EMAIL PROTECTED]> wrote:
> David Rees wrote:
>> I've got one server that all it does is back itself up that gives this
>> error. It started out only occasionally failing, but now I can't
>> complete a full backup without it bailing out with
>>
>> I've
Nick Smith wrote:
> I currently have several Windows server backup clients where I use
> volume shadow copy to back up the data: a pre-script launches the
> shadow and maps it to drive B on the Windows box, BackupPC then
> backs up B over rsync, and a post-script kills the volume shadow.
>
I have been finding the following intermittent problem when backing up
some of my WinXP machines.
The backup seems to stall for 12+ hours with nothing being written to
disk (i.e. no 'new' folder). When I dequeue the backup, I get a
"fileListReceive failed" error. (I imagine it would hang there unt
Martin Sarsale wrote at about 14:06:37 -0300 on Friday, December 5, 2008:
> > - sshfs to allow you to mount and browse any backup as if it were
> > still a live filesystem. (This would make all the pooling,
> > compression, and database backend transparent to the user). This
> > would be ver
I currently have several Windows server backup clients where I use
volume shadow copy to back up the data: a pre-script launches the
shadow and maps it to drive B on the Windows box, BackupPC then
backs up B over rsync, and a post-script kills the volume shadow.
What i would like to do is cr
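The pre/post pattern described above can be sketched with Windows' built-in `diskshadow` (Server 2008 and later; `vshadow` from the SDK works similarly on older systems). The drive letter, alias, and script paths here are illustrative, not from the original poster's setup:

```
REM pre-backup.cmd — create a shadow of C: and expose it as B:
diskshadow /s C:\backuppc\make-shadow.dsh

REM make-shadow.dsh
set context persistent
add volume C: alias SysVol
create
expose %SysVol% B:

REM post-backup.cmd — tear the shadow down after rsync finishes
diskshadow /s C:\backuppc\kill-shadow.dsh

REM kill-shadow.dsh
delete shadows exposed B:
```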
> - sshfs to allow you to mount and browse any backup as if it were
> still a live filesystem. (This would make all the pooling,
> compression, and database backend transparent to the user). This
> would be very powerful.
this could be easily implemented using FUSE to mount the backup
locally a
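The pooling that an sshfs/FUSE view would have to hide is essentially content-addressed storage: each unique file lives once in the pool and backups hard-link to it. A minimal stand-alone sketch of that idea (simplified naming; real BackupPC hashes partial file contents, compresses, and handles hash-collision chains):

```python
import hashlib
import os
import tempfile

def pool_file(pool_dir: str, src_path: str) -> str:
    """Store src_path in a content-addressed pool and return the pool path.

    Identical file contents map to one pool entry, so duplicate files
    across backups consume the space of a single copy.
    """
    with open(src_path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    # BackupPC fans pool files out into subdirectories by hash prefix.
    subdir = os.path.join(pool_dir, digest[0], digest[1])
    os.makedirs(subdir, exist_ok=True)
    pool_path = os.path.join(subdir, digest)
    if not os.path.exists(pool_path):
        os.link(src_path, pool_path)  # first copy: hard-link into the pool
    return pool_path

# Demo: two files with identical contents resolve to one pool entry.
with tempfile.TemporaryDirectory() as tmp:
    a = os.path.join(tmp, "a.txt")
    b = os.path.join(tmp, "b.txt")
    for p in (a, b):
        with open(p, "w") as f:
            f.write("same contents")
    pool = os.path.join(tmp, "pool")
    pa = pool_file(pool, a)
    pb = pool_file(pool, b)
    deduplicated = (pa == pb)
    links = os.stat(pa).st_nlink  # a.txt plus the pool entry

print(deduplicated, links)
```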
David Rees wrote:
> On Thu, Dec 4, 2008 at 7:24 PM, Nick Smith <[EMAIL PROTECTED]>
> wrote:
> >
> > Did you ever get this resolved? I'm having the same problem; now all
> > of my backups are failing with the same errors you are getting. I'm
> > using 2.6.9 protocol version 29. Ubuntu doesn't seem
Hi,
My largest BackupPC server's disk filled. I had some systems that
needed to be archived and removed, which I did, but I only have 2.9G
free out of 5T. One of the systems I'm no longer backing up was quite
large. What's the right way to clean the pool? Can I run the nightly
cleanup
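Pool space for removed hosts is only reclaimed once `BackupPC_nightly` has run; rather than waiting for the schedule, it can be triggered through the server. A sketch, assuming a Debian-style install path (paths vary by distro):

```
# ask the running server to kick off the nightly pool-cleanup pass
sudo -u backuppc /usr/share/backuppc/bin/BackupPC_serverMesg BackupPC_nightly run

# then watch the pool filesystem shrink
df -h /var/lib/backuppc
```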
Jeffrey J. Kosowsky wrote at about 11:19:42 -0500 on Friday, December 5, 2008:
> - sshfs to allow you to mount and browse any backup as if it were
> still a live filesystem. (This would make all the pooling,
> compression, and database backend transparent to the user). This
> would be ver
>
> Specifically, it seems to me that we should distinguish (at least)
> among the following situations for long dump/restore times
> 1. Large backups/slow links - here...
> 2. Disconnected PC or newly degraded link speed - here it would be
> nice to have a separate "timeout"...
>
> 3. Rebooted
This may be a naive question, but I was wondering what is the state of
BackupPC development? (I couldn't find answers on the sourceforge
site)
- Are major new features/extensions/improvements being implemented or
only bugfixes and limited changes?
> - Is there a roadmap and/or wishlist?
> - Is there
David Rees wrote:
> On Thu, Dec 4, 2008 at 7:24 PM, Nick Smith <[EMAIL PROTECTED]> wrote:
>
>> Did you ever get this resolved? I'm having the same problem; now all of
>> my backups are failing with the same errors you are getting. I'm using
>> 2.6.9 protocol version 29. Ubuntu doesn't seem to h
Based on my personal experience and what I see from others, it seems
that the current ClientTimeout parameter is too generic to be all that
useful, with some people at times finding it too short and at other
times too long.
Given the vagaries of Windows/cygwin/rsync, I find that sometimes
BackupPC
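For what it's worth, the timeout can at least be tuned per host today by overriding it in that host's `pc/<host>.pl` config; the value is in seconds, and 72000 (20 hours) is the shipped default. The hostname here is illustrative:

```perl
# pc/slowhost.pl — give this one client 48 hours instead of the global value
$Conf{ClientTimeout} = 172800;
```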