[BackupPC-users] how to restore a full backup with incremental files

2013-06-27 Thread niraj_vara
Hi

   I have been taking backups via BackupPC regularly. The schedule takes three 
incremental backups and then a full backup.

  For example, the system takes a full backup on Friday and incremental backups 
on Saturday, Sunday, and Monday.

Suppose my system goes down on Sunday; when I checked the backup system, there 
were only incremental backups on it.

If I want to restore a full backup, I have to choose the Friday backup, but 
then I will miss the Saturday and Sunday changes.

What do I need to do to restore the full backup together with the latest files?






[BackupPC-users] backup timeout

2013-06-27 Thread Nicola Scattolin
Good morning to all,

I get some errors when running incremental backups, in different directories:

Backup aborted (Call timed out: server did not respond after 20000 
milliseconds listing \STORAGE\giancarla\documenti giancarla\*)

Sometimes the backup completes but takes less than 5 minutes, while it usually 
takes at least 20 minutes. I haven't changed the configuration, and I can 
access the directory with the file explorer.

Any ideas?

-- 
Nicola
Ser.Tec s.r.l.
Via E. Salgari 14/E
31056 Roncade, Treviso
http://dpidgprinting.com




Re: [BackupPC-users] how to restore a full backup with incremental files

2013-06-27 Thread Nicola Scattolin
Restoring the most recent incremental backup will restore the latest full 
backup plus the files changed in the incrementals; BackupPC merges an 
incremental with the backups it depends on when you browse or restore it, so 
just pick the newest backup.
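
As an illustration only (the host name, share name, and install path below are 
placeholders, and the exact location of the BackupPC binaries depends on your 
distribution), the same merged view can be pulled from the command line on the 
BackupPC server with BackupPC_tarCreate, where -n -1 selects the most recent 
backup:

    /usr/share/backuppc/bin/BackupPC_tarCreate -h myhost -n -1 -s /home . > /tmp/myhost-latest.tar

The resulting tar file already contains the last full backup with the later 
incremental changes applied, which is also what the CGI restore page gives you 
when you select the newest backup.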
On 27/06/2013 09:26, niraj_vara wrote:
 Hi

 I have been taking backups via BackupPC regularly. The schedule takes three 
 incremental backups and then a full backup.

 For example, the system takes a full backup on Friday and incremental backups 
 on Saturday, Sunday, and Monday.

 Suppose my system goes down on Sunday; when I checked the backup system, there 
 were only incremental backups on it.

 If I want to restore a full backup, I have to choose the Friday backup, but 
 then I will miss the Saturday and Sunday changes.

 What do I need to do to restore the full backup together with the latest files?






-- 
Nicola
Ser.Tec s.r.l.
Via E. Salgari 14/E
31056 Roncade, Treviso
http://dpidgprinting.com



Re: [BackupPC-users] time out

2013-06-27 Thread Mirco Piccin
Hi,

 I get some errors when running incremental backups, in different directories:

 Backup aborted (Call timed out: server did not respond after 20000 
 milliseconds listing \STORAGE\giancarla\documenti giancarla\*)

 Sometimes the backup completes but takes less than 5 minutes, while it usually 
 takes at least 20 minutes. I haven't changed the configuration, and I can 
 access the directory with the file explorer.

You can recompile smbclient to avoid this issue.

Once you have the smbclient source, edit the file:
source/libsmb/clientgen.c
and change the line:
cli->timeout = 20000;

replacing 20000 (the timeout in milliseconds) with a bigger number (e.g. 50000).
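
A minimal sketch of the edit, in Samba 3.x terms (the surrounding code and the 
exact default vary between Samba versions, so treat the function name and 
values as illustrative rather than exact):

    /* source/libsmb/clientgen.c, in the client init code (cli_initialise) */
    cli->timeout = 20000;    /* default: 20 seconds, in milliseconds */

    /* raise it to something more forgiving for slow directory listings, e.g. */
    cli->timeout = 120000;   /* 2 minutes */

After rebuilding, make sure BackupPC actually uses the patched binary, for 
example by pointing $Conf{SmbClientPath} at it.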

Regards
M



Re: [BackupPC-users] time out

2013-06-27 Thread Nicola Scattolin
I'm not sure. This backup has always worked well, so I don't think it's 
Samba. I will try rebooting the machine; maybe there is some stuck process 
slowing it down, or some other Windows problem. Then I will try recompiling 
Samba.

On 27/06/2013 10:51, Mirco Piccin wrote:
 Hi,

 I get some errors when running incremental backups, in different directories:

 Backup aborted (Call timed out: server did not respond after 20000 
 milliseconds listing \STORAGE\giancarla\documenti giancarla\*)

 Sometimes the backup completes but takes less than 5 minutes, while it usually 
 takes at least 20 minutes. I haven't changed the configuration, and I can 
 access the directory with the file explorer.

 You can recompile smbclient to avoid this issue.

 Once you have the smbclient source, edit the file:
 source/libsmb/clientgen.c
 and change the line:
 cli->timeout = 20000;

 replacing 20000 (the timeout in milliseconds) with a bigger number (e.g. 50000).

 Regards
 M



-- 
Nicola
Ser.Tec s.r.l.
Via E. Salgari 14/E
31056 Roncade, Treviso
http://dpidgprinting.com




Re: [BackupPC-users] time out

2013-06-27 Thread Nicola Scattolin
As expected: I rebooted the machine and it works now.

On 27/06/2013 11:01, Nicola Scattolin wrote:
 I'm not sure. This backup has always worked well, so I don't think it's 
 Samba. I will try rebooting the machine; maybe there is some stuck process 
 slowing it down, or some other Windows problem. Then I will try recompiling 
 Samba.

 On 27/06/2013 10:51, Mirco Piccin wrote:
 Hi,

 I get some errors when running incremental backups, in different directories:

 Backup aborted (Call timed out: server did not respond after 20000 
 milliseconds listing \STORAGE\giancarla\documenti giancarla\*)

 Sometimes the backup completes but takes less than 5 minutes, while it usually 
 takes at least 20 minutes. I haven't changed the configuration, and I can 
 access the directory with the file explorer.

 You can recompile smbclient to avoid this issue.

 Once you have the smbclient source, edit the file:
 source/libsmb/clientgen.c
 and change the line:
 cli->timeout = 20000;

 replacing 20000 (the timeout in milliseconds) with a bigger number (e.g. 50000).

 Regards
 M





-- 
Nicola
Ser.Tec s.r.l.
Via E. Salgari 14/E
31056 Roncade, Treviso
http://dpidgprinting.com




Re: [BackupPC-users] backup timeout

2013-06-27 Thread Michael Stowe

 Good morning to all,

 I get some errors when running incremental backups, in different directories:

 Backup aborted (Call timed out: server did not respond after 20000 
 milliseconds listing \STORAGE\giancarla\documenti giancarla\*)

 Sometimes the backup completes but takes less than 5 minutes, while it usually 
 takes at least 20 minutes. I haven't changed the configuration, and I can 
 access the directory with the file explorer.

 Any ideas?

I have a few ideas, many of which involve NOT posting the exact same thing
to the same list half a dozen times.

I'm guessing that you're using smbclient, and note that the error message
is about how long it takes to list a directory. All kinds of things can
affect performance; for example, if you're posting messages to a mailing
list at an insane rate from the same directory, it may slow things down.
You can always increase the smbclient timeout, or use a different backup
method.
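
A rough sketch of the second option, switching the host to rsyncd (all values 
below are placeholders, and an rsync daemon such as cwRsync has to be set up 
on the Windows client first); in the host's config.pl:

    $Conf{XferMethod}     = 'rsyncd';
    $Conf{RsyncShareName} = ['storage'];   # rsyncd module name exported by the client
    $Conf{RsyncdUserName} = 'backuppc';
    $Conf{RsyncdPasswd}   = 'secret';

rsync does not use smbclient's fixed 20-second call timeout when listing large 
directories, although the BackupPC-side $Conf{ClientTimeout} still applies.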



Re: [BackupPC-users] Tar exited with error 36096

2013-06-27 Thread Mike Bosschaert

  
  
Thanks Craig and Stephen for taking the time to dive into this, and excuse my 
late reply (I have been out of town and could not access the backup server).

My tar version is 1.22 (2009); as far as I could test, it supports file and 
link names longer than 100 characters. It is part of Ubuntu Server 11.04.

Although the Dropbox directory generates errors, which suggests character-set 
problems, some regular native ext4 directories (with longer directory/file 
names) also produce errors. I have the impression it is not a character-set 
problem.

For some reason I CAN make incremental backups (which do report errors on the 
long directory names), but the process does not crash.

One detail which may be important: I use an external USB disk (2GB) mounted on 
/home/backuppc, and I have linked /var/lib/backuppc -> /home/backuppc. My last 
full backup is from January 2013, without any errors. The directory structure 
has not changed significantly (the directories producing errors now did exist 
at that time, but were fully accessible, with no size limits present). The 
system setup has not changed since then, nor has the system been updated.

Is there a way to increase the debug reporting level to get more clues?

On 06/25/2013 04:42 AM, Craig Barratt wrote:

 I find it interesting that the client file path is being cut off at ~100 
 characters.

 That's a very good observation and an important clue. The file header in any 
 tar file is limited to 100 characters, and there is a special extension 
 (basically another dummy file header with a payload containing the real long 
 path) to allow longer paths. The same mechanism is used for long soft-link 
 targets.

 So either this is a very old version of tar, as you suggest, or the extension 
 header isn't being recognized properly.

 Craig


Re: [BackupPC-users] Tar exited with error 36096

2013-06-27 Thread Les Mikesell
On Thu, Jun 27, 2013 at 2:48 PM, Mike Bosschaert insomn...@gmail.com wrote:

 For some reason I CAN make incremental backups (which do report errors on
 the long directory names). But the process does not crash.

That probably just means there aren't any new files to write there.

 One detail which may be important: I use an external USB disk (2GB) mounted
 on /home/backuppc, and I have linked /var/lib/backuppc -> /home/backuppc. My
 last full backup is from January 2013, without any errors. The directory
 structure has not changed significantly (the directories producing errors
 now did exist at that time, but were fully accessible, with no size limits
 present). The system setup has not changed since then, nor has the system
 been updated.

What file system type is on the USB drive?

 Is there a way to increase the debug reporting level to get more clues?

Can you create the same file manually with the full path? The shell might
give you a better error message.
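
For example (the paths and the 150-character name are just placeholders; the
point is to go past tar's 100-character header limit and see whether the
filesystem itself complains), something like this, run on the backup disk:

    cd /home/backuppc
    mkdir -p longpath-test
    touch "longpath-test/$(printf 'x%.0s' {1..150})"
    ls longpath-test/

If the touch fails, the limit is in the filesystem on the USB disk (VFAT
behaves very differently from ext4 here); if it succeeds, the problem is more
likely in tar or in how BackupPC drives it.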

--
   Les Mikesell
 lesmikes...@gmail.com



Re: [BackupPC-users] Backup the backup to online provider

2013-06-27 Thread Raman Gupta
On 06/25/2013 12:05 PM, Carl Wilhelm Soderstrom wrote:
 On 06/25 11:55 , Raman Gupta wrote:
 For disaster recovery purposes, I have been periodically backing up my
 BackupPC pool to external storage. I have a small pool of
 approximately 300 GB on a Linux server, and currently use rsync to
 copy the pool to storage and keep it updated.
 
 That mechanism won't scale much farther. Rsync chokes on the number of files
 that BackupPC (at least the 3.x version) uses, because of the hardlinks.
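
For reference, the kind of hardlink-preserving pool copy being discussed looks
roughly like the following (placeholder paths; the -H flag is what makes rsync
track every hardlink in the pool, and that is where the memory and time go on
a large BackupPC 3.x pool):

    rsync -aH --delete /var/lib/backuppc/ /mnt/offsite/backuppc/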

Agreed, which is one of the reasons I want to replace rsync, and takes
me back to my original question:

 Does anyone have any positive or negative experiences to
 share with using CrashPlan, or any similar provider [1], for backing
 up their pool?

On 06/25/2013 12:05 PM, Carl Wilhelm Soderstrom wrote:
 BPC 4 may be better able to replicate its data pool remotely. It's very
 alpha at the moment, though, and I don't know if anyone is yet trying to
 hammer out how to do remote replication with it and what the limiting values are.

Yes, I've been following the information about BPC 4 on the dev list.

Regards,
Raman



Re: [BackupPC-users] Tar exited with error 36096

2013-06-27 Thread Craig Barratt
Mike,

Is there a way to increase the debug reporting level to get more clues?


Les' suggestions are a good way to see if it is a problem on the source
file system.

For BackupPC, first try to limit the backup to the smallest case that shows
the problem (by excluding files or creating a simple test tree to back up).

Then set $Conf{XferLogLevel} to 9.  You should get a lot of output in the
XferLOG file.

You then have several options, e.g.: (1) send the XferLOG file to the list if
it is short, or to me if it is large (first make sure it doesn't contain
anything confidential like file names, host names, IP addresses, etc.);
(2) try manually running the tar command (that generates the archive) shown
at the top of the XferLOG file, and pipe the output to tar -tvf -.  Does
it run successfully?  Do you see the full path?
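
A rough sketch of that sequence (the ssh/tar command and the grep pattern
below are only placeholders; the real client command, with the correct
options, is printed verbatim near the top of the XferLOG, so copy it from
there):

    # in the host's config.pl
    $Conf{XferLogLevel} = 9;

    # then re-run the archive command from the XferLOG and list what it produces
    ssh -q -x -n -l root clienthost env LC_ALL=C tar -c -v -f - -C /home --totals . | tar -tvf - | grep 'some/long/path'

If the long path appears complete in the listing, the client's tar is writing
the long-name extension correctly and the problem is more likely on the
BackupPC side; if it is truncated at 100 characters, the client tar (or the
options it is run with) is the place to look.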

Craig


On Thu, Jun 27, 2013 at 12:48 PM, Mike Bosschaert insomn...@gmail.com wrote:

  Thanks Craig and Stephen for taking the time to dive into this, and excuse
 my late reply (I have been out of town and could not access the backup server).

 My tar version is 1.22 (2009); as far as I could test, it supports file and
 link names longer than 100 characters. It is part of Ubuntu Server 11.04.

 Although the Dropbox directory generates errors, which suggests character-set
 problems, some regular native ext4 directories (with longer directory/file
 names) also produce errors. I have the impression it is not a character-set
 problem.

 For some reason I CAN make incremental backups (which do report errors on
 the long directory names), but the process does not crash.

 One detail which may be important: I use an external USB disk (2GB) mounted
 on /home/backuppc, and I have linked /var/lib/backuppc -> /home/backuppc. My
 last full backup is from January 2013, without any errors. The directory
 structure has not changed significantly (the directories producing errors
 now did exist at that time, but were fully accessible, with no size limits
 present). The system setup has not changed since then, nor has the system
 been updated.

 Is there a way to increase the debug reporting level to get more clues?





 On 06/25/2013 04:42 AM, Craig Barratt wrote:

  I find it interesting that the client file path is being cut off at ~100
 characters.

  That's a very good observation and an important clue.  The file header
 in any tar file is limited to 100 characters, and there is a special
 extension (basically another dummy file header with a payload containing
 the real long path) to allow longer paths.  The same mechanism is used for
 long soft-link targets.

  So either this is a very old version of tar, as you suggest, or the
 extension header isn't being recognized properly.

 Craig

