New question #404018 on Duplicity:
https://answers.launchpad.net/duplicity/+question/404018

Hi there,
I'm facing an interesting situation and I'm sure I'm not alone:
I have a backup set that comprises far more data than my internet 
connection can handle within 24 hours (the set is about 200 GB, which 
would take roughly a week to upload). 
Unfortunately, my ISP disconnects my internet connection every 24 hours and 
assigns me a new IP. The backends I've tried so far (SSH and Dropbox) 
cannot handle the closed socket (even though connectivity is back 
after a few seconds). 
I've tried quite a few things but failed in the end. So, I have some questions:
1) Does it somehow harm the quality of the backup if I restart the backup 
process manually (or via a bash script) 20 times? Resuming the backup that 
often doesn't strike me as a good solution, but currently I see no other option. 
I would really appreciate your opinion on that.
2) Are there, or will there be, any backends that can handle such a situation? In 
principle it's pretty simple: the backend would "only" have to redo 
authentication and reconnect completely in case of a permanent error (at least 
attempting this before giving up would be very useful).
3) Is anybody here encountering the same problem who has maybe found a different 
solution that I haven't thought of yet?
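To make question 1 concrete, this is roughly the kind of restart loop I have in mind. It's only a sketch: the source directory, target URL, retry count, and the RETRY_SLEEP knob are all placeholders of mine, and I'm assuming duplicity resumes from the last complete volume when re-run (please correct me if that's wrong):

```shell
#!/bin/sh
# Re-run a command until it succeeds, up to a maximum number of attempts.
# Assumption: duplicity exits non-zero when the connection drops, and a
# re-run picks up where the previous one left off.
retry() {
    max=$1; shift
    attempt=1
    until "$@"; do
        if [ "$attempt" -ge "$max" ]; then
            echo "giving up after $max attempts" >&2
            return 1
        fi
        attempt=$((attempt + 1))
        sleep "${RETRY_SLEEP:-30}"  # give the connection time to come back
    done
}

# Hypothetical invocation (source dir and target URL are placeholders):
# retry 20 duplicity /home/nils sftp://user@backuphost/duplicity
```

Would running something like this ~20 times over a week leave the backup set in a consistent state?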

Thanks in advance
Have a nice weekend!
Nils 

-- 
You received this question notification because your team duplicity-team
is an answer contact for Duplicity.

_______________________________________________
Mailing list: https://launchpad.net/~duplicity-team
Post to     : [email protected]
Unsubscribe : https://launchpad.net/~duplicity-team
More help   : https://help.launchpad.net/ListHelp
