@Saturn,
LeeRain was just trying to spam this and other lists. Ignore him. He's been
banned.
+--
|This was sent by wcplis...@gmail.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+---
What are you referring to?
LeeRain wrote:
> So would it just turn it off automatically or did I screw things up ?
+--
|This was sent by saturn2...@gmail.com via Backup Central.
|Forward SPAM to ab...@backupcentral.com.
+--
On Wed, Jul 7, 2010 at 02:34, Saturn2888 wrote:

Ah, good point! Hmm... I wonder about -z (--compress) then, because when I've
turned that on for testing, it ran the same rsync job for days and did not
finish on any clients. I wonder if anyone's added it successfully, not that
you'd ever need or want to.
On Tue, Jul 06, 2010 at 07:01:01PM -0400, Saturn2888 wrote:
> Les Mikesell wrote:
> > On 7/6/2010 11:57 AM, Saturn2888 wrote:
> > > So would it just turn it off automatically or did I screw things up?
> > >
> > http://search.cpan.org/~cbarratt/File-RsyncP-0.68/lib/File/RsyncP.pm
> > doesn't show --inplace as an option, so I'd guess it doesn't do anything.
Oh darn, you guys are right. I'm using the forums, so it's different. Let's
see: we're allowed to add the --checksum-seed option, so I think that option
either does nothing, or it does work and the Perl RsyncP module can
accommodate more rsync options than those noted there.
On 7/6/2010 11:57 AM, Saturn2888 wrote:
>
> So would it just turn it off automatically or did I screw things up?
>
http://search.cpan.org/~cbarratt/File-RsyncP-0.68/lib/File/RsyncP.pm
doesn't show --inplace as an option, so I'd guess it doesn't do anything.
By the way, posting without quoting any
On 7/6/2010 11:08 AM, Tyler J. Wagner wrote:
> On Tuesday 06 July 2010 16:11:17 Innop wrote:
>> Are you sure about --inplace? Won't the upload be bigger?
>> *--inplace* This option changes how rsync transfers a file when its data
>> needs to be updated: instead of the default method of
It's disk write speed that has always plagued rsync, in my opinion. Normally
almost none of my Gigabit network bandwidth is in use. I've even swapped out
all the CAT5 and CAT5e cables for CAT6 to no avail, so it's definitely disk
performance. As it stated, though, it might actually make transferri
On Tuesday 06 July 2010 16:11:17 Innop wrote:
> Are you sure about --inplace? Won't the upload be bigger?
> *--inplace* This option changes how rsync transfers a file when its data
> needs to be updated: instead of the default method of creating a new copy
> of the file and moving it i
Well, it might not be faster for BackupPC, that's for sure. I've gone ahead
and removed that part of it.
Are you sure about --inplace? Won't the upload be bigger?
*--inplace* This option changes how rsync transfers a file when its data
needs to be updated: instead of the default method of creating a new copy of
the file and moving it into place when it is complete, rsync instead writes
the updated data directly to the destination file.
Add --checksum-seed=32761 and --inplace to your rsync arguments. I believe
those speed it up enough to notice.
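For reference, in a BackupPC 3.x setup those flags would be appended to the
rsync argument list in config.pl. A minimal sketch, assuming rsync-over-ssh
transfers; the surrounding defaults are illustrative, so check them against
your own config.pl before copying:

```perl
# config.pl fragment (BackupPC 3.x style) -- defaults shown are illustrative.
$Conf{RsyncArgs} = [
    '--numeric-ids', '--perms', '--owner', '--group', '-D',
    '--links', '--hard-links', '--times', '--block-size=2048',
    '--recursive',
    '--checksum-seed=32761',  # fixed seed lets checksums be cached (rsync >= 2.6.3)
    '--inplace',              # write updated data directly to the target file
];
```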
Your method is fine. I don't think it will be fast enough, though. 5 GB of
daily changes, you said? You don't want to do those all at once. My
recommendation: first, limit BackupPC to backing up a maximum of one host at
a time; second, spread the backups across multiple hosts. Instead of them be
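The first suggestion above maps to a single BackupPC setting; a hedged
config.pl sketch (BackupPC 3.x option names):

```perl
# config.pl fragment: allow at most one scheduled backup at a time.
$Conf{MaxBackups} = 1;
# Staggering hosts can then be done with per-host blackout periods or by
# adjusting $Conf{WakeupSchedule}; the details depend on your version.
```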
Thanks for your response.
I think for me the best solution is 1, 2, 3, 4 for incremental levels (if I
have no problem with backups). I will test it.
And what do you think of my method?
Thanks.
2010/7/1 Saturn2888
> I would do successive incrementals, but only if the same exact files aren't
> changing each time.
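The 1, 2, 3, 4 scheme discussed above corresponds to BackupPC's multi-level
incrementals; a hedged config.pl sketch (BackupPC 3.x setting):

```perl
# config.pl fragment: multi-level incrementals. A level-N incremental
# backs up what changed since the most recent backup of a lower level,
# so each nightly transfer stays small.
$Conf{IncrLevels} = [1, 2, 3, 4];
```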
I would do successive incrementals, but only if the same exact files aren't
changing each time. For instance, successive incrementals do not benefit you
if file X changed today, changes again tomorrow, and changes again the next
day; that is, if file X is the file that changes the most. If file X is
Hello,
I'd like your opinion on my method.
I want to back up 500 GB of data from another company. Approximately 5 GB of
data changes daily.
We have a 2 Mbit/s VPN (approximately 200 KB/s) between the two companies.
I have a computer (50 GB disk, 256 MB RAM) and an Ethernet hard disk (1 TB).
I
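As a rough sanity check on the numbers above (assuming rsync only has to move
the ~5 GB of changed data, and ignoring protocol overhead):

```python
# Estimate the daily transfer time: 5 GB of changes over a ~200 KB/s VPN.
changed_bytes = 5 * 1024**3   # 5 GB changed per day
link_rate = 200 * 1024        # ~200 KB/s usable bandwidth
hours = changed_bytes / link_rate / 3600
print(f"~{hours:.1f} hours per day")  # roughly 7.3 hours
```

At that rate the initial 500 GB seed would take on the order of 30 days over
the link, which is why seeding the first full backup locally is usually worth
the trouble.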