On Wed, Mar 07, 2007 at 09:22:08PM -0800, Sriram Ramkrishna wrote:
Hi folks, I've been googling around for a while but I can't seem to find
an answer to my question.
I have a number of filesystems that contain thousands of hard links due
to some bad organization of data. Rsync, cpio and
On Wed 07 Mar 2007, Sriram Ramkrishna wrote:
that are hard links. Then after the copy is finished, I will use some
kind of find . -type l type command that finds the hard links and then
find -type l will find symbolic links, *not* hard links.
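The distinction is easy to demonstrate (an editorial sketch; the file names are made up):

```shell
# Hard links share an inode; symlinks are separate pointer files.
cd "$(mktemp -d)"
echo data > original.txt
ln original.txt hard.txt      # hard link: the inode's link count becomes 2
ln -s original.txt sym.txt    # symbolic link

find . -type l                # finds only ./sym.txt
find . -type f -links +1      # finds ./original.txt and ./hard.txt
```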
Paul Slootman
Hi, I'm trying to learn how rsync locks files when creating a backup, but I
can't find anything about it on the web.
Can someone help me and point me to a HOWTO, FAQ, or other document?
Thanks!
On Thu 08 Mar 2007, Alejandro Feijóo wrote:
Hi, I'm trying to learn how rsync locks files when creating a backup, but I
can't find anything about it on the web.
rsync does not lock files.
Paul Slootman
Mmm, and is it possible to corrupt data?
For example, if rsync is copying a file called A.txt and at the same time MySQL
is using A.txt... what happens? That is the part I don't understand.
2007/3/8, Paul Slootman [EMAIL PROTECTED]:
On Thu 08 Mar 2007, Alejandro Feijóo wrote:
Hi, I'm trying to learn how rsync locks
On Wed, Mar 07, 2007 at 09:22:08PM -0800, Sriram Ramkrishna wrote:
| Hi folks, I've been googling around for a while but I can't seem to find
| an answer to my question.
|
| I have a number of filesystems that contain thousands of hard links due
| to some bad organization of data. Rsync, cpio
On Thu, Mar 08, 2007 at 12:08:55PM +0100, Alejandro Feijóo wrote:
| Mmm, and is it possible to corrupt data?
|
| For example, if rsync is copying a file called A.txt and at the same time
| MySQL is using A.txt... what happens? That is the part I don't understand.
It can get mixed data. The file contents could appear
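What "mixed data" means can be simulated (an editorial sketch: the two dd calls stand in for a copier reading the file in chunks while a writer replaces it in between; rsync takes no lock that would prevent this):

```shell
cd "$(mktemp -d)"
printf 'AAAAAAAA' > A.txt
# The copier reads the first half of the file...
dd if=A.txt of=copy.txt bs=4 count=1 2>/dev/null
# ...the writer rewrites the whole file...
printf 'BBBBBBBB' > A.txt
# ...and the copier reads the second half of the (new) file.
dd if=A.txt of=copy.txt bs=4 count=1 seek=1 skip=1 conv=notrunc 2>/dev/null
cat copy.txt   # AAAABBBB: neither the old contents nor the new
```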
That's great! Thanks for the info :)
--
To unsubscribe or change options: https://lists.samba.org/mailman/listinfo/rsync
Before posting, read: http://www.catb.org/~esr/faqs/smart-questions.html
ACLs are destroyed on the destination even when there are no ACL differences
between source and destination.
The bug is somehow related to the function send_file_name() being called with a
negative file descriptor f. There is no such bug in version 2.6.9, but there
the options -X -A --delete can't be used (we get Internal error: wrong write
used in receiver).
Hi Dave,
Dave Markham wrote:
What are the rsync options you are using?
Clientside:
rsync -vratz --password-file=somefile --force --delete-excluded
--delete --exclude-from=somefile --files-from=somefile
Serverside:
rsync --no-detach --daemon --config /etc/rsyncd.conf
with rsyncd.conf:
Hi Matt,
On 2 Mar 2007 at 10:37, Matt McCutchen [EMAIL PROTECTED] said:
On 3/2/07, Sabahattin Gucukoglu [EMAIL PROTECTED] wrote:
--delete --force Won't Remove Directories With Dotnames if you remove a
directory which was aborted in a recent transfer on source (leaving a
.partial mirror
Thanks for rsync - it's great! I use it for all my backups.
I use rsync via SSH. Recently - I was having trouble with my backups:
rsync: connection unexpectedly closed (4968349 bytes received so far)
[receiver]
rsync error: error in rsync protocol data stream (code 12) at
io.c(453)
On Thu, Mar 08, 2007 at 05:02:21PM -, Sabahattin Gucukoglu wrote:
However, rsync is still never making any mention of the .partial
directories in the output during the list build or sync (even with
-vv, and I think even this would be a courtesy indicator).
Starting with version 3.0.0,
On Wed, Mar 07, 2007 at 09:22:08PM -0800, Sriram Ramkrishna wrote:
Is there a way to have it skip hard links when doing an rsync?
If you mean you want to skip any file that has more than one link, you
could do this:
find . -type f -links +1 > /path/exclude.txt
Then, you'd use the exclude.txt
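Put together, the whole job might look like this (an editorial sketch; the sed step rewrites find's `./`-prefixed paths into anchored rsync exclude patterns, and all directory paths are placeholders):

```shell
cd /source/dir
# List every regular file with more than one hard link, anchored for rsync.
find . -type f -links +1 | sed 's|^\./|/|' > /tmp/exclude.txt
# Copy everything except the multiply-linked files.
rsync -av --exclude-from=/tmp/exclude.txt . /dest/dir/
```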
On Thu, Mar 08, 2007 at 03:00:38PM +0100, Stanisław Gruszka wrote:
ACLs are destroyed on the destination even when there are no ACL differences
between source and destination.
Can you explain this a little more?
we have Internal error: wrong write used in receiver.
Yes, this is a known problem with the acls.patch
On Thu, Mar 08, 2007 at 10:32:04AM -0700, Phil Hassey wrote:
someone suggested that I add this to my ~/.ssh/config file:
ServerAliveInterval 5
ServerAliveCountMax 120
Those options only apply to protocol 2 connections (which people should
be using anyway these days). There are also settings
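For reference, those keepalive settings go in a Host stanza of ~/.ssh/config (the host name here is a placeholder):

```
Host backupserver
    ServerAliveInterval 5
    ServerAliveCountMax 120
```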
Hi guys,
I've been struggling with getting --link-dest working for a couple of
hours now. I'm using rsync 2.6.9 (protocol 29), on Gentoo Linux.
For some strange reason, whenever I recreate the source file, even
though it's identical to the old source file, --link-dest simply does not
create the
On Wed, Mar 07, 2007 at 09:22:08PM -0800, Sriram Ramkrishna wrote:
Hi there,
For some reason, I sent this mail before I was fully subscribed and I
have missed out on the replies. If I don't answer all the responses this
is why.
The following command pipeline can give you a list which you
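The original pipeline was cut off in the archive; one pipeline that produces such a list (an editorial sketch using GNU find's -printf, which groups multiply-linked files by inode) would be:

```shell
# Print inode number and path for every regular file with more than one
# hard link; sorting numerically puts files that share an inode (i.e.
# hard links to each other) on adjacent lines.
find . -type f -links +1 -printf '%i %p\n' | sort -n
```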
On Thu 08 Mar 2007, Sri Ramkrishna wrote:
I think I probably have hard links to directories. I have observed cpio
going through a loop continuously. Since I was doing this on an AIX
JFS filesystem (on an AIX fileserver) it might not have the same protections
that I believe Linux has when hitting a
On Thu, Mar 08, 2007 at 10:15:01PM +0100, Paul Slootman wrote:
On Thu 08 Mar 2007, Sri Ramkrishna wrote:
I think I probably have hard links to directories. I have observed cpio
going through a loop continuously. Since I was doing this on an AIX
JFS filesystem (on an AIX fileserver) it might
On Thu, Mar 08, 2007 at 10:14:39AM -0800, Wayne Davison wrote:
On Wed, Mar 07, 2007 at 09:22:08PM -0800, Sriram Ramkrishna wrote:
Is there a way to have it skip hard links when doing an rsync?
If you mean you want to skip any file that has more than one link, you
could do this:
find
On Thu, Mar 08, 2007 at 08:56:54PM +0200, Jaco Kroon wrote:
For some strange reason, whenever I recreate the source file, even
though it's identical to the old source file, --link-dest simply does not
create the link.
The file isn't identical enough to hard-link unless every preserved
detail is
I have a directory dumps on my laptop containing several dumps of
various levels.
local-0-2007-03-03.gz
local-4-2007-02-12.gz
local-4-2007-02-19.gz
local-4-2007-02-26.gz
local-4-2007-03-05.gz
local-5-2007-03-04.gz
local-5-2007-03-06.gz
local-5-2007-03-07.gz
local-5-2007-03-08.gz
Naturally the
On 8-03-2007 at 15:00, Stanisław Gruszka wrote:
ACLs are destroyed on the destination even when there are no ACL
differences between source and destination.
Wayne, here is example:
[EMAIL PROTECTED] /mnt/hda5 $ getfacl --omit-header export/file
user::rw-
user:apache:-w-
group::r--
mask::rw-
other::r--