Hi Michael,
Thank you for your time and reply.
Take care and have a good week,
_g
On Tue, Feb 21, 2017 at 10:20 AM, Michael Link wrote:
Hi Greg,
It isn't possible -- hard links are read and created as normal files by
the Globus tools.
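To illustrate what that means in practice, here is a minimal Python sketch (a plain link-unaware copy stands in for the transfer; it is not the Globus tools themselves, and the file names are made up): a hardlinked pair on the source side arrives as two independent files, so the data crosses twice and the link relationship is lost.

```python
import os
import shutil
import tempfile

# Source tree: file1 plus a hardlink file2 (one file on disk, two names).
src = tempfile.mkdtemp()
dst = tempfile.mkdtemp()
f1 = os.path.join(src, "file1")
f2 = os.path.join(src, "file2")
with open(f1, "w") as f:
    f.write("hello\n")
os.link(f1, f2)

# A link-unaware copy treats each name as an independent regular file.
for name in ("file1", "file2"):
    shutil.copy(os.path.join(src, name), os.path.join(dst, name))

d1 = os.stat(os.path.join(dst, "file1"))
d2 = os.stat(os.path.join(dst, "file2"))
print(d1.st_ino == d2.st_ino)    # False: two separate inodes now
print(d1.st_nlink, d2.st_nlink)  # 1 1: no hardlinks on the far side
```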
Mike
On 2/18/2017 7:59 PM, greg whynott wrote:
> Hello,
>
> Is there a method to exclude hardlinks from being transferred which
> exist within the data set to be transferred, yet are recreated on
Walter, perhaps I wasn't clear, or I mistook the focus of the list here;
if so, sorry.
Developing something to handle this isn't a challenge. I was curious
whether there was a method within the tool set that I was unaware of,
thinking there would be some folks here much more familiar with the tool
kit than I am.
I'm not sure how that relates to my question, Walter, but thanks. `ls -i`
gets you there too. :)
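For anyone finding this in the archives, a minimal sketch of the kind of filter discussed above (plain Python, not a Globus feature; the function name `unique_files` is made up): it walks a tree and yields each underlying file once, keyed by (device, inode), so extra hardlinks are skipped and the result can be handed to a transfer tool as its file list.

```python
import os

def unique_files(root):
    """Yield one path per underlying file under root.

    Paths are keyed by (device, inode); only the first name seen for
    each inode is yielded, so additional hardlinks to the same file
    are skipped.
    """
    seen = set()
    for dirpath, _dirs, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            st = os.lstat(path)          # lstat: don't follow symlinks
            key = (st.st_dev, st.st_ino)
            if key in seen:
                continue                 # another name for a file we have
            seen.add(key)
            yield path
```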
-g
On Mon, Feb 20, 2017 at 2:56 AM, Walter de Jong
wrote:
Hi,
You can use `stat` to see that the inode numbers are identical. Hardlinked
files also have a link count higher than 1.
$ echo hello >file1
$ ln file1 file2
$ stat file1
File: `file1'
Size: 6 Blocks: 8 IO Block: 4096 regular file
Device: 802h/2050d Inode:
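The same check can be scripted; here is a minimal Python equivalent of the shell session above (it recreates file1 and the hardlink file2 in a temp directory, then compares inode numbers and the link count):

```python
import os
import tempfile

# Recreate the shell session: file1, then hardlink file2.
d = tempfile.mkdtemp()
file1 = os.path.join(d, "file1")
file2 = os.path.join(d, "file2")
with open(file1, "w") as f:
    f.write("hello\n")
os.link(file1, file2)

st1, st2 = os.stat(file1), os.stat(file2)
print(st1.st_ino == st2.st_ino)  # True: same inode, same file
print(st1.st_nlink)              # 2: link count higher than 1
```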
Hello,
Is there a method to exclude hardlinks which exist within the data set
from being transferred, yet have them recreated on the other end?
(using url-copy or any other tool)
An example: in the scenario below, only two files would actually cross the
wire, file1.gz and subdir/fi