Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-26 Thread Christopher Chan
On Wednesday, January 27, 2010 11:35 AM, Kevin Krieser wrote:
>
> On Jan 26, 2010, at 6:06 PM, Les Mikesell wrote:
>
>> On 1/25/2010 8:49 AM, Chan Chung Hang Christopher wrote:
>>> Anas Alnaffar wrote:
 I tried to run this command

 find -name "*.access*" -mtime +2 -exec rm {} \;

>>>
>>> Should have been: find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;
>>
>> No difference.  If the path is omitted, current versions of find assume
>> the current directory, and double quotes are fine for avoiding shell
>> expansion of wildcards.  (But, I'm guessing the quotes were omitted on
>> the command that generated the error).
>
> In my defense, I didn't realize that there were versions of find that didn't 
> require a starting location.  And I've tended to remain with more standard 
> versions of commands like this, since I've had to use too many stripped down 
> systems through the years, plus I still use several different versions of 
> Unix-like systems.  CentOS 5 does work without the path, but I wonder now 
> when that was added to Linux?  OS X doesn't support that variant.  I don't 
> know yet about Solaris.

GNU find and anything GNU have always been a bit different from 
UNIX/POSIX versions. GNU is NOT UNIX, after all.

However, there are cases where you would want to use GNU find over the 
local UNIX version of find, like on Solaris 8. Way, way faster. Of 
course, the Larry Lackeys, er, Sun engineers, would point out that GNU 
find is not doing things 'correctly.'

Now that I have gone way off topic and started bashing other operating 
systems, I shall make this my last post on this thread.
___
CentOS mailing list
CentOS@centos.org
http://lists.centos.org/mailman/listinfo/centos


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-26 Thread Kevin Krieser

On Jan 26, 2010, at 6:06 PM, Les Mikesell wrote:

> On 1/25/2010 8:49 AM, Chan Chung Hang Christopher wrote:
>> Anas Alnaffar wrote:
>>> I tried to run this command
>>> 
>>> find -name "*.access*" -mtime +2 -exec rm {} \;
>>> 
>> 
>> Should have been: find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;
> 
> No difference.  If the path is omitted, current versions of find assume 
> the current directory, and double quotes are fine for avoiding shell 
> expansion of wildcards.  (But, I'm guessing the quotes were omitted on 
> the command that generated the error).

In my defense, I didn't realize that there were versions of find that didn't 
require a starting location.  And I've tended to remain with more standard 
versions of commands like this, since I've had to use too many stripped down 
systems through the years, plus I still use several different versions of
Unix-like systems.  CentOS 5 does work without the path, but I wonder now when that
was added to Linux?  OS X doesn't support that variant.  I don't know yet about 
Solaris.


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-26 Thread Christopher Chan
Les Mikesell wrote:
> On 1/25/2010 8:49 AM, Chan Chung Hang Christopher wrote:
>> Anas Alnaffar wrote:
>>> I tried to run this command
>>>
>>> find -name "*.access*" -mtime +2 -exec rm {} \;
>>>
>> Should have been: find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;
> 
> No difference.  If the path is omitted, current versions of find assume 
> the current directory, and double quotes are fine for avoiding shell 
> expansion of wildcards.  (But, I'm guessing the quotes were omitted on 
> the command that generated the error).
> 

Well, like you said, I cannot imagine the above command line generating 
an "Argument list too long" error. That only makes sense if find itself 
was fed too many arguments.


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-26 Thread Les Mikesell
On 1/25/2010 8:49 AM, Chan Chung Hang Christopher wrote:
> Anas Alnaffar wrote:
>> I tried to run this command
>>
>> find -name "*.access*" -mtime +2 -exec rm {} \;
>>
>
> Should have been: find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;

No difference.  If the path is omitted, current versions of find assume 
the current directory, and double quotes are fine for avoiding shell 
expansion of wildcards.  (But, I'm guessing the quotes were omitted on 
the command that generated the error).

-- 
   Les Mikesell
lesmikes...@gmail.com


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-26 Thread Kwan Lowe
On Tue, Jan 26, 2010 at 1:15 PM, Les Mikesell  wrote:
> On 1/26/2010 11:42 AM, James B. Byrne wrote:
>>
>> On Mon, January 25, 2010 13:40, Les Mikesell wrote:
>> .
>>>
>>> I'd say it is more likely that the command that resulted in an error
>>> wasn't exactly what was posted or there is a filesystem problem.
>>>
>>
>> I do not consider a file system issue, as in error or corruption,
>> highly probable in this case.  It might be, however, that something
>> returned by the find caused rm itself to choke.
>
> Causing one instance of the per-file rm invocations to choke shouldn't
> bother the rest.  And while file system corruption isn't likely, it is
> still a possible cause of generally-strange behavior.  The most probable
> thing still seems like there was an unquoted * on the line that was
> actually typed when the error was reported.

To illustrate what you and others were saying I did the following:

  [k...@linbox find_test]$ cat add_one.sh
  #!/bin/sh

  COUNTER=`cat counter`
  COUNTER=`expr ${COUNTER} + 1`
  echo ${COUNTER}
  echo "${COUNTER}" > counter

  [k...@linbox find_test]$ mkdir foo; cd foo; for i in $(seq 1 1 20); do touch a${i}; done

  [k...@linbox find_test]$ ls
  add_one.sh  counter  foo

  [k...@linbox find_test]$ ls foo
  a1  a10  a11  a12  a13  a14  a15  a16  a17  a18  a19  a2  a20  a3  a4  a5  a6  a7  a8  a9


  [k...@linbox find_test]$ echo "0">counter
  [k...@linbox find_test]$ find foo -name "a*" |xargs ./add_one.sh
  1
  [k...@linbox find_test]$ echo "0">counter
  [k...@linbox find_test]$ find foo -name "a*" -exec ./add_one.sh {} \;
  1
  2
[snip]
  18
  19
  20

Finally:
  [k...@linbox find_test]$ find foo -name a* -exec ./add_one.sh {} \;
  [k...@linbox find_test]$

(This last one produces no output because the unquoted a* is expanded by
the shell before it reaches find; here it expands to "add_one.sh", the only
match in the current directory, so find looks for that name under foo and
matches nothing.)


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-26 Thread Kurt Newman
What was your original find command?

Robert Nichols wrote:
> Les Mikesell wrote:
>> On 1/26/2010 11:42 AM, James B. Byrne wrote:
>>> On Mon, January 25, 2010 13:40, Les Mikesell wrote:
>>> .
 I'd say it is more likely that the command that resulted in an error
 wasn't exactly what was posted or there is a filesystem problem.

>>> I do not consider a file system issue, as in error or corruption,
>>> highly probable in this case.  It might be, however, that something
>>> returned by the find caused rm itself to choke.
>> Causing one instance of the per-file rm invocations to choke shouldn't 
>> bother the rest.  And while file system corruption isn't likely, it is 
>> still a possible cause of generally-strange behavior.  The most probable 
>> thing still seems like there was an unquoted * on the line that was 
>> actually typed when the error was reported.
> 
> Indeed, upon closer examination, that message:
> 
> -bash: /usr/bin/find: Argument list too long
> 
> came from the login shell, not from 'find', and indicates that the
> shell got a failure return with errno==E2BIG when it tried to exec()
> /usr/bin/find.  The 'find' command was never executed.
> 


-- 
Global DataGuard, Inc.
Software Engineer
Phone: (214) 980-1444 ext. 242
Cel: (214) 682-1978
Email: knew...@globaldataguard.com


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-26 Thread Robert Nichols
Les Mikesell wrote:
> On 1/26/2010 11:42 AM, James B. Byrne wrote:
>> On Mon, January 25, 2010 13:40, Les Mikesell wrote:
>> .
>>> I'd say it is more likely that the command that resulted in an error
>>> wasn't exactly what was posted or there is a filesystem problem.
>>>
>> I do not consider a file system issue, as in error or corruption,
>> highly probable in this case.  It might be, however, that something
>> returned by the find caused rm itself to choke.
> 
> Causing one instance of the per-file rm invocations to choke shouldn't 
> bother the rest.  And while file system corruption isn't likely, it is 
> still a possible cause of generally-strange behavior.  The most probable 
> thing still seems like there was an unquoted * on the line that was 
> actually typed when the error was reported.

Indeed, upon closer examination, that message:

-bash: /usr/bin/find: Argument list too long

came from the login shell, not from 'find', and indicates that the
shell got a failure return with errno==E2BIG when it tried to exec()
/usr/bin/find.  The 'find' command was never executed.
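
A quick way to reproduce that shell-side failure without creating thousands
of files (a sketch; the exact limits are kernel-dependent, and Linux also
caps the size of a single argument, so one oversized argument is enough):

```shell
# Build one ~3 MB argument, then try to exec an external command with it.
# The exec() fails with E2BIG, so the *shell* prints
# "Argument list too long" and the command never runs.
big=$(head -c 3000000 /dev/zero | tr '\0' 'x')
/bin/echo "$big" > /dev/null
echo "exit status: $?"   # non-zero: the shell could not exec /bin/echo
```

The explicit /bin/echo path matters here: a shell builtin would bypass
exec() entirely and succeed.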

-- 
Bob Nichols "NOSPAM" is really part of my email address.
 Do NOT delete it.



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-26 Thread Les Mikesell
On 1/26/2010 11:42 AM, James B. Byrne wrote:
>
> On Mon, January 25, 2010 13:40, Les Mikesell wrote:
> .
>>
>> I'd say it is more likely that the command that resulted in an error
>> wasn't exactly what was posted or there is a filesystem problem.
>>
>
> I do not consider a file system issue, as in error or corruption,
> highly probable in this case.  It might be, however, that something
> returned by the find caused rm itself to choke.

Causing one instance of the per-file rm invocations to choke shouldn't 
bother the rest.  And while file system corruption isn't likely, it is 
still a possible cause of generally-strange behavior.  The most probable 
thing still seems like there was an unquoted * on the line that was 
actually typed when the error was reported.

-- 
   Les Mikesell
lesmikes...@gmail.com



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-26 Thread James B. Byrne

On Mon, January 25, 2010 13:40, Les Mikesell wrote:
.
>
> I'd say it is more likely that the command that resulted in an error
> wasn't exactly what was posted or there is a filesystem problem.
>

I do not consider a file system issue, as in error or corruption,
highly probable in this case.  It might be, however, that something
returned by the find caused rm itself to choke.


-- 
***  E-Mail is NOT a SECURE channel  ***
James B. Byrnemailto:byrn...@harte-lyne.ca
Harte & Lyne Limited  http://www.harte-lyne.ca
9 Brockley Drive  vox: +1 905 561 1241
Hamilton, Ontario fax: +1 905 561 0757
Canada  L8E 3C3



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-25 Thread Kevin Krieser


-Original Message-
From: centos-boun...@centos.org [mailto:centos-boun...@centos.org] On Behalf
Of James B. Byrne
Sent: Monday, January 25, 2010 10:06 AM
To: Robert Nichols
Cc: centos@centos.org
Subject: Re: [CentOS] The directory that I am trying to clean up is huge

On Mon, January 25, 2010 10:31, Robert Nichols wrote:
\
>
> Now if the "{}" string appears more than once then the command line
> contains that path more than once, but it is essentially impossible
> to exceed the kernel's MAX_ARG_PAGES this way.
>
> The only issue with using "-exec command {} ;" for a huge number of
> files is one of performance.  If there are 100,000 matched files,
> the command will be invoked 100,000 times.
>
> --
> Bob Nichols rnichol...@comcast.net
>

Since the OP reported that the command he used:

  find -name "*.access*" -mtime +2 -exec rm {} \;

in fact failed, one may infer that more than performance is at issue.

The OP's problem lies not with the -exec construction but with the
unstated, but nonetheless present, './' of his find invocation.
Therefore he begins a recursive descent into that directory tree.
Since the depth of that tree is not given us, nor its contents, we
may only infer that there must be some number of files therein which
are causing the MAX_ARG_PAGES limit to be exceeded before the recursion
returns.

I deduce that he could provide the -prune option or the -maxdepth 0
option to avoid this recursion instead. I have not tried either, but
I understand that one, or both, should work.




I still suspect that the OP had an unquoted wildcard someplace on his
original command.  Either a find * -name ..., or find . -name *.access*...

I see people forget to quote the argument to -name all the time, which
normally works as long as the wildcard doesn't match more than one file in
the current directory.  But if it matches more than one file, find will
return an error, since the second filename is not a valid find option.

If there are too many matches in the current directory, the unquoted example
would fail even before the find command could execute.
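
The first failure mode is easy to demonstrate (a sketch in a throwaway
directory; the filenames are made up):

```shell
# With two matching files present, the shell expands the unquoted
# pattern before find runs, so find sees a stray extra path where an
# expression was expected and errors out (GNU find reports
# "paths must precede expression").
dir=$(mktemp -d); cd "$dir"
touch a.access b.access
find . -name *.access* -mtime +2   # becomes: find . -name a.access b.access -mtime +2
echo "exit status: $?"             # non-zero
```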



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-25 Thread Les Mikesell
James B. Byrne wrote:
> On Mon, January 25, 2010 10:31, Robert Nichols wrote:
> \
>> Now if the "{}" string appears more than once then the command line
>> contains that path more than once, but it is essentially impossible
>> to exceed the kernel's MAX_ARG_PAGES this way.
>>
>> The only issue with using "-exec command {} ;" for a huge number of
>> files is one of performance.  If there are 100,000 matched files,
>> the command will be invoked 100,000 times.
>>
>> --
>> Bob Nichols rnichol...@comcast.net
>>
> 
> Since the OP reported that the command he used:
> 
>   find -name "*.access*" -mtime +2 -exec rm {} \;
> 
> in fact failed, one may infer that more than performance is at issue.
> 
> The OP's problem lies not with the -exec construction but with the
> unstated, but nonetheless present, './' of his find invocation.
> Therefore he begins a recursive descent into that directory tree.
> Since the depth of that tree is not given us, nor its contents, we
> may only infer that there must be some number of files therein which
> are causing the MAX_ARG_PAGES limit to be exceeded before the recursion
> returns.

Find just emits the filenames as encountered, so _no_ number of files
should be able to cause an error.  An infinitely deep directory tree
might, or recursively linked directories, but only after a considerable
amount of time and churning to exhaust the machine's real and virtual
memory.

> I deduce that he could provide the -prune option or the -maxdepth 0
> option to avoid this recursion instead. I have not tried either but
> I understand that one, or both, should work.

I'd say it is more likely that the command that resulted in an error wasn't 
exactly what was posted or there is a filesystem problem.

-- 
   Les Mikesell
lesmikes...@gmail.com




Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-25 Thread James B. Byrne
On Mon, January 25, 2010 10:31, Robert Nichols wrote:
\
>
> Now if the "{}" string appears more than once then the command line
> contains that path more than once, but it is essentially impossible
> to exceed the kernel's MAX_ARG_PAGES this way.
>
> The only issue with using "-exec command {} ;" for a huge number of
> files is one of performance.  If there are 100,000 matched files,
> the command will be invoked 100,000 times.
>
> --
> Bob Nichols rnichol...@comcast.net
>

Since the OP reported that the command he used:

  find -name "*.access*" -mtime +2 -exec rm {} \;

in fact failed, one may infer that more than performance is at issue.

The OP's problem lies not with the -exec construction but with the
unstated, but nonetheless present, './' of his find invocation.
Therefore he begins a recursive descent into that directory tree.
Since the depth of that tree is not given us, nor its contents, we
may only infer that there must be some number of files therein which
are causing the MAX_ARG_PAGES limit to be exceeded before the recursion
returns.

I deduce that he could provide the -prune option or the -maxdepth 0
option to avoid this recursion instead. I have not tried either, but
I understand that one, or both, should work.



-- 
***  E-Mail is NOT a SECURE channel  ***
James B. Byrnemailto:byrn...@harte-lyne.ca
Harte & Lyne Limited  http://www.harte-lyne.ca
9 Brockley Drive  vox: +1 905 561 1241
Hamilton, Ontario fax: +1 905 561 0757
Canada  L8E 3C3




Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-25 Thread Chan Chung Hang Christopher
Anas Alnaffar wrote:
> I tried to run this command
> 
> find -name "*.access*" -mtime +2 -exec rm {} \;
> 

Should have been: find ./ -name \*.access\* -mtime +2 -exec rm -f {} \;


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-25 Thread m . roth
> fred smith wrote:
>> On Mon, Jan 25, 2010 at 03:14:54AM -0800, John Doe wrote:
>>> From: Anas Alnaffar 
 I tried to run this command
 find -name "*.access*" -mtime +2 -exec rm {} \;
 and I have same error message
>>> How many "*.access*" are there...?
>>>
>> if there are so many that you're finding the previously suggested
>> techniques difficult to use, you can try the brute-force I sometimes
>> use...
>
> It actually shouldn't matter.  As long as the wildcards are quoted on the
> command line, you shouldn't get an error from too many files.  I suspect
> the command that was typed wasn't exactly what is shown above.

First, I don't see the path there, which *must* come after the command.

Also, I don't believe that double quotes will work; the shell will interpret
them. I think you need single quotes, or, what I always use, backslashes, so
that if I were typing it, I'd have:
$ find . -name \*.access\* -mtime +2 -exec rm {} \;

  mark



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-25 Thread Les Mikesell
fred smith wrote:
> On Mon, Jan 25, 2010 at 03:14:54AM -0800, John Doe wrote:
>> From: Anas Alnaffar 
>>> I tried to run this command
>>> find -name "*.access*" -mtime +2 -exec rm {} \;
>>> and I have same error message
>> How many "*.access*" are there...?
>>
>> JD
> 
> if there are so many that you're finding the previously suggested
> techniques difficult to use, you can try the brute-force I sometimes
> use...

It actually shouldn't matter.  As long as the wildcards are quoted on the 
command line, you shouldn't get an error from too many files.  I suspect the 
command that was typed wasn't exactly what is shown above.

-- 
   Les Mikesell
lesmikes...@gmail.com




Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-25 Thread fred smith
On Mon, Jan 25, 2010 at 03:14:54AM -0800, John Doe wrote:
> From: Anas Alnaffar 
> > I tried to run this command
> > find -name "*.access*" -mtime +2 -exec rm {} \;
> > and I have same error message
> 
> How many "*.access*" are there...?
> 
> JD

if there are so many that you're finding the previously suggested
techniques difficult to use, you can try the brute-force I sometimes
use...

run:
ls > list

then edit the file (list) with a decent text editor, one in which
you can use one command to place text at the beginning of every line
such that every line then turns out to read:

rm file1
rm file2

etc, as well as removing any lines for files you do NOT want to remove.

if you have 'vi', this command will do the edits for you:
":1,$s/^/rm /"

then make the file executable:

chmod a+x list

then run it:

./list
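
The same edit can be scripted with sed instead of vi (a sketch; the file
names here are made up, and like the hand-edited list it breaks on
filenames containing spaces or shell metacharacters):

```shell
# Capture the listing, then prefix every line with "rm ".
# Note that "list" itself appears in the listing, so the generated
# script will also remove it at the end.
ls > list
sed 's/^/rm /' list > dellist
# Review dellist and delete lines for files you want to KEEP, then:
sh ./dellist
```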
-- 
 Fred Smith -- fre...@fcshome.stoneham.ma.us -
   But God demonstrates his own love for us in this: 
 While we were still sinners, 
  Christ died for us.
--- Romans 5:8 (niv) --


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-25 Thread John Doe
From: Anas Alnaffar 
> I tried to run this command
> find -name "*.access*" -mtime +2 -exec rm {} \;
> and I have same error message

How many "*.access*" are there...?

JD


  


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-25 Thread Tony Mountifield
In article ,
Kevin Krieser  wrote:
> 
> On Jan 23, 2010, at 6:45 AM, Robert P. J. Day wrote:
> 
> > On Sat, 23 Jan 2010, Marcelo M. Garcia wrote:
> >  the find ... -exec variation will invoke a new "rm" command for
> > every single file it finds, which will simply take more time to run.
> > beyond that, the effect should be the same.
> 
> 
> Unless there are files or directories with spaces in them, in which case the
> xargs variant can fail.

That's what -print0 is for, together with the -0 option to xargs:

find dir1 dir2 -name '*.foo' -print0 | xargs -0 rm
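
A small demonstration of the difference (a sketch using a throwaway
directory; the filename is made up):

```shell
# Plain newline-delimited xargs would split "has space.foo" into two
# words; the NUL-delimited pipeline passes the name through intact.
dir=$(mktemp -d)
touch "$dir/has space.foo"
find "$dir" -name '*.foo' -print0 | xargs -0 rm
find "$dir" -name '*.foo' | wc -l   # 0: the file was removed
```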

Cheers
Tony
-- 
Tony Mountifield
Work: t...@softins.co.uk - http://www.softins.co.uk
Play: t...@mountifield.org - http://tony.mountifield.org


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Robert Nichols
Robert Heller wrote:
> At Sat, 23 Jan 2010 12:43:40 + CentOS mailing list  
> wrote:
> 
>> Just curious. What is the difference between the command above and "find
>> <dir> -exec rm -f {} \;" ?
> 
> The command "find <dir> -exec rm -f {} \;" collects ALL of the names
> "find <dir>" finds as a single command line, which in your case is too
> large for the shell to deal with.

Gosh, then I guess the manpage for 'find' must be totally wrong where it
says:

-exec command ;
   ...
   The specified command is run once for each matched file.

-- 
Bob Nichols "NOSPAM" is really part of my email address.
 Do NOT delete it.



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Robert Heller
At Sat, 23 Jan 2010 12:43:40 + CentOS mailing list  
wrote:

> 
> Robert Heller wrote:
> > At Sat, 23 Jan 2010 15:23:58 +0300 CentOS mailing list  
> > wrote:
> > 
> >>
> >>
> >> The directory that I am trying to clean up is huge . every time get this
> >> error msg 
> >>
> >>  
> >>
> >> -bash: /usr/bin/find: Argument list too long
> > 
> > 'man xargs'
> > 
> > find <dir> -print | xargs rm
> > 
> Hi
> 
> Just curious. What is the difference between the command above and "find
> <dir> -exec rm -f {} \;" ?

The command "find <dir> -exec rm -f {} \;" collects ALL of the names
"find <dir>" finds as a single command line, which in your case is too
large for the shell to deal with.  The command "find <dir> -print |
xargs rm" uses a pipeline.  "find <dir> -print" *prints* the names
it finds to stdout.  xargs reads stdin, line by line, and collects those
lines as words up to some reasonable string length (within the command
line length limits) and passes this argument list to the command given
in xargs's arguments.  If necessary, xargs will call the command
repeatedly with suitable subsets of the complete list, keeping each
subset below the command line string length limit.

The '-exec ...' option to find is fine for small sets of results.  The
"find ... -print | xargs ..." form will handle arbitrarily large result
sets.  xargs can also be used anyplace you happen to have a list of
names (one per line) that you need to pass as words to a command:

tar tzf foo.tar.gz | xargs -r ls -d

Will list those files that are in the tar file that are also on disk.
The '-r' option to xargs prevents it from calling ls with no arguments
if xargs happens not to get any input.
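
The batching behaviour is easy to see with xargs's -n option (a sketch;
in real use xargs picks the batch size automatically from the system
limits):

```shell
# Ten input lines, at most three arguments per echo invocation,
# so echo runs four times.
seq 1 10 | xargs -n 3 echo
# 1 2 3
# 4 5 6
# 7 8 9
# 10
seq 1 10 | xargs -n 3 echo | wc -l   # 4
```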


> 
> Thanks
> 
> mg.
> 

-- 
Robert Heller -- 978-544-6933
Deepwoods Software-- Download the Model Railroad System
http://www.deepsoft.com/  -- Binaries for Linux and MS-Windows
hel...@deepsoft.com   -- http://www.deepsoft.com/ModelRailroadSystem/


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Kevin Krieser
> 
> find on CentOS 5.4 supports
> 
> find <dir> -exec <command> {} +
> 
> which avoids the negative effect of spawning new subprocesses when using
> "-exec {} \;"
> 
> find on CentOS 4.8 does not support that.

I'll have to give that a try sometime.  A person gets used to a subset of a 
command, and doesn't necessarily look for new options being added.


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Alexander Dalloz
On 23.01.2010 14:12, Kevin Krieser wrote:
> 
> On Jan 23, 2010, at 6:45 AM, Robert P. J. Day wrote:
> 
>> On Sat, 23 Jan 2010, Marcelo M. Garcia wrote:
>>
>>> Robert Heller wrote:
>
> -bash: /usr/bin/find: Argument list too long

 'man xargs'

 find <dir> -print | xargs rm

>>> Hi
>>>
>>> Just curious. What is the difference between the command above and "find
>>> <dir> -exec rm -f {} \;" ?
>>
>>  the find ... -exec variation will invoke a new "rm" command for
>> every single file it finds, which will simply take more time to run.
>> beyond that, the effect should be the same.
> 
> 
> Unless there are files or directories with spaces in them, in which case the 
> xargs variant can fail.

find on CentOS 5.4 supports

find <dir> -exec <command> {} +

which avoids the negative effect of spawning a new subprocess per file,
as "-exec {} \;" does.

find on CentOS 4.8 does not support that.
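
A quick check of the batching form (a sketch in a throwaway directory;
the filenames are made up but the pattern mirrors the OP's):

```shell
# One rm invocation covers both matches, and a name with a space is
# passed safely because no shell re-parsing is involved.
dir=$(mktemp -d)
touch "$dir/a.access.log" "$dir/with space.access.log"
find "$dir" -name '*.access*' -exec rm -f {} +
find "$dir" -name '*.access*' | wc -l   # 0
```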

> It is likely the original poster either did
> find * ...
> or find . -name *
> and the bash shell still expanded the arguments.  He was on the right track 
> using the find command, but it wasn't used right.

Alexander



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Kevin Krieser

On Jan 23, 2010, at 7:07 AM, Anas Alnaffar wrote:

> I tried to run this command
> 
> find -name "*.access*" -mtime +2 -exec rm {} \;
> 
> 
> and I have same error message
> 
> 
> 
> Anas 
> 


There must have been more to it, since the command above is invalid.  You
need to specify where to start the find.


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Kevin Krieser

On Jan 23, 2010, at 6:45 AM, Robert P. J. Day wrote:

> On Sat, 23 Jan 2010, Marcelo M. Garcia wrote:
> 
>> Robert Heller wrote:
 
 -bash: /usr/bin/find: Argument list too long
>>> 
>>> 'man xargs'
>>> 
>>> find <dir> -print | xargs rm
>>> 
>> Hi
>> 
>> Just curious. What is the difference between the command above and "find
>> <dir> -exec rm -f {} \;" ?
> 
>  the find ... -exec variation will invoke a new "rm" command for
> every single file it finds, which will simply take more time to run.
> beyond that, the effect should be the same.


Unless there are files or directories with spaces in them, in which case the 
xargs variant can fail.

It is likely the original poster either did
find * ...
or find . -name *
and the bash shell still expanded the arguments.  He was on the right track 
using the find command, but it wasn't used right.


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Anas Alnaffar
I tried to run this command

find -name "*.access*" -mtime +2 -exec rm {} \;


and I have same error message



Anas 


-Original Message-
From: centos-boun...@centos.org [mailto:centos-boun...@centos.org] On Behalf
Of Marcelo M. Garcia
Sent: Saturday, January 23, 2010 3:34 PM
To: CentOS mailing list
Subject: Re: [CentOS] The directory that I am trying to clean up is huge

Anas Alnaffar wrote:
> The directory that I am trying to clean up is huge . every time get this 
> error msg
> 
>  
> 
> -bash: /usr/bin/find: Argument list too long
> 
>  
> 
>  
> 
> Please advise
> 
>  
> 
> *Anas *
Hi

Could you post the complete command? Please provide more details.

Regards

mg.



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Kai Schaetzl
http://www.google.com/search?as_epq=Argument+list+too+long

Kai

-- 
Get your web at Conactive Internet Services: http://www.conactive.com





Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Robert P. J. Day
On Sat, 23 Jan 2010, Marcelo M. Garcia wrote:

> Robert Heller wrote:
> >>
> >> -bash: /usr/bin/find: Argument list too long
> >
> > 'man xargs'
> >
> > find <dir> -print | xargs rm
> >
> Hi
>
> Just curious. What is the difference between the command above and "find
> <dir> -exec rm -f {} \;" ?

  the find ... -exec variation will invoke a new "rm" command for
every single file it finds, which will simply take more time to run.
beyond that, the effect should be the same.

rday
--


Robert P. J. Day   Waterloo, Ontario, CANADA

Linux Consulting, Training and Kernel Pedantry.

Web page:  http://crashcourse.ca
Twitter:   http://twitter.com/rpjday



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Marcelo M. Garcia
Robert Heller wrote:
> At Sat, 23 Jan 2010 15:23:58 +0300 CentOS mailing list  
> wrote:
> 
>>
>>
>> The directory that I am trying to clean up is huge . every time get this
>> error msg 
>>
>>  
>>
>> -bash: /usr/bin/find: Argument list too long
> 
> 'man xargs'
> 
> find <dir> -print | xargs rm
> 
Hi

Just curious. What is the difference between the command above and "find
<dir> -exec rm -f {} \;" ?

Thanks

mg.



Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Robert Heller
At Sat, 23 Jan 2010 15:23:58 +0300 CentOS mailing list  
wrote:

> 
> 
> 
> The directory that I am trying to clean up is huge . every time get this
> error msg 
> 
>  
> 
> -bash: /usr/bin/find: Argument list too long

'man xargs'

find <dir> -print | xargs rm

> 
>  
> 
>  
> 
> Please advise 
> 
>  
> 
> Anas 
> 
> 

-- 
Robert Heller -- 978-544-6933
Deepwoods Software-- Download the Model Railroad System
http://www.deepsoft.com/  -- Binaries for Linux and MS-Windows
hel...@deepsoft.com   -- http://www.deepsoft.com/ModelRailroadSystem/

  


Re: [CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Marcelo M. Garcia
Anas Alnaffar wrote:
> The directory that I am trying to clean up is huge … every time get this 
> error msg
> 
>  
> 
> -bash: /usr/bin/find: Argument list too long
> 
>  
> 
>  
> 
> Please advise
> 
>  
> 
> *Anas *
Hi

Could you post the complete command? Please provide more details.

Regards

mg.



[CentOS] The directory that I am trying to clean up is huge

2010-01-23 Thread Anas Alnaffar
The directory that I am trying to clean up is huge. Every time I get this
error msg:

 

-bash: /usr/bin/find: Argument list too long

 

 

Please advise 

 

Anas 
