Just be aware that find also descends into subdirectories, so if you have
files with similar names at lower levels that you want to keep, you'll
need another approach. If there are no subdirs, or the subdirs don't
contain similarly named files, then find works great.
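If you do need to keep find out of the subdirectories, one portable trick
is the POSIX -prune idiom (sketched below; -maxdepth is a GNU extension
that an AIX find may not have, so this avoids it). The demo directory and
file names are just for illustration:

```shell
# Delete program* in the top-level directory only, without descending.
mkdir -p demo/sub
cd demo
touch program01 program02
touch sub/program03        # similarly named file at a lower level, to keep
# "! -name . -prune" stops find from entering any subdirectory,
# while still visiting the top-level entries themselves.
find . ! -name . -prune -name 'program*' -exec rm {} \;
```

After this, sub/program03 is untouched while the top-level program* files
are gone.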
Someone mentioned xargs. xargs works great and was faster in a test I saw
(but we are only talking fractions of a second per 1000 files :).
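The usual xargs form is sketched below: find prints the names, and xargs
packs as many as fit into each rm invocation, which is why it is fast.
One caveat: a plain -print | xargs pipeline breaks on filenames containing
spaces or newlines (GNU's -print0 / xargs -0 handle that, but they may not
be on every AIX release):

```shell
# Demo directory and file names are illustrative only.
mkdir demo2
cd demo2
touch program1 program2 program3
# xargs batches arguments, so rm is run only a few times
# no matter how many files match.
find . -name 'program*' -print | xargs rm
```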
Another way is to ls filespec > delthem, then edit delthem and put rm in
front of every line (hint: in vi use :%s/^/rm /). This gives you very fine
control, but is the slowest.
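A non-interactive sketch of that recipe is below. Note one wrinkle: "ls
program*" would hit the same argument-length limit, since the shell still
expands the glob, so the sketch runs ls with no arguments and filters with
grep instead; sed stands in for the vi edit. All names here are
illustrative:

```shell
mkdir demo3
cd demo3
touch program01 program02 keepme
# ls with no arguments never expands a huge glob on the command line.
ls | grep '^program' > delthem
# Equivalent of the vi edit  :%s/^/rm /  -- prefix every line with "rm ".
sed 's/^/rm /' delthem > delthem.sh
# Review delthem.sh here for fine control, then run it:
sh delthem.sh
rm delthem delthem.sh
```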
There are also Perl scripts (and I'm sure REXX, Python, etc.) that wrap
around rm and read the directory themselves to handle large directories.
I could go on, but....
Have fun!
Kai.
-----Original Message-----
From: Hamish Marson
To: [EMAIL PROTECTED]
Sent: 7/10/02 8:15 AM
Subject: Re: AIX Question!
Al'shaebani, Bassam wrote:
>Hello All,
>I came across an issue yesterday. Does anyone know a way around
>deleting a large number of files without getting the 'parameter too
>long'
>message? I.e. I was trying to delete all the files that began with
>program*.
>There were too many files, so I had to actually cut my search down to
>something like program01* and so forth, to avoid the error message.
>
>
That's a shell issue, not necessarily an AIX issue. The problem is that
the shell is responsible for expanding the command-line parameters. Thus
when you type *, the shell expands that to ALL the files that match (i.e.
all of them).
For instances like this, there are probably more ways than I have
fingers, but generally one that will always work is using find. e.g.
find <dir> -name '*' -exec rm {} \;
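For the program* case from the question, that becomes the sketch below.
The quotes matter: they stop the shell from expanding the pattern itself,
so find receives it intact and hands each match to rm one at a time,
never hitting the argument-length limit. Names are illustrative:

```shell
mkdir demo5
cd demo5
touch program1 program2 keepme
# 'program*' is quoted so find, not the shell, matches the names.
find . -name 'program*' -exec rm {} \;
```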
>Thanks......
>
>Regards,
>Bassam
>
>
--
I don't suffer from Insanity... | Linux User #16396
I enjoy every minute of it... |
|
http://www.travellingkiwi.com/ |