Matt Darnell wrote:
Aloha,
We have a box with over 500,000 files in a directory. If I try 'rm m*' I get
an error, something like 'too many arguments'.
I think someone else in this situation had a method of switching to another
shell; bash is the default.
All the files start with mgetty. I would like to preserve the other files
in the directory.
Jim Thompson wrote:
or cut down on the amount of globbing that the shell does in any one
pass. If all the files end in a number (and the numbers are well
distributed):
#!/bin/bash
i=0
while [ $i -le 9 ]
do
    echo mgetty*$i | xargs rm -f
    i=$((i+1))
done
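The batching idea above can be exercised end-to-end in a scratch directory. This is a sketch, not the poster's exact script: the file names are fabricated stand-ins, the deprecated $[ ] arithmetic is written as $(( )), and printf is used instead of echo so the glob expansion is emitted one name per line:

```shell
#!/bin/bash
# Delete files in ten batches keyed on the trailing digit of the name,
# so no single exec has to carry every matching filename at once.
set -eu

dir=$(mktemp -d)              # scratch directory; nothing real is touched
cd "$dir"

# Fabricated stand-ins for the mgetty spool files.
for n in $(seq 1 200); do
    touch "mgetty.log.$n"
done
touch keepme.txt              # a file that must survive the cleanup

i=0
while [ "$i" -le 9 ]; do
    # Expand only the names ending in digit $i; xargs splits the list
    # into command lines that stay under the kernel's limit.
    printf '%s\n' mgetty*"$i" | xargs rm -f
    i=$((i+1))
done

remaining=$(ls | wc -l)       # only keepme.txt should be left
echo "files left: $remaining"
cd / && rm -rf "$dir"
```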
Didn't realize you already answered the
find . -maxdepth 1 -name 'mgetty*' -print | xargs rm
Also beware of filenames with spaces when using xargs. If you
know none of the files have spaces, then the above should work
fine for you. If you aren't sure, something like:
find . -name '*mgetty*' -print0 | xargs -0 rm
will do what you want.
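The space-safety point can be checked in a throwaway directory. A minimal sketch (the file names are invented for the demo); note that GNU find also offers -delete, which removes matches without spawning rm at all:

```shell
#!/bin/bash
# Show that the NUL-separated pipeline survives spaces in filenames.
set -eu

dir=$(mktemp -d)
cd "$dir"
touch mgetty.log "mgetty with spaces.log" other.txt

# -print0/-0 delimit names with NUL bytes, so whitespace is harmless.
# (-maxdepth 1 keeps the sweep out of subdirectories.)
find . -maxdepth 1 -name 'mgetty*' -print0 | xargs -0 rm -f

left=$(find . -maxdepth 1 -type f | wc -l)   # only other.txt remains
echo "left: $left"
cd / && rm -rf "$dir"
```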
Tim Newsham wrote:
You'd think that by 2005, arbitrary length command lines wouldn't
be an issue (assuming they could all fit in memory, or kmem, or some
suitable place). I wonder if anyone has made a more up-to-date
unix like system that fixes these warbles..
Jim Thompson wrote:
Hawaii Linux Institute wrote:
One at a time, I am sure everyone here knows how to do this. For
example:
for i in m*; do rm m*; done
still requires bash to glob all 500,000 files. Fails for the same
reasons.
Note my error here. Bash is OK globbing 500,000 files
You'd think that by 2005, arbitrary length command lines wouldn't
be an issue
[...]
Note that the expansion itself is no problem, rather it's almost always an
exec(2) system call which fails returning E2BIG.
Remember that all that crap has to be copied into kernel space for an exec
[...]
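The E2BIG point can be made concrete. ARG_MAX is the kernel's byte budget for the argv strings plus environment handed to execve(2); getconf reports it, and xargs demonstrates the workaround by splitting an oversized argument list across several execs (the exact numbers vary by system):

```shell
#!/bin/bash
# ARG_MAX bounds how much argv + environment one execve(2) may carry.
set -eu

argmax=$(getconf ARG_MAX)
echo "ARG_MAX: $argmax bytes"

# Feed xargs more arguments than fit in one command line: it invokes
# /bin/echo repeatedly, one line of output per exec, counted by wc -l.
invocations=$(seq 1 500000 | xargs /bin/echo | wc -l)
echo "echo was exec'd $invocations times"
```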
Eric Hattemer wrote:
Ok, I wonder if I'm the only one who immediately read this as what I
thought it should have been, then didn't get the criticism. The typo
should have read
for i in m*; do rm $i; done;
-Eric Hattemer
You're exactly right. Wayne