Getting file sizes

2007-10-22 Thread Kent Johnson
Newbie question:

How can I get the total size, in K, of all files in a directory that 
match a pattern?

For example, I have a dir with ~5000 files, I would like to know the 
total size of the ~1000 files matching *.txt.

On RHEL and bash, if it matters...
Thanks,
Kent
___
gnhlug-discuss mailing list
gnhlug-discuss@mail.gnhlug.org
http://mail.gnhlug.org/mailman/listinfo/gnhlug-discuss/


Re: Getting file sizes

2007-10-22 Thread Stephen Ryan
On Mon, 2007-10-22 at 09:11 -0400, Kent Johnson wrote:
 Newbie question:
 
 How can I get the total size, in K, of all files in a directory that 
 match a pattern?
 
 For example, I have a dir with ~5000 files, I would like to know the 
 total size of the ~1000 files matching *.txt.
 

du -c *.txt | tail -1

(That's -1, the digit one, not the letter ell: you only want the last
line of output from du.)

du prints out the sizes of each of the matching files; '-c' means you
want a total, too; piping the output through tail -1 picks out just the
last line with the total.
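A quick sketch of that pipeline, run in a throwaway directory (file names invented for illustration):

```shell
# Two small .txt files in a scratch directory; du -ck prints one line
# per file plus a final "total" line, and tail -1 keeps only that last line.
dir=$(mktemp -d)
printf 'hello\n' > "$dir/a.txt"
printf 'world\n' > "$dir/b.txt"
( cd "$dir" && du -ck -- *.txt | tail -1 )   # e.g. "8  total"
rm -rf "$dir"
```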

-- 
Stephen Ryan
Dartware, LLC



Re: Getting file sizes

2007-10-22 Thread Jim Kuzdrall
On Monday 22 October 2007 09:11, Kent Johnson wrote:
 Newbie question:

 How can I get the total size, in K, of all files in a directory that
 match a pattern?

 For example, I have a dir with ~5000 files, I would like to know the
 total size of the ~1000 files matching *.txt.

Ah!  Perhaps I actually know an answer to this one.  (Very rare) 

Go to directory of interest and try
du -shc *.txt

Jim Kuzdrall


Re: Getting file sizes

2007-10-22 Thread Michael ODonnell


More than you asked for, but here's a command that reports
total space occupied by all files with names ending in .jpg,
recursively from the current directory (but not crossing mount
points) and which is also a gratuitous example of the Process
Substitution facility mentioned in a previous thread:

   du -c -h --files0-from=<(find . -xdev -type f -name '*.jpg' -print0 \
2>/dev/null) | tail -1
 


Re: Getting file sizes

2007-10-22 Thread Ted Roche
Kent Johnson wrote:
 Newbie question:
 
 How can I get the total size, in K, of all files in a directory that 
 match a pattern?
 
 For example, I have a dir with ~5000 files, I would like to know the 
 total size of the ~1000 files matching *.txt.
 
 On RHEL and bash, if it matters...
 Thanks,
 Kent

To get the result in K, specify:

du -c --block-size=1024 *.txt

or your choice of what you think K means ;)

man du tells more.

-- 
Ted Roche
Ted Roche  Associates, LLC
http://www.tedroche.com


Re: Getting file sizes

2007-10-22 Thread Kent Johnson
Jim Kuzdrall wrote:
 On Monday 22 October 2007 09:11, Kent Johnson wrote:
 How can I get the total size, in K, of all files in a directory that
 match a pattern?

 For example, I have a dir with ~5000 files, I would like to know the
 total size of the ~1000 files matching *.txt.
 
 Ah!  Perhaps I actually know an answer to this one.  (Very rare) 
 
 Go to directory of interest and try
 du -shc *.txt

That still lists each file individually; it needs to pipe to tail as 
Stephen suggested.

Kent


Re: Getting file sizes

2007-10-22 Thread Michael ODonnell


Ooops - that --files0-from= option is apparently
new enough (my du version is 5.97) that it's probably
not widely available.  My home system has it, but my
work systems don't...  :-/
 


Re: Getting file sizes

2007-10-22 Thread Paul Lussier
Kent Johnson [EMAIL PROTECTED] writes:

 Newbie question:

 How can I get the total size, in K, of all files in a directory that 
 match a pattern?

Stephen Ryan [EMAIL PROTECTED] writes:

 du -c *.txt | tail -1

 du prints out the sizes of each of the matching files; '-c' means you
 want a total, too; piping the output through tail -1 picks out just the
 last line with the total.

Hmmm, I wouldn't have chosen 'tail -1'.  My instinct would have been
to 'grep -i total', which is both more typing, and not as accurate
(what if there was a filename containing the string 'total'?).
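The collision Paul worries about is easy to demonstrate in a scratch directory (file names here are invented):

```shell
# A file literally named total.txt makes 'grep -i total' match two lines,
# while 'tail -1' still isolates the single summary line that du -c appends.
dir=$(mktemp -d)
printf 'x\n' > "$dir/total.txt"
printf 'y\n' > "$dir/other.txt"
( cd "$dir" && du -ck -- *.txt | grep -ci total )  # 2 lines match "total"
( cd "$dir" && du -ck -- *.txt | tail -1 )         # only the real total
rm -rf "$dir"
```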

Michael ODonnell [EMAIL PROTECTED] writes:

 du -c -h --files0-from=<(find . -xdev -type f -name '*.jpg' -print0 \
 2>/dev/null) | tail -1

Hmm, again, certainly not my fist instinct :)

I almost *never* think to redirect stdin this way for some reason.
Had I come up with the answer, I probably would have written it more
like:

 find . -type f -name \*.muse -print0 | du -c --files0-from=- | tail -1

Which yields the same answer.  I think I like mod's better :)

Ted Roche [EMAIL PROTECTED] writes:

 To get the result in K, specify:

 du -c --block-size=1024 *.txt

Hmmm, I would have just used -k, or actually, let it default to that
and not specify any flag at all regarding size.

 or your choice of what you think K means ;)

Though, it's very cool that you can specify exactly what you mean here.

 man du tells more.

Indeed it does!  What a great little thread! :)
-- 
Seeya,
Paul


Re: Getting file sizes

2007-10-22 Thread Shawn K. O'Shea
On 10/22/07, Stephen Ryan [EMAIL PROTECTED] wrote:
 On Mon, 2007-10-22 at 09:11 -0400, Kent Johnson wrote:
  Newbie question:
 
  How can I get the total size, in K, of all files in a directory that
  match a pattern?
 
  For example, I have a dir with ~5000 files, I would like to know the
  total size of the ~1000 files matching *.txt.
 

 du -c *.txt | tail -1

Since I know Kent has a Mac and this might be on his laptop, I'd like
to add that this should really be:
du -ck *.txt | tail -1

Although Linux (i.e. GNU) du defaults to outputting sizes in K, OS X
does not. It counts 512-byte blocks, and the -k option to du
explicitly says "I want output in K" (GNU du honors this even though
it's already the default). For additional examples: Solaris 9 == 512
by default, FreeBSD 6 == 1024 by default, NetBSD 1.6.1 == 512 by
default, but they all honor -k.
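A small sketch of the portable form (scratch directory and file names are made up); -k pins the unit to 1K blocks on GNU and BSD-style du alike, and awk strips off the word "total" if only the number is wanted:

```shell
dir=$(mktemp -d)
printf 'some data\n' > "$dir/a.txt"
# tail -1 keeps du's summary line; awk keeps just its numeric field.
kb=$( cd "$dir" && du -ck -- *.txt | tail -1 | awk '{print $1}' )
echo "${kb}K in .txt files"
rm -rf "$dir"
```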

-Shawn


 (That's -(one), not -(ell), meaning, you only want the last line of
 output from du.)

 du prints out the sizes of each of the matching files; '-c' means you
 want a total, too; piping the output through tail -1 picks out just the
 last line with the total.

 --
 Stephen Ryan
 Dartware, LLC




Re: Getting file sizes

2007-10-22 Thread Michael ODonnell



 Hmm, again, certainly not my fist instinct :)

Paul, we embrace diversity here but that is *definitely* OT...
 


Re: Getting file sizes

2007-10-22 Thread Jim Kuzdrall
On Monday 22 October 2007 09:36, Kent Johnson wrote:
 Jim Kuzdrall wrote:
  On Monday 22 October 2007 09:11, Kent Johnson wrote:
  How can I get the total size, in K, of all files in a directory
  that match a pattern?
 
  For example, I have a dir with ~5000 files, I would like to know
  the total size of the ~1000 files matching *.txt.
 
  Ah!  Perhaps I actually know an answer to this one.  (Very
  rare)
 
  Go to directory of interest and try
  du -shc *.txt

 That still lists each file individually, it needs to pipe to tail as
 Stephen suggested.

I thought of that, but you just said you wanted the answer.  So I 
gave you the engineering approach: simplest approximation of adequate 
accuracy; minimum time spent.  (It takes less than two seconds to 
scroll the file names on the screen, and it does confirm what type of 
files are being counted.)

However, next time I need that info (which I often do), I will try 
to remember the | tail -1 trick.

Jim Kuzdrall 


Re: Getting file sizes

2007-10-22 Thread Ben Scott
On 10/22/07, Shawn K. O'Shea [EMAIL PROTECTED] wrote:
 Since I know Kent has a Mac and this might be on his laptop, I'd like
 to add that this should really be:
 du -ck *.txt | tail -1

  Since we're on the subject, it should also be noted that du means
*disk usage*.  That means du is supposed to be aware of things like
allocation overhead (a 3-byte file might use 4096 bytes on disk, or
whatever) and sparse files (files with holes in the middle, thus
using *less* space on disk than the file size).

  The GNU variant, at least, has an option to report actual file sizes
instead of disk usage.

  Which one you want depends on what you're looking for.
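With GNU du the two views can be compared side by side; a sketch using truncate(1) to make a sparse file (temp path, GNU coreutils assumed):

```shell
# A sparse file: 1 MB long, but no data blocks ever written.
f=$(mktemp)
truncate -s 1M "$f"
du -k "$f"                   # disk usage: near zero on most filesystems
du -k --apparent-size "$f"   # apparent size: 1024 (the file's length in K)
rm -f "$f"
```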

-- Ben


Re: Getting file sizes

2007-10-22 Thread Kent Johnson
Shawn K. O'Shea wrote:
 du -c *.txt | tail -1
 
 Since I know Kent has a Mac and this might be on his laptop, I'd like
 to add that this should really be:
 du -ck *.txt | tail -1

No, this is a bona fide Linux question :-) it's a Webfaction account. 
But thanks for the note!

Kent


Re: Getting file sizes

2007-10-22 Thread Ben Scott
On 10/22/07, Michael ODonnell [EMAIL PROTECTED] wrote:
 Ooops - that --files0-from= option is apparently
 new enough ... that it's probably not widely available.

find . -xdev -type f -name '*.jpg' -print0 2>/dev/null | xargs -0 du \
-ch | tail -1

(untested)

-- Ben


Re: Getting file sizes

2007-10-22 Thread Steven W. Orr
On Monday, Oct 22nd 2007 at 10:17 -, quoth Ben Scott:

=On 10/22/07, Shawn K. O'Shea [EMAIL PROTECTED] wrote:
= Since I know Kent has a Mac and this might be on his laptop, I'd like
= to add that this should really be:
= du -ck *.txt | tail -1
=
=  Since we're on the subject, it should also be noted that du means
=*disk usage*.  That means du is supposed to be aware of things like
=allocation overhead (a 3-byte file might use 4096 bytes on disk, or
=whatever) and sparse files (files with holes in the middle, thus
=using *less* space on disk than the file size).
=
=  The GNU variant, at least, has an option to report actual file sizes
=instead of disk usage.
=
=  Which one you want depends on what you're looking for.

I'd just like to kibitz one more subtlety: du reports disk usage as 
discussed above, but another way that you can get seemingly conflicting 
numbers is from sparse files, i.e., where the length of the file is large, 
but still contains little data.

int f = open ( "newfile", O_CREAT | O_WRONLY, 0644 );
lseek ( f, (off_t)1 << 30, SEEK_SET );  /* seek 1 GB past the start */
write ( f, "", 1 );                     /* one byte of real data */
close ( f );

Bang. You now have a file that ls will report as 1 gig and yet still 
occupies almost no space on the disk.
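The same trick can be pulled off from the shell without compiling anything; a sketch using dd (temp file, sizes mirroring the example above):

```shell
# dd with seek= writes one byte a gigabyte out, leaving a hole behind it.
f=$(mktemp)
dd if=/dev/zero of="$f" bs=1 count=1 seek=1073741824 2>/dev/null
wc -c < "$f"    # length: 1073741825 bytes
du -k "$f"      # allocated: a handful of K on sparse-capable filesystems
rm -f "$f"
```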

-- 
Time flies like the wind. Fruit flies like a banana. Stranger things have  .0.
happened but none stranger than this. Does your driver's license say Organ ..0
Donor?Black holes are where God divided by zero. Listen to me! We are all- 000
individuals! What if this weren't a hypothetical question?
steveo at syslang.net


File sizes

2002-08-20 Thread Kenneth E. Lussier

Hi All,

Can the 2GB file size limit be changed? I need to store about 10GB worth
of data in a single file, but it dies at 2GB.

TIA,
Kenny
-- 

Tact is just *not* saying true stuff -- Cordelia Chase

Kenneth E. Lussier
Sr. Systems Administrator
Zuken, USA
PGP KeyID CB254DD0 
http://pgp.mit.edu:11371/pks/lookup?op=getsearch=0xCB254DD0





Re: File sizes

2002-08-20 Thread Mark Komarinski

On Tue, Aug 20, 2002 at 09:10:58AM -0400, [EMAIL PROTECTED] wrote:
 On 20 Aug 2002, at 8:12am, Kenneth E. Lussier wrote:
 Sorry for the lack of description. I didn't want to get into too much
 detail, since it is a bit embarrassing: I'm doing a Windows backup to a
  samba mount. I get write failures at the 2GB point. I believe that it is
  actually a limit in the ext2 FS. I don't know if ext3 changes this.
 
   The ext2 disk format is quite capable of handling files in the terabyte
 range.
 
   You may be encountering a limit in:
   - the ext2 driver in your kernel
   - the general file I/O routines in your kernel
   - your C library
   - Samba
 
Samba and NFS(v2) don't like 2GB file sizes.
http://www.suse.de/~aj/linux_lfs.html

-Mark



Re: File sizes

2002-08-20 Thread Derek D. Martin


At some point hitherto, Mark Komarinski hath spake thusly:
 Samba and NFS(v2) don't like 2GB file sizes.
 http://www.suse.de/~aj/linux_lfs.html

That page is a bit outdated.  It talks about RH 6.2 as being current,
and doesn't mention ext3 at all.  I happened to be looking at the
changelog for Samba the other day for something unrelated, and noticed
that recent versions DO have support for large files as of 2.2.1:

  New option to allow new Windows 2000 large file (64k) streaming
  read/write options. Needs a 64 bit underlying operating system (for
  Linux use kernel 2.4 with glibc 2.2 or above). Can improve performance
  by 10% with Windows 2000 clients. Defaults to off. Not as tested as
  some other Samba code paths.

  http://us2.samba.org/samba/whatsnew/samba-2.2.5.html

Haven't used this, so I don't know how well it works.  However,
apparently if you're not transferring from Win2k, you're still
limited to Windows' 4GB SMB limit.

Your best bet will probably be to remove the disk and mount it in the
system you're going to back it up to, and do the copy locally.

-- 
Derek Martin   [EMAIL PROTECTED]
I prefer mail encrypted with PGP/GPG!
GnuPG Key ID: 0x81CFE75D
Retrieve my public key at http://pgp.mit.edu
Learn more about it at http://www.gnupg.org



Re: File sizes

2002-08-20 Thread pll


In a message dated: 20 Aug 2002 07:34:27 EDT
Kenneth E. Lussier said:

Hi All,

Can the 2GB file size limit be changed? I need to store about 10GB worth
of data in a single file, but it dies at 2GB.

I don't know if ext2 supports big files.  I think you need to turn 
something on in the kernel somewhere too.

I was doing this with XFS on my amanda server at MCL and storing 
files between 3-6GB at the time.  XFS is specifically designed to 
deal with large files (SGI, movie-making, yadda, yadda, yadda)
as opposed to ReiserFS which was specifically designed to deal with 
lots and lots of small files.

I'd try out XFS, recompile your kernel, and go from there.  It can 
definitely be done.
-- 

Seeya,
Paul
--
It may look like I'm just sitting here doing nothing,
   but I'm really actively waiting for all my problems to go away.

 If you're not having fun, you're not doing it right!

