Hello Sluggers: 

Here's a frustrating one, Sluggers...
A system monitoring program periodically needs to generate a
list of filenames, with each file's timestamp prepended:
--------sample output----------------
 15:15,151500.data
 15:46,154501.data
 16:45,154501.data
 16:55,061500.data
 ...
-------------------------------------
The files live in <directory> and the names have to match the pattern <pattern>.


The following works well in most situations:
  ls -l <dir>/<pattern> | awk '{print $8 "," $9}'                
But with more than 4000 files in the directory, the expanded glob blows past the kernel's argument-size limit:
  bash: /bin/ls: Argument list too long   
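One possible workaround (a sketch only, using hypothetical file names in a scratch directory): let ls read the directory itself instead of passing thousands of glob-expanded arguments, then filter the listing with grep. The field numbers assume the usual nine-column ls -l layout with LC_ALL=C.

```shell
# Scratch directory with hypothetical names, standing in for <dir>/<pattern>.
dir=$(mktemp -d)
touch "$dir/151500.data" "$dir/154501.data" "$dir/notes.txt"

# ls reads the directory directly -- no glob, so no argument list to overflow.
# grep then plays the role of the <pattern>; awk picks the time and name fields.
out=$(LC_ALL=C ls -l "$dir" | grep '\.data$' | awk '{print $8 "," $9}')
echo "$out"

rm -rf "$dir"
```

The `grep '\.data$'` also discards the `total N` header line that ls -l prints.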


Using find in the following manner seemed to work beautifully:
  find  <dir> -name <pattern> -printf "%AH:%AM,%f\n" 
However, the %A directives in that find format print file *access*
times rather than the *modification* times that ls -l displays.
Those timestamps become useless as soon as the files are read by
some other program (say, a backup system).
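Assuming GNU find, swapping the %A directives for %T may do the trick: %T reads the *modification* time, which another program merely reading the files (a backup run, say) will not disturb. A sketch in a scratch directory with hypothetical names:

```shell
dir=$(mktemp -d)
touch "$dir/151500.data" "$dir/061500.data"

# %TH:%TM = modification hour:minute, %f = basename -- one pass, no glob.
out=$(find "$dir" -name '*.data' -printf '%TH:%TM,%f\n')
echo "$out"

rm -rf "$dir"
```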


Is there a way of generating the required list in just one pass
with command-line tools (or Perl)?

Regards,
Sonam
-- 
Electronic Commerce
Corporate Express Australia Ltd.
Phone: +61-2-9335-0725 Fax: +61-2-9335-0753

-- 
SLUG - Sydney Linux User Group Mailing List - http://slug.org.au/
More Info: http://slug.org.au/lists/listinfo/slug