Title: Re: [Ltsp-discuss] can't ls in directory >90,000 files

I have a system running on RH 7.1 that is a monitoring station for 12 network video cameras.  Each of the cameras uploads a jpg image every 2-3 seconds to its own directory on the server.  On the server itself, I have a Kylix CGI program running that checks for the latest file in each of the camera directories and displays the thumbnails in a web page grid every 2-3 seconds.
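
The per-directory "latest file" check is simple in principle -- the real program is a Kylix CGI, but a shell equivalent looks like this (the directory and file names are made up for the demo):

```shell
# Hypothetical stand-in for the Kylix CGI's newest-frame lookup.
dir=$(mktemp -d)
touch -d '10 seconds ago' "$dir/frame1.jpg"   # an older frame
touch "$dir/frame2.jpg"                       # the newest frame

# ls -t sorts by mtime, newest first; head -n 1 keeps the latest entry.
latest=$(ls -t "$dir" | head -n 1)
echo "$latest"   # frame2.jpg
rm -rf "$dir"
```

(Note that `ls -t` still has to sort the whole directory, so this only stays fast because the archiving job below keeps each directory small.)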

During testing, we found that we could not allow the files to accumulate for more than 3-4 hours.  At that point, you could not 'ls' or do anything in the directories without hanging the box.  Our solution was to run a cron job every hour to zip up any files that were more than 3 hours old and move the zip files into an archive directory.  Once a day, the system purges any zip files older than 6 days.
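
A sketch of that hourly job, rewritten to run against a scratch directory so it is safe to execute as-is -- paths and retention windows are illustrative, and tar+gzip stands in here for the zip tool mentioned above:

```shell
#!/bin/sh
# Scratch tree standing in for the per-camera upload directories.
CAMDIR=$(mktemp -d)
ARCHIVE="$CAMDIR/archive"
mkdir -p "$CAMDIR/cam1" "$ARCHIVE"

# Simulate one fresh frame and one frame older than the 3-hour window.
touch "$CAMDIR/cam1/fresh.jpg"
touch -d '4 hours ago' "$CAMDIR/cam1/stale.jpg"

stamp=$(date +%Y%m%d%H%M)
# Hourly step: collect jpgs older than 180 minutes into one archive,
# then remove the originals so the directories stay small.
find "$CAMDIR" -name '*.jpg' -mmin +180 > "$CAMDIR/stale.list"
if [ -s "$CAMDIR/stale.list" ]; then
    tar czf "$ARCHIVE/frames-$stamp.tar.gz" -T "$CAMDIR/stale.list"
    xargs rm -f < "$CAMDIR/stale.list"
fi

# Daily step: purge archives older than 6 days.
find "$ARCHIVE" -name '*.tar.gz' -mtime +6 -exec rm -f {} +

ls "$CAMDIR/cam1"   # only fresh.jpg should remain
```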

Works pretty well ... the only remaining problem I have is time sync with the cameras ... I wish I could find a network time server that would run without depending on the reference clocks ...
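
(One workaround I've seen: ntpd can serve its own undisciplined local clock via the LOCAL(0) reference driver, so the cameras at least stay in sync with the server even with no outside reference reachable.  A typical /etc/ntp.conf fragment -- the stratum value is just a convention marking it as a last-resort source:)

```
# /etc/ntp.conf fragment -- serve the box's own clock to the LAN
server 127.127.1.0              # LOCAL(0) undisciplined local clock driver
fudge  127.127.1.0 stratum 10   # advertise as a low-priority source
```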

> Date: Wed, 6 Mar 2002 16:23:04 -0500 (EST)
> From: <[EMAIL PROTECTED]>
> To: "Rose, David " <[EMAIL PROTECTED]>
> cc: "'[EMAIL PROTECTED]'"
> <[EMAIL PROTECTED]>
> Subject: Re: [Ltsp-discuss] can't ls in directory >90,000 files
>
> David,
>
> 90,000 files is huge!!
>
> Still, ls should handle it.  Keep in mind that ls is going to
> sort the entries.  It could be that ls has a really lousy sort
> algorithm.
>
> I'd try running "top" while you do the ls on that directory, and
> see if ls is chewing up massive amounts of cpu time.
> I think if you let it go long enough, you will get some results
> out of it.
>
> Then, I'd consider a different way of storing all those files.
> It really puts the system through a workout.  Adding a new
> file is especially painful, because the system needs to
> look at every single entry, to make sure the file doesn't
> already exist.
>
> Jim McQuillan
> [EMAIL PROTECTED]
>
> On Wed, 6 Mar 2002, Rose, David  wrote:
>
> > Hello all.
> >   On a more general note, I have a directory on my linux server which
> > has over 90,000 files.  When I do a dir | wc -l I receive a number of
> > >46,000 (which I take to be >90,000 files, since dir gives me 2
> > columns of file names).
> >
> >   However, I can't ls.  I have waited for up to 15 minutes.  Can ls
> > not handle the vast quantity of files in the directory?  Is there a
> > way around this 'ceiling'?
> >
> > Thank you.
> > Dave
> >
> >
>
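
A footnote on the original question: GNU ls sorts the entire directory before printing anything, and with the long-listing options it also stats every entry, which is what makes huge directories feel hung.  `ls -f` (or `ls -U`) skips the sort and streams entries as they are read.  A quick demo, safe to run (file names and counts are just for illustration):

```shell
# Build a scratch directory with many entries, then list it unsorted.
dir=$(mktemp -d)
for i in $(seq 1 1000); do : > "$dir/file$i"; done

# -f disables sorting (and implies -a), so entries stream out immediately
# even when the directory holds tens of thousands of files.
count=$(ls -f "$dir" | grep -vc '^\.\.\?$')   # exclude . and ..
echo "$count"   # 1000
rm -rf "$dir"
```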
