On 09/12/20, Rory Campbell-Lange ([email protected]) wrote:
> On 08/12/20, Gary Conley ([email protected]) wrote:
> > I suspect I have some sort of issue with large directories. As a workaround 
> > I've been breaking the directories down into 4000 images at a time and the 
> > performance is acceptable. So, while image processing may not be a great 
> > idea, it is working well for me provided I don't have huge directories. I 
> > had one as large as 10,000 that also ran fine, but 30,000+ was a total bust 
> > with performance rapidly going from 2 images per second to 7 seconds per 
> > image. With 4000 images in a directory I get consistent performance of 1-2 
> > images per second.
> 
> Off topic, but I suggest not having more than 1,000 files per directory
> if you can manage it, as running "ls" against a directory with more
> images than that on cloud storage or indifferent storage backends will
> cause a noticeable lag.

Torek's answer on Stack Overflow suggests that git restricts the number
of files in a directory to 6700 by default:

https://stackoverflow.com/a/18732276
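For what it's worth, a minimal sketch of the sort of sharding suggested
above, in Python. The bucket count and the use of an MD5 hash of the
filename are arbitrary choices of mine, not anything Gary described; 32
buckets would keep a 30,000-image collection at roughly 1,000 files per
directory.

```python
import hashlib
import shutil
from pathlib import Path

def shard_images(src_dir: str, dest_dir: str, buckets: int = 32) -> None:
    """Move files from one flat directory into numbered subdirectories
    so that no single directory grows too large."""
    dest = Path(dest_dir)
    for f in Path(src_dir).iterdir():
        if not f.is_file():
            continue
        # Bucket by a stable hash of the filename so the same file
        # always lands in the same subdirectory.
        digest = hashlib.md5(f.name.encode()).hexdigest()
        bucket = int(digest, 16) % buckets
        target = dest / f"{bucket:02d}"
        target.mkdir(parents=True, exist_ok=True)
        shutil.move(str(f), str(target / f.name))
```

Looking files up later is cheap too: recompute the hash of the name and
you know which subdirectory to read, without listing the whole tree.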

-- 
You received this message because you are subscribed to the Google Groups 
"modwsgi" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to [email protected].
To view this discussion on the web visit 
https://groups.google.com/d/msgid/modwsgi/20201209074514.GA23613%40campbell-lange.net.