Completing the sentence that was cut off in my previous reply: this causes Celery to execute just one worker process, as opposed to one process for each CPU core.

http://docs.celeryproject.org/en/latest/userguide/workers.html#concurrency

On Thursday, August 10, 2017 at 9:26:36 PM UTC-4, Roberto Rosario wrote:
> Hi,
>
> You can control the number of OCR processes using the "--concurrency=1"
> setting of Celery on the worker managing the OCR queue. This causes Celery
> to execute
>
> For example (from the Docker image):
>
> python /usr/local/lib/python2.7/dist-packages/mayan/bin/mayan-edms.py
> celery --settings=mayan.settings.production worker -Ofair -l ERROR -Q
> mailing,ocr,tools,statistics -n mayan-worker-slow.%%h --concurrency=1
>
> If you are still running out of memory, you can add the --concurrency=1
> setting to the other workers to control the conversion and display
> processes.
>
> On Thursday, August 10, 2017 at 1:07:55 PM UTC-4, Mirco Hansen wrote:
>> Hi,
>>
>> is there a way to limit the number of OCR processes running at the same
>> time? If I upload lots of documents, the OCR in the background makes Mayan
>> mostly unusable, so I want to limit the number of background jobs.
>>
>> Thanks in advance.
>>
>> Regards
>> Mirco
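For completeness, here is a sketch of what applying the same flag to one of the other workers could look like. The queue name and worker node name below are assumptions following the pattern of the slow-worker command above, not commands copied from the Docker image, so adjust them for your installation:

```shell
# Hedged sketch: the same --concurrency=1 flag applied to a hypothetical
# conversion worker. The queue name ("converter") and node name
# ("mayan-worker-fast") are illustrative assumptions, not taken verbatim
# from the Mayan Docker image.
python /usr/local/lib/python2.7/dist-packages/mayan/bin/mayan-edms.py \
    celery --settings=mayan.settings.production worker \
    -Ofair -l ERROR \
    -Q converter \
    -n mayan-worker-fast.%%h \
    --concurrency=1
```

With --concurrency=1, each worker forks a single child process, so at most one task from its queues runs at a time; Celery's default is one process per CPU core. (The doubled `%%h` is an escape for contexts like supervisord config files; on a plain command line it would be `%h`.)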