dheerajturaga commented on PR #56457:
URL: https://github.com/apache/airflow/pull/56457#issuecomment-3752640247

   @jscheffl , I was able to scale test apache-airflow-providers-edge3 3.0.0rc1 
(this pull request), and here are my findings. 
   
   I loaded the edge worker with 100 concurrent tasks and saw memory 
utilization spike to 5 GB, then slowly shrink as tasks finished. This is 
significantly better than Celery, which jumps to 20+ GB at 100 concurrent 
tasks! 
   
   Also, memory consumption grows with the number of tasks queued on the 
worker: with 100 tasks running and 400 tasks queued on the same worker, memory 
spikes to 13 GB. 
   
   Overall, this is a significant improvement! Thanks for implementing it! 
   
   I also pushed the boundaries with 400 parallel tasks and saw memory 
consumption of 18 GB. However, some tasks started to fail due to socket 
limitations. I think we can safely claim a concurrency of 100 if the machine 
allows.
   

