Celery worker using a lot of memory
Hello, I'm using the Django, Celery, Redis & Postgres template. When I run it locally, the worker uses around 900 MB of memory just idling, but when I deploy it to Railway it uses around 3 GB while idling. If anyone has any insight into this issue, it'd be much appreciated. I've played around with Celery's --max-memory-per-child setting, but that doesn't seem to make a difference.
It must be something with the deployment configuration, the start command, or something besides the code base itself, because it doesn't use 3 GB of memory when running locally.
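For reference, here's roughly what I've been trying as the worker start command (a sketch; `app` stands in for my actual Celery app module, and `--max-memory-per-child` takes its limit in KiB):

```bash
# Recycle a pool worker process once it exceeds ~500 MB resident memory.
# This caps per-child growth but doesn't change how many processes are spawned.
celery -A app worker --loglevel=info --max-memory-per-child=512000
```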
Thanks in advance.
[Template link: Deploy Django, Celery, Redis & Postgres on Railway — full Django/Postgres stack with Celery tasks and Redis as cache/queue]
Project ID: c9822e71-ad02-4e28-9f58-0fd9f2ed99b0
Here's what I mean by memory usage locally vs. deployed
Solution
You want the --concurrency flag instead. I'd say setting it down to 2 initially is a safe bet, then do some testing; if that's not enough, bump it. (By default, Celery starts one pool process per CPU core, so a bigger deploy machine means more worker processes and more idle memory.)
https://docs.celeryq.dev/en/stable/userguide/workers.html#concurrency
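For example, a start command along these lines (a sketch reusing the same placeholder `app` module; on Railway this would go in the service's custom start command):

```bash
# Limit the prefork pool to 2 worker processes instead of the per-core default.
celery -A app worker --loglevel=info --concurrency=2
```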
Thank you so much! That solved the issue!
Awesome!