Celery worker using a lot of memory

Hello, so I'm using the Django, Celery, Redis & Postgres template, and when I run it locally the worker uses around 900 MB of memory just idling. However, when I deploy it to Railway it uses around 3 GB of memory while idling. If anyone has any insight into this issue, it'd be much appreciated. I've played around with Celery's max-memory-per-child setting, but that doesn't seem to make a difference.
It must be something in the deployment configuration or start command rather than the code base itself, because it's not using 3 GB of memory when running locally.
Thanks in advance.
Railway template: Deploy Django, Celery, Redis & Postgres on Railway. Full Django/Postgres stack with Celery tasks and Redis as cache/queue.
Solution:
You want the concurrency flag instead. I'd say setting it down to 2 initially is a safe bet, then do some testing; if that's not enough, bump it. https://docs.celeryq.dev/en/stable/userguide/workers.html#concurrency...
Percy · 3mo ago
Project ID: c9822e71-ad02-4e28-9f58-0fd9f2ed99b0
whittles4402 · 3mo ago
c9822e71-ad02-4e28-9f58-0fd9f2ed99b0
whittles4402 · 3mo ago
Here's what I mean by memory usage locally vs. deployed
Solution
Brody · 3mo ago
You want the concurrency flag instead. I'd say setting it down to 2 initially is a safe bet, then do some testing; if that's not enough, bump it. https://docs.celeryq.dev/en/stable/userguide/workers.html#concurrency
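For anyone landing here later, here's a minimal sketch of what that looks like, assuming the default prefork pool; the app name "myproject" and the broker URL are placeholders. The prefork pool spawns one child process per CPU core by default, so a deployed host that exposes more cores than your local machine runs more children and sits at higher idle memory. You can either pass the flag in the start command (e.g. `celery -A myproject worker --concurrency=2`) or pin it in the app config:

```python
# Minimal sketch of a Celery app with pinned concurrency.
# "myproject" and the Redis URL are placeholders for your own values.
from celery import Celery

app = Celery("myproject", broker="redis://localhost:6379/0")

# The prefork pool defaults to one child process per CPU core, so capping
# it keeps idle memory flat regardless of how many cores the host exposes.
app.conf.worker_concurrency = 2

# Optional: recycle a child after it exceeds ~200 MiB (value is in KiB).
app.conf.worker_max_memory_per_child = 200_000
```

That's also why max-memory-per-child didn't help here: it only replaces a child process after it finishes a task and exceeds the limit, so it does nothing about how many processes are sitting idle.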
whittles4402 · 3mo ago
Thank you so much! That solved the issue!
Brody · 3mo ago
Awesome!