High Memory Usage by service
Hey!
I ran a standalone Celery app, both worker and beat, from a single Dockerfile. Beat is configured to fire every 5 minutes. I am seeing a constant 900MB of memory usage. I am using python:3.12.1-slim as the base image and supervisor to run both processes. Is this normal?
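Roughly, the app looks like this (simplified sketch; the broker URL and the task body here are placeholders, not my exact code):

# tasks.py - simplified sketch of the setup described above
from celery import Celery

# placeholder broker URL; the real one comes from the environment
app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def heartbeat():
    # the real task just prints something when beat triggers it
    print("tick")

# beat schedule: fire the task every 5 minutes (300 seconds)
app.conf.beat_schedule = {
    "heartbeat-every-5-minutes": {
        "task": "tasks.heartbeat",
        "schedule": 300.0,
    },
}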
Project ID: 26d20592-d5d6-4fd1-bfb3-086415c85b58
21 Replies
900MB to run two things doesn't seem too crazy tbh
though I would highly recommend running your two apps as separate railway services instead of under one railway service with supervisor (not that it would save you memory, but it's just the recommended way)
Can I do that using the same github repo?
yes of course
and will Celery be able to detect its workers from a different service?
Any reference?
deploy from the same repo to two railway services, in one service set the start command to run your app, in the other service set the start command to run celery
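for example, if the two processes are the worker and beat (as ends up being the case here), and the Celery app object lives at tasks.app like the commands later in this thread suggest, the two start commands could be something like:

celery -A tasks.app worker -l INFO
celery -A tasks.app beat -l INFO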
I don't think it needs to, or cares, it just needs all the applicable environment variables
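concretely, something like this in the app module (REDIS_URL is just an assumed variable name, use whatever your broker actually exposes):

import os
from celery import Celery

# both services build the app from the same broker URL, so tasks that
# beat schedules land in a queue the worker is already consuming
app = Celery("tasks", broker=os.environ["REDIS_URL"])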
Okay, I'll try it.
So I need to configure beat and the workers in separate services.
yeah isn't communication done through a redis database?
Yes it is, sometimes I forget things. But the main point was that 900MB still seems high for a program which just prints something when beat sends it a signal.
It worked! Although the worker's memory still goes up to 700MB.
how many workers does your worker service spawn?
1
are you absolutely positive?
celery -A tasks.app worker -l INFO
I am not so sure anymore, you are making me nervous kek
seems like you may be letting celery decide how many worker processes it wants to spawn, look into the flag that lets you specify the exact number of worker processes
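for reference, the flag in question is --concurrency, e.g.:

celery -A tasks.app worker --concurrency=1 -l INFO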
I am trying with a concurrency of 1. Let's see.
It worked. The worker was spawning 10 processes by default; Celery's default concurrency is the number of CPU cores on the host, which came out to 10 here. I was surprised it's a fixed count rather than a min/max range.
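(side note: Celery does also offer a min/max range via the --autoscale flag, e.g. --autoscale=10,1 to scale between 1 and 10 processes with load)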
Memory usage is now 70 MB.
140 MB if we look at beat and the worker combined.
awesome!
Thanks for the direction! I thought this had something to do with the Dockerfile loading something big into memory.
just a simple flag :)
Yeah, you saved my wallet.
mind saying your final start command?
Solution
celery -A tasks.app worker --concurrency=1 -l INFO
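(the beat service keeps its plain start command, presumably celery -A tasks.app beat -l INFO)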