Running one-off commands in a Docker container
Super weird to me that Railway doesn't support simply SSHing into a VM.
Anyway, I'm deploying a Django project. Django has something called management commands, and it's imperative for me to be able to use them.
Locally, I do it with:
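(Presumably the standard Django invocation; createsuperuser is just an example command, not necessarily the one used here:)
python3 manage.py createsuperuser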
But since I'm using a containerized environment, I do it like this:
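(Presumably via the repo's run wrapper mentioned later in the thread, which executes manage.py inside the compose web container:)
./run manage createsuperuser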
I thought I could do the same remotely with railway link.
When I do:
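(Presumably something like the following, where seed_data stands in for the actual data-population command, which isn't named in the thread:)
railway run ./run manage seed_data   # hypothetical management command name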
The command seemingly runs successfully. But I know it didn't run on the Railway production instance, because it's supposed to populate DB rows, and nothing like that happened in the Railway environment.
Am I out of luck?
Project ID: 87f6d50b-7bab-488e-b802-02f9edc442e3
Even running:
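(Presumably the superuser command through the same wrapper, e.g.:)
railway run ./run manage createsuperuser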
Only creates the superuser locally (it's super weird that this command affects both local and production environments).
So, I just don't see a way to use Railway if we can't run commands remotely on the VM.
I don't see myself sharing commands between local and prod environments (not that the above is working).
I guess this is rooted in the fact that run in my case does these actions in a container. Even if I removed that and just made it run python3 manage.py createsuperuser, it would fail because it's executed against the local source code, and locally I'm running my Python interpreter in a docker-compose container.
OK, so I went as far as logging into the docker container locally, installing the railway CLI inside the container, and, from within the container, doing railway link and trying to run:
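(Likely the same one-off command, though the exact invocation is an assumption:)
railway run python3 manage.py createsuperuser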
But I get:
This can't be a misconfig because the app is able to read and write normally to the DB.
Railway does not use VMs; they run your code in ephemeral containers.
railway run runs the given command locally but with the linked service variables available, so all you would need to do is run railway run python manage.py createsuperuser locally in your project's folder after linking to the appropriate service; no need to be inside a container or anything complicated
Thanks @Brody. I'd suspected this is what's happening, and that it won't work with the way my repo is laid out.
The entire Python env is contained in a web container in my case, so I guess when I run my script, it always logs into that instead of using the remote env vars.
yeah, if you had a local .env, the variables from that would overwrite the environment variables set "inside" railway run
Any workarounds that come to mind? I’d really like to keep “duplication for the purpose of deployment” to a minimum.
stop using .env files and store your environment variables in railway instead
Hmmm, huh?
Are you suggesting I also hold my local (development environment) ENV variables on Railway as well?
absolutely, why not? but those would be stored as service variables under a development environment in your railway project
you use the cli to switch environments, then you run your app locally through railway run, then switch back to production to create a superuser account on the production database
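In CLI terms the flow Brody describes might look roughly like this; the environment names are assumptions, and the local run command is just a generic, non-containerized example:
railway environment development            # switch the linked environment to development
railway run python3 manage.py runserver    # start the app locally with the dev variables injected
railway environment production             # switch back to production
railway run python3 manage.py createsuperuser   # one-off command against the production variables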
Well I don’t think it’s horrible? Except it couples me to your platform a little
But I might just try to go with that.
Or maybe there’s a magic way to copy the python env from inside the Docker container ugh…
it does, but you can then just export the entire service variables as a .env file if you want to leave the platform
I think I'm gonna give this a try. Can I create an environment without any services?
nope, there needs to be services, otherwise you have no place to store your variables
Wait. Doesn’t this force me to pay for Postgres + Redis + Web needlessly? Because right now I’m running those locally with docker compose
no, you don't need to actually run them, they can just be there as skeleton services
hmm nice, I think. How would I go about initializing a copy of my production ENV?
in fact with the new environment changes you may be able to just have a single blank service that holds your service variables
from the project's settings, under the environments tab
this is gonna be awesome if it works nicely enough
well this is why railway run and railway shell exist
Do I now need to deploy in order for the variables to take effect?
just hold your alt key down while pressing deploy so the variables are saved
Sorry for the dumb question, is that option or control on Mac? also, will this not spin up the service?
lol let me go Google a picture of a Mac keyboard
yeah it's the option key as far as I can tell
the alt deploy is just supposed to save variables, but you can always go remove the source if you want before you save
Yeah, so now I have those envs, the same as the local .env file. I do railway link and choose local, but unfortunately it doesn't seem like it's able to pass the variables to docker compose up --build.
ah I see what you're doing, you may need to be inside the container that runs your app then, or just use compose to run the databases and run your app on your computer
Sorry, and I appreciate your patience, but I understood neither option.
If I understand you correctly:
1. This involves setting up a non-dockerized Python env locally. Not an option here, because I wouldn't want all the duplication.
2. If I need to occasionally SSH into a local container to run some one-off commands, I'm fine with that, but this is what I've been trying to do since noon.
seems like you understood both options just fine, don't underestimate yourself!
Yeah, except I've been trying to do #2 for a good few hours. Maybe I'm missing something there.
so try option 2 again, but this time, make sure there is no .env file to be found, and link to the production service
oooo
Should I be using railway shell?
or just link to the correct service/environment?
Option 2 isn't gonna work: in order to run ./run manage createsuperuser, the web container should be up locally. But the web container can't spin up locally without the .env file.
if you have multiple commands you want to run, sure, but for one-off commands use railway run
though with either command you will need to be linked to the desired project, environment and service
you've gotten yourself into quite the pickle with how you're doing things
personally I'd just use docker to run the databases, then run the app locally in a venv, it's far less complex that way
I'm basing my project off of this excellent repository for deploying Django w/ docker compose: https://github.com/nickjj/docker-django-example
I guess our item of interest is the run file (https://github.com/nickjj/docker-django-example/blob/main/run).
The issue here isn't with how the project is set up, it's just that Railway doesn't support docker-compose, and I'm trying to force it to work still.
@Brody we both know Docker isn't needed to run Redis/DB, as that's usually the simpler part (which you don't deploy with compose anyway in production, unless you self-manage).
modify that run file to have the cli dynamically create a local .env file?
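A minimal sketch of what that might look like, as a hypothetical function added to the run script (the function name is made up, and it assumes the railway CLI is installed and already linked to the desired environment and service):
refresh_env() {
  # regenerate the local .env from whatever Railway environment/service is currently linked
  railway variables --kv > .env
}
It could then be invoked however the run script dispatches its other subcommands, before bringing compose up.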
Wait, if I do railway link and link to the local environment with the designated service, and then run echo $SECRET_KEY: if all env variables were successfully populated from the railway local environment, it should output my local env secret key, no?
Because that's empty
Interesting, it's only there if I enter railway shell. Let's try again now.
railway run/shell isn't going to make an .env file, you need to run something completely different
but the run file doesn't depend on an env file mostly, and I commented out the parts in docker-compose that do.
I'm trying to pass the env variables from the railway local env
something like railway variables --kv > .env
So railway link local-service, then docker compose up --build doesn't see the variables. Even a simple echo $SECRET_KEY returns empty. If I do railway shell, however, echo $SECRET_KEY does work. But for some reason docker compose up --build still complains.
because it's looking for a physical .env file right?
the docker-compose file? no, I removed all references to that
then how will it know what environment variables to pass into the container
hence my suggestion here, build this into the run file some way so it's automated
OK, confused on how that'll help us achieve stuff. But will try
unless you know of another way to pass environment variables into a container run by docker compose?
no, I appreciate your help.
Unfortunately this is not working (incorporating railway variables --kv > .env). The container is built not by the run script, but simply by running docker-compose up --build; by the time we want to run railway run ./run manage.py createsuperuser, the envs are already baked in.
why don't we want the envs to be baked in?
because these are coming from the .env file.
we had the cli make that env file, I'm not seeing an issue
the end goal is to get the correct set of environment variables (the production environment variables) into your container that's running django so that you can shell into the container and create a superuser
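A hedged sketch of that end goal, assuming the compose service is named web (as mentioned earlier in the thread), that compose still reads a .env file, and that the database is reachable from the local machine:
railway link                                    # link to the production environment and service
railway variables --kv > .env                   # bake the production variables into the file compose reads
docker compose up --build --detach
docker compose exec web python3 manage.py createsuperuser   # run the one-off command inside the running 'web' container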
Railway wants to run createsuperuser against a local container, running with env vars that are baked in (with the local env_file), but it wants to reference production env vars. I don't think it's possible.
Unless you're telling me I need to spin up containers referencing production's env vars whenever I want to change anything with production?
so something like:
once you have the production environment variables baked into the local container you don't need to use the railway cli inside the local container
Let's try this
unfortunately since you have quite the complex setup, yes.
most people just have their project files and use the cli right in the project
I'm just gonna do:
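(Presumably the command suggested above:)
railway variables --kv > .env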
And then try to run the commands.
I mean I was just guessing on that syntax to create a .env file, have you tried it?
looks good
yeah but does it create an env file lol
yes, I meant the file looks good
ah cool
however, after doing railway link with production, running docker compose up --build says no service selected.
did you break something with your local docker compose setup?
Doesn't look like it
I don't know why it complains specifically after doing railway link
docker compose seems to disagree
oh wait
OK, I was missing profiles. But unfortunately this still isn't working.
that's the output of the superuser command inside the django container?
nah, that's trying to spin up local containers with PROD env vars
that's what we want
yeah except we want the container to not exit 😦
it's not able to establish connections. I guess not having a public DNS for my railway postgres has something to do with it?
print the environment variables inside the container
any dns resolver can resolve the public proxy domain
I don't have a public proxy for it
you removed the tcp proxy on the database?
yes, I'm using private networking
definitely would have been worth mentioning... a long time ago
Very sorry about this, how would this affect anything else in the process except for this step?
Also, I thought it's the most obvious thing to do with a DB.
because it's a private network you can't connect to the database locally if it's not exposed publicly; you will need to expose it publicly just for this operation
I didn't explicitly remove any proxy or anything, I only removed the public DNS.
OK, I appreciate your help very much.
It seems I need to rethink stuff here. Either I change the project structure, or move off to another host.
To open my DB to the outer world and go through all the hoops every time I need to do this seems a bit too much.
Very much appreciated @Brody !
no problem!
@Brody one idea that sprang up just now: what if I comment out the dependency on the env_file temporarily?
Then, write a bash script to build with docker compose in a way that would pass the PROD env vars one by one: docker compose … -e FOO=BAR -e BAR=BAZ
I still have to expose the database momentarily. What do you think?
that looks like it would achieve the exact same thing railway variables --kv > .env does but 10x harder, and you still need to expose the database publicly
Well, if Railway doesn't provide access to the "ephemeral container", the decision is either to leave, or to open it temporarily. No matter what I do.
Ok, 3rd option: spin up a virtual env specifically for talking with Railway. Non-dockerized, nothing. Just has the env vars.
the 3rd option is the best
I personally don't see the need to run the django app in a docker container for local development when you can use a venv
still need to connect to the database publicly though
you did say you didn't remove a tcp proxy, so if you're correct in saying that, then it's still publicly available (though with a password)
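A minimal sketch of that third option, assuming a requirements.txt exists and the database is reachable over the public TCP proxy:
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt                 # assumes dependencies are listed in requirements.txt
railway link                                    # link to the production environment and service
railway run python3 manage.py createsuperuser   # one-off command with the production variables injected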
I didn’t remove a proxy explicitly. I did remove the public HTTP domain.
And even after enabling it back, it said that Railway couldn’t connect.
databases don't have http domains
show me what the public networking section of the database's settings says please
yeah you removed the tcp proxy
If by TCP proxy you mean the public networking domain, that’s true.
It was an HTTP domain
it wasn't lol, it was a tcp proxy
I stand corrected
@Brody 20 minutes after opening the TCP proxy, Railway is still not able to connect. Is that a sign of trouble?
Anyway, I'm trying to export POSTGRES_DB=[New Proxy URL], and then run python3 manage.py createsuperuser. No dice.
Feel like I'm really close though
The env variables are populated properly in railway shell
And regarding what the UI asks me:
Doesn't work; it complains that the port -p is not an int, since it isn't.
The PGPORT env var is an empty string
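For reference, connecting directly over the TCP proxy with psql would look something like this; the host, port, password, and database name are placeholders to copy from the database service's settings, not real values:
PGPASSWORD=<password> psql -h <proxy-host> -p <proxy-port> -U postgres -d <database>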
I can't psql into the postgres instance anyway, using the public proxy TCP address.
Oh interesting, I spun up a new postgres database just to see what has gone wrong. And it seems like the proxy setting looks different.
@Brody please tell me I can open a proxy back after having removed it...
OK, found it's just a button to the right. Clearly have been trying to make this work for too long.
Success.
yeah you can add and remove the proxy at any point
also, what tool are you using to edit your screenshots with those nice little tooltips?
@Brody The very awesome Shottr, indispensable! (and free, though I found it so useful I felt compelled to pay).
oh mac app 😦
One more reason to switch
ha, like I have that kind of money
don't feel bad. macOS and Apple have been going downhill for the last 4-5 years or so.
so has windows
ShareX has that on Windows
I'll check it out, I currently use greenshot