Non-Root Terminal?
Is there a way to have it default to user abc instead of defaulting to root? I'd want to run claude code or other apps as anyone but root..
thanks!
PS - Github discussion mirror here: https://github.com/coder/coder/discussions/18914
What are you creating this issue for?
Can you share your template?
You bet!
Are you creating a non root user in your Dockerfile?
Are you changing to that user by using USER?
hmm, I believe it's been created
@Ryan Wiancko you just need to add a USER abc directive at the end of your Dockerfile
also, I would recommend using multiple RUN directives so that layer caching can do its job!
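The two suggestions above can be sketched together. This is a minimal example, not the actual template from the thread — the base image, package list, and UID are assumptions; only the username abc comes from the original question:

```dockerfile
# Sketch: create a non-root user "abc" and make it the default.
# Base image and installed packages are illustrative assumptions.
FROM ubuntu:22.04

# Separate RUN directives so layer caching can reuse unchanged steps
RUN apt-get update \
    && apt-get install -y --no-install-recommends curl ca-certificates git \
    && rm -rf /var/lib/apt/lists/*

# Create the non-root user with a home directory and shell
RUN useradd --create-home --shell /bin/bash abc

# Everything from here on (including the workspace terminal) runs as abc
USER abc
WORKDIR /home/abc
```

With USER abc as the last directive, the Coder agent and any terminal it spawns default to abc instead of root.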
also you can ditch the lscr.io/linuxserver/code-server:latest image and just use ubuntu/debian/other and use our code-server terraform module (https://registry.coder.com/modules/coder/code-server)
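Wiring the registry module into a template looks roughly like this. A minimal sketch, assuming the template already defines an agent named coder_agent.main; the version pin is illustrative — check the registry page for the current one:

```hcl
# Sketch: use the Coder registry code-server module instead of the
# lscr.io/linuxserver/code-server image. Assumes coder_agent.main exists.
module "code-server" {
  source   = "registry.coder.com/modules/code-server/coder"
  version  = "1.0.18" # illustrative; pin to the latest from the registry
  agent_id = coder_agent.main.id
}
```

The module runs code-server inside the workspace as the agent's user, which also sidesteps the root-by-default issue from the original question.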
and you might be interested in our Dev Container support as it makes managing tools easier (https://coder.com/docs/admin/templates/managing-templates/devcontainers)
I actually looked into Dev Containers and have it on my list! It just seems like too big of a lift at the moment, as I cannot begin to describe the nightmare of getting LLMs to set up Coder. Honestly, I've learned that HCL syntax is literally kryptonite for LLMs.
Even with Context7, getting Claude, Gemini, and GPT to work together to get first the main instance running, and then simple templates after that, was pure insanity. We actually developed entirely new AI development workflows just to solve this problem.
Obviously I could have just pinged you guys here asking for help and it would have been solved in 5 minutes, but it became my white whale: trying to get the LLMs into a state where they could work with Coder to create templates.
Are there particular use cases why you would want https://registry.coder.com/modules/coder/vscode-web vs https://registry.coder.com/modules/coder/code-server ?
sounds good
I cannot begin to describe the nightmare of getting LLMs to set up Coder. Honestly, I've learned that HCL syntax is literally kryptonite for LLMs.
can you elaborate on this? We really want to make the experience smooth and I am interested to learn from your feedback :-)
code-server is our (Coder) fork of the open-source version of VS Code that runs in the browser; it is older than vscode-web, which is Microsoft's official closed-source VS Code ported to the web.
It was just an endless cycle of the LLMs going in circles, repeating the same syntax errors, referencing non-existent documentation, taking three steps backwards for every one step forward.
It's nothing really to do with Coder, I don't think; maybe just a lack of training data.
Trying to get them to debug was an act of futility. At one point I had all three LLMs trying to solve the same problem, and Gemini resorted to hallucinating and calling Claude a liar to try to convince me it was right, when in fact Claude had the correct solution.
I find claude-opus to be the best atm on these types of tasks and I'm satisfied with Cursor's agent, but Roo Code seems like a better option for me
Tried open code?
i have not
That's what my current focus is: getting workspace templates set up with the gamut of frameworks like Roo Code, Claude Code, Open Code, Gemini CLI, etc.
I have had all the LLMs do retrospectives of every phase of trying to get Coder set up, documenting exactly where they went wrong and how they fixed it, if you're interested. I'll have to scrub them to make sure there's nothing sensitive, but it was the only way I was able to get it installed (via an LLM, in the end).
Honestly, it was the hardest thing I've ever had to get an LLM to do; it felt like I was being trolled most days because of how many times they would screw up the simplest things. Eventually I knew enough myself that I would catch them in the dumbest mistakes again and again and again.