Like most developers, I like to keep my workstation clean. This is a quick rundown of how you can have a working *dev* setup, specifically for web apps, on Windows 10, macOS, and Linux. Sorry, BSDs...
## Things you need
1. [Docker](https://www.docker.com/products/docker-desktop)
2. [VS Code](https://code.visualstudio.com/)
3. An SSH client (Optional)
You probably already have the first two installed, or know how to install them. The last requirement is also available out of the box on most Linux distros and on macOS.
Linux users are required to add their regular user to the Docker user group:
$ sudo usermod -aG docker $USER
For this change to take effect you need to log out and sign back in.
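To confirm the group change took effect (a quick sanity check, not part of the original steps), try running a throwaway container without sudo:
$ docker run --rm hello-world
If you see the "Hello from Docker!" message without a permission error, you are good to go.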
## Why use [Remote - Containers](https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.remote-containers)?
The Remote - Containers extension allows you to focus on your ideas and not the environment. Start developing directly inside a container, with a fully functional editor, i.e., VS Code. Its integrated shell also lets you use the container as a functional Linux environment. Install the extension by visiting [this page](https://code.visualstudio.com/docs/remote/containers).
You can start by simply pulling a Docker image of your choice, spin up a container, and use VS Code to start editing files within that container.
No need to install dozens of packages on your host system, nor will you have several dozen Docker images cluttering your workspace as you tweak and fiddle with Dockerfiles. Only when you have a working prototype of your app should you consider creating a Dockerfile to package it.
You can even use base OS images like Alpine or Ubuntu, if you want.
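For example, you could spin up a plain Ubuntu container as a disposable sandbox (the container name `sandbox` here is just an illustration):
$ docker run -dit --name sandbox ubuntu
$ docker exec -it sandbox bash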
## Getting started
1. With the extension installed, let's create a container named *dev0* using the official Node.js image from Docker Hub (the flags used here are explained after these steps):
$ docker run -dit --name dev0 -p 3000:3000 node
2. Next, open VS Code. With the extension installed, you will see a small green icon at the bottom-left corner of the window.
3. Clicking it shows several options; select "Attach to Running Container":
This is followed by selecting the proper container name. In our case, this is *dev0*.
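If the `docker run` flags above are new to you, here is the same command from step 1 with a quick annotation of what each flag does:
$ docker run -dit --name dev0 -p 3000:3000 node
# -d            run the container in the background (detached)
# -it           keep STDIN open and allocate a TTY so the container keeps running
# --name dev0   give the container a fixed, memorable name to attach to
# -p 3000:3000  publish container port 3000 on host port 3000 (used by the app below)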
## Your New Environment
A new instance of VS Code will open. If you now open the integrated terminal (keyboard shortcut Ctrl+\`), it will drop you into a shell inside the container.
Since we are using a Node.js container, it already has node and npm available. Let's start a small project:
$ mkdir app
$ npm init
# Keep hitting Return to accept the defaults and reply 'yes' when prompted
$ npm install --save express
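If you would rather skip the prompts entirely, `npm init` also accepts a `-y` flag that takes all the defaults:
$ npm init -y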
Create a file `index.js` in the project directory, and try out this simple "Hello, world" snippet that uses the Express framework:
// A minimal Express "Hello, world" server
const express = require('express')
const app = express()
const port = 3000
// Respond to GET requests on the root path
app.get('/', (req, res) => res.send('Hello World!'))
// Start listening on port 3000 (the port we published with -p 3000:3000)
app.listen(port, () => console.log(`Example app listening on port ${port}!`))
Using the integrated terminal, run the above code:
$ node index.js
The result can be seen at [http://localhost:3000/](http://localhost:3000/). You can now continue to work on your app and use localhost:3000 to access its contents.
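You can also test the endpoint from the integrated terminal, assuming `curl` is available in the image (it ships with the default Debian-based node image):
$ curl http://localhost:3000
Hello World!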
If you want to open a new directory `/foo/bar`, run the following command from the VS Code integrated terminal inside the container:
$ code /foo/bar
This opens another instance of VS Code with `/foo/bar` as the workspace. You can invoke VS Code from inside the container `dev0`!
## Side Note
If you open a VS Code workspace in a specific folder, say the `/root/app` directory, then delete the container and create a new one to connect to via VS Code, it will try to reopen the `/root/app` directory.
Since that directory no longer exists, the remote session will be rendered unusable.
At the time of this writing, the Remote - Containers extension is still in preview, and hopefully this bug will be resolved in future updates. For now, you can mitigate the issue by creating whatever directory VS Code is expecting, such as `/root/app`:
$ docker exec dev0 bash -c "mkdir -p /root/app"
It's not the tidiest solution, but it does circumvent the issue.
## Bind Mounts
If you have an existing project that you want to test inside a running container, you can do that with VS Code as well. The same extension lets you set up bind mounts so you can access parts of the host filesystem from within the container.
For example, if you have a directory *~/Desktop/app* on your host system, you can start by:
1. Clicking on the same green icon and selecting "*Remote-Containers: Open Folder in Container...*".
2. Selecting the folder you want to open.
3. When prompted, giving Docker the necessary permissions to access the host filesystem.
4. Picking one of the container images offered by VS Code; this generates a `.devcontainer` configuration for the folder (see the sketch after this list).
5. Start hacking!
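For reference, the generated configuration lives in `.devcontainer/devcontainer.json`. A minimal sketch, assuming a Node.js project and the same port used earlier, might look like this (the image tag and port are assumptions, adjust them to your project):
{
  // Hypothetical minimal config; adjust image and ports as needed
  "name": "app",
  "image": "node:lts",
  "forwardPorts": [3000]
}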
**There are a few caveats, however:**
1. The bind-mount workflow offers only a limited set of container images, curated by VS Code, to choose from.
2. If you are on Windows, you need to configure VS Code to use Unix-style line endings (i.e., LF) and a compatible character encoding such as UTF-8 or ASCII; a settings sketch follows below.
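A minimal sketch of the relevant settings (these are standard VS Code settings; put them in your workspace settings.json):
{
  // Use LF line endings and UTF-8 encoding for files in this workspace
  "files.eol": "\n",
  "files.encoding": "utf8"
}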
## Moving Forward
If the above workflow appeals to you, there is more! The extension is still in preview, and it should become more functional and stable with each update it receives.
[Send pull requests](https://github.com/Microsoft/vscode-remote-release), report issues and don't forget to have fun!
*This article was originally published on* [*https://appfleet.com/blog/minimal-dev-environment-vs-code-docker-3/*](https://appfleet.com/blog/minimal-dev-environment-vs-code-docker-3/) *and has been authorized by Appfleet for a republish.*
Nice! You describe my setup very well. Those tools are really the only software I really use. Well plus browsers to test things and a few random tools like Atmel Flip to program AVR microcontrollers, or PuTTY to complete some missing Windows functionality. But the C compiler for that microcontroller and the complete toolset are run in Docker. For web projects, it’s seriously ideal, the entire environment stays isolated and easy to reproduce on a server.
Basically I have what you mentioned plus games on Steam.
I'd like to add one container that I use in my dev setup: jwilder/nginx-proxy. With it, you can easily invent non-existent domain names (e.g. local.myproject.com) and run them all on port 80 without having to shut down dev environments to run multiple setups.
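For anyone who hasn't tried it, a rough sketch of the usual setup (the domain and `my-dev-image` are placeholders):
$ docker run -d -p 80:80 -v /var/run/docker.sock:/tmp/docker.sock:ro jwilder/nginx-proxy
$ docker run -d -e VIRTUAL_HOST=local.myproject.com my-dev-image
# nginx-proxy watches the Docker socket and routes requests for local.myproject.com
# to the second container; the hostname still needs to resolve to 127.0.0.1 (e.g. via /etc/hosts)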
“like most developers, I like to keep my workstation clean”
Lol, wut?
Joking aside, vscode on docker is pretty nice.
Interesting this might work well with my nvim setup too.
useful, but probably meant “cd app” after mkdir
I don’t suggest an “attach to running container” workflow. It has several drawbacks.
If you have an existing codebase, then just do the steps below. If you are starting fresh, then just create an empty project on github with a standard .gitignore for Node.
1. Press F1 and start typing: "Remote-Containers: Clone Repository in Container Volume"
2. When prompted, paste in the url to your repository. I use the https:// url and not the ssh version. https works better on Windows.
3. Choose "unique volume". This will isolate your code and provide the best performance, since bind mounts are slow by comparison.
4. Pick your container base image from the list: “node.js.”
5. Wait while it builds.
6. It will add a .devcontainer folder with a Dockerfile. From VSCode, just commit these new files and push to github. You’re done.
After you close VSCode, you can reopen the project from your recent files menu.
Also, you don’t need to mess with line endings, or anything.
An even more minimal development environment would be to use code-server. Only a browser is needed then!
Very interesting. I just got a new computer and was thinking about this earlier today. I’m going to dig in and see if I could apply this to my existing projects. Thanks!
I’ve been using this for the last year and I **absolutely love it**.
It’s my favorite way to dev. I’ve yet to find any containers that _can’t_ be used with this.
VSCode feels unobtrusive and performant, docker gives me a wide variety of software with very little work on my part. And if you screw it up you can issue a rebuild of the stack with two clicks.
I strongly recommend using the docker-compose option, as it lets you spin up an entire stack at once, and you can still use the Dockerfile you had originally. VSCode will provide a shell to the service listed in devcontainer.json (in this example, web)
Also, you can (and probably should) use named volumes to improve the filesystem performance.
Example:
# docker-compose.yml
version: "3"
services:
  ####################################################
  # Web app
  ####################################################
  web:
    # Using a Dockerfile is optional, but included for completeness.
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "80:80"
    volumes:
      # This is where VS Code should expect to find your project's source code
      # and the value of "workspaceFolder" in .devcontainer/devcontainer.json
      - ..:/var/www/html
      - vendor:/var/www/html/vendor
      - node_modules:/var/www/html/node_modules
  ####################################################
  # Redis
  # (No need to track ip, your web service can connect to redis:6379)
  ####################################################
  redis:
    image: "redis:alpine"
    ports:
      - "6379:6379"
volumes:
  # Named volumes to speed up dependency download and reduce clutter in project folder
  vendor:
  node_modules:
And in devcontainer.json:
{
  "name": "project",
  "dockerComposeFile": "docker-compose.yml",
  "service": "web",
  "appPort": [80],
  "workspaceMount": "src=${localWorkspaceFolder},dst=/var/www/html,type=bind,consistency=cached",
  "workspaceFolder": "/var/www/html"
}
It can be a little tricky to set up at first, if you’re not familiar with docker, but it’s absolutely worth it.
I've heard of this workflow before, but I've never tried it. Since you edit the code inside the container, won't it disappear when the container is restarted? How do you sync the source code from the container to the git repository?