In Part I of this series, we took a look at creating Docker images and running containers for Node.js applications. We also looked at setting up a database in a container and how volumes and networks play a part in setting up your local development environment.
In this article we’ll take a look at creating and running a development image where we can compile, add modules and debug our application all inside of a container. This helps speed up the developer setup time when moving to a new application or project.
We’ll also take a quick look at using Docker Compose to help streamline the processes of setting up and running a full microservices application locally on your development machine.
Prerequisites
- Docker installed on your development machine. You can download and install Docker Desktop (see the Resources section at the end of this article).
- Sign up for a Docker ID
- Git installed on your development machine.
- An IDE or text editor for editing files. I would recommend VS Code.
Fork the Code Repository
The first thing we want to do is download the code to our local development machine. Let’s do this using the following git command:
git clone git@github.com:pmckeetx/memphis.git
Now that we have the code local, let’s take a look at the project structure. Open the code in your favorite IDE and expand the root level directories. You’ll see the following file structure.
├── docker-compose.yml
├── notes-service
│   ├── config
│   ├── node_modules
│   ├── nodemon.json
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
├── reading-list-service
│   ├── config
│   ├── node_modules
│   ├── nodemon.json
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
├── users-service
│   ├── Dockerfile
│   ├── config
│   ├── node_modules
│   ├── nodemon.json
│   ├── package-lock.json
│   ├── package.json
│   └── server.js
└── yoda-ui
    ├── README.md
    ├── node_modules
    ├── package.json
    ├── public
    ├── src
    └── yarn.lock
The application is made up of a couple of simple microservices and a front-end written in React.js. It uses MongoDB as its datastore.
In Part I of this series, we created a couple of Dockerfiles for our services and also took a look at running them in containers and connecting them to an instance of MongoDB running in a container.
Local development in Containers
There are many ways to use Docker and containers to do local development, and a lot of it depends on your application structure. We’ll start with the very basics and then progress to more complicated setups.
Using a Development Image
One of the easiest ways to start using containers in your development workflow is to use a development image. A development image is an image that has all the tools that you need to develop and compile your application with.
In this article we are using Node.js, so our image should have Node.js installed, as well as npm or yarn. Let’s create a development image that we can use to run our Node.js application in.
Development Dockerfile
Create a local directory on your development machine that we can use as a working directory to save our Dockerfile and any other files that we’ll need for our development image.
$ mkdir -p ~/projects/dev-image
Create a Dockerfile in this folder and add the following commands.
FROM node:12.18.3

RUN apt-get update && apt-get install -y \
  nano \
  vim
We start off by using the node:12.18.3 official image. I’ve found that this image is fine for creating a development image. I like to add a couple of text editors to the image in case I want to quickly edit a file while inside the container.
We did not add an ENTRYPOINT or CMD to the Dockerfile because we will rely on the base image’s ENTRYPOINT and we will override the CMD when we start the image.
Let’s build our image.
$ docker build -t node-dev-image .
And now we can run it.
$ docker run -it --rm --name dev -v $(pwd):/code node-dev-image bash
You will be presented with a bash command prompt. Now, inside the container we can create a JavaScript file and run it with Node.js.
Run the following commands to test our image.
$ cat <<EOF > index.js
console.log( 'Hello from inside our container' )
EOF
$ node index.js
Nice. It appears that we have a working development image. We can now do everything that we would do in our normal bash terminal.
If you ran the above Docker command inside of the notes-service directory, then you will have access to the code inside of the container.
You can start the notes-service by simply navigating to the /code directory and running npm run start.
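For example, a quick session inside the container might look like this (a sketch; running npm install first is optional if node_modules already exists, but it ensures any native modules are built for the container’s environment, and the start script comes from the service’s package.json):

# inside the dev container
$ cd /code
$ npm install
$ npm run start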
Using Compose to Develop locally
The notes-service project uses MongoDB as its data store. If you remember from Part I of this series, we had to start the Mongo container manually and connect it to the same network that our notes-service is running on. We also had to create a couple of volumes so we could persist our data across restarts of our application and MongoDB.
In this section, we’ll create a Compose file to start our notes-service and MongoDB with one command. We’ll also set up the Compose file to start the notes-service in debug mode so that we can connect a debugger to the running Node process.
Open the notes-service in your favorite IDE or text editor and create a new file named docker-compose.dev.yml. Copy and paste the following into the file.
version: '3.8'

services:
  notes:
    build:
      context: .
    ports:
      - 8080:8080
      - 9229:9229
    environment:
      - SERVER_PORT=8080
      - DATABASE_CONNECTIONSTRING=mongodb://mongo:27017/notes
    volumes:
      - ./:/code
    command: npm run debug

  mongo:
    image: mongo:4.2.8
    ports:
      - 27017:27017
    volumes:
      - mongodb:/data/db
      - mongodb_config:/data/configdb

volumes:
  mongodb:
  mongodb_config:
This Compose file is super convenient because now we do not have to type all the parameters to pass to the docker run command. We can declare them in the Compose file instead.
We are exposing port 9229 so that we can attach a debugger. We are also mapping our local source code into the running container so that we can make changes in our text editor and have those changes picked up in the container.
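The command: npm run debug line in the Compose file refers to an npm script defined in the service’s package.json. As a rough sketch (not necessarily the exact script in the repository), assuming nodemon is used to restart the process on file changes and the Node.js inspector listens on all interfaces so it is reachable from outside the container, it might look like this:

"scripts": {
  "start": "node server.js",
  "debug": "nodemon --inspect=0.0.0.0:9229 server.js"
}

The important detail is --inspect=0.0.0.0:9229: binding the inspector to 0.0.0.0 (rather than the default localhost) on port 9229 is what makes the debugger reachable through the port mapping we declared above.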
One other really cool feature of using a Compose file is that we have service resolution set up to use the service names. So we are now able to use “mongo” in our connection string. The reason we can use “mongo” is that it is the name we gave the MongoDB service in the Compose file.
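To make that concrete, here is a minimal sketch (not the actual notes-service code) of how a Node.js service might read the connection string from the environment and connect to the “mongo” service using the official MongoDB driver:

// sketch only — the real notes-service wires this up through its config directory
const { MongoClient } = require( 'mongodb' )

// "mongo" resolves to the mongo service defined in the Compose file
const uri = process.env.DATABASE_CONNECTIONSTRING || 'mongodb://mongo:27017/notes'

async function connect() {
  const client = await MongoClient.connect( uri, { useUnifiedTopology: true } )
  return client.db()
}

module.exports = { connect }

Because both services are attached to the same default network that Compose creates, no extra --network flags or manual container linking are needed.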
Let’s start our application and confirm that it is running properly.
$ docker-compose -f docker-compose.dev.yml up --build
We pass the --build flag so Docker will build our image before starting the containers.
If all goes well, you should see something similar to this:
Now let’s test our API endpoint. Run the following curl command:
$ curl --request GET --url http://localhost:8080/services/m/notes
You should receive the following response:
{"code":"success","meta":{"total":0,"count":0},"payload":[]}
Connecting a Debugger
We’ll use the debugger that comes with the Chrome browser. Open Chrome on your machine and then type the following into the address bar.
about:inspect
The following screen will open.
Click the “Open dedicated DevTools for Node” link. This will open the DevTools that are connected to the running Node.js process inside our container.
Let’s change the source code and then set a breakpoint.
Add the following code to the server.js file on line 19 and save the file.
server.use( '/foo', (req, res) => {
return res.json({ "foo": "bar" })
})
If you take a look at the terminal where our compose application is running, you’ll see that nodemon noticed the changes and reloaded our application.
Navigate back to the Chrome DevTools and set a breakpoint on line 20 and then run the following curl command to trigger the breakpoint.
$ curl --request GET --url http://localhost:8080/foo
💥 BOOM 💥 You should have seen the code break on line 20 and now you are able to use the debugger just like you would normally. You can inspect and watch variables, set conditional breakpoints, view stack traces and a bunch of other stuff.
Conclusion
In this article we took a look at creating a general development image that we can use pretty much like our normal command line. We also set up our Compose file to map our source code into the running container and exposed the debugging port.
Resources
- Getting Started with Docker
- Best practices for writing Dockerfiles
- Docker Desktop
- Docker Compose
- Project skeleton samples