Aanand Prasad

Live Debugging with Docker

During the DockerCon 2016 keynote, I demonstrated a development workflow with Docker for Mac, going from a fresh laptop to a running app in no time. The especially cool part was when I live-debugged a Node.js app running inside a container from my IDE, despite having no Node.js runtime installed on my laptop. Here I’m going to show you how to do it yourself.

Here’s what you’ll need:

  1. Docker: I recommend Docker for Mac or Windows, which are in public beta.
  2. An IDE which supports Node.js remote debugging: I used Visual Studio Code.
  3. A Node.js application: I’ll create a simple one as part of this tutorial.


Example Application

Create a directory to work from:

$ mkdir node-example
$ cd node-example

To get our app running, we’ll need 5 files:

  • A JavaScript file to contain the actual app code
  • A package.json to define the npm dependencies
  • An HTML template
  • A Dockerfile to package the whole app in a container
  • A Compose file to set up a development environment. (The Compose file will also come in very handy if the app ever grows beyond a single container, but we won’t bother with that today.)

Create a file called app.js with the following code:

var express = require('express');
var expressHandlebars  = require('express-handlebars');
var http = require('http');

var PORT = 8000;

var LINES = [
    "Hey, now, you're an All Star, get your game on, go play",
    "Hey, now, you're a Rock Star, get the show on, get paid",
    "And all that glitters is gold",
    "Only shooting stars break the mold",
];

var lineIndex = 0;

var app = express();
app.engine('html', expressHandlebars());
app.set('view engine', 'html');
app.set('views', __dirname);
app.get('/', function(req, res) {
    var message = LINES[lineIndex];

    lineIndex += 1;
    if (lineIndex > LINES.length) {
        lineIndex = 0;
    }

    res.render('index', {message: message});
});

http.Server(app).listen(PORT, function() {
    console.log("HTTP server listening on port %s", PORT);
});

It’s a simple web server which just prints a different message back to the client with each request. It’s also got a bug in it, but we’ll worry about that later.

Next up, define the app’s main script and dependencies in package.json:

{
    "main": "app.js",
    "dependencies": {
        "express": "~4.14.0",
        "express-handlebars": "~3.0.0"
    }
}

Now the template file, in index.html:

<html>
    <head>
        <meta http-equiv="refresh" content="2">

        <style type="text/css">
            body {
                font-family: Helvetica, Arial, sans-serif;
                font-weight: 600;
                font-size: 56pt;
                text-transform: uppercase;
                text-align: center;
                background: #3c3;
                color: white;
            }
        </style>
    </head>

    <body>&ldquo;{{message}}&rdquo;</body>
</html>

As you can see from the <meta> tag, the page will refresh every 2 seconds. This saves us the trouble of manually refreshing it, making for a more laid-back viewing experience.

We’re almost done. Next up, the Dockerfile:

FROM node:5.11.0-slim

WORKDIR /code

RUN npm install -g nodemon

COPY package.json /code/package.json
RUN npm install && npm ls
RUN mv /code/node_modules /node_modules

COPY . /code

CMD ["npm", "start"]

This’ll get us a container image with Node.js and our application code, ready to run. As well as installing our application code and its dependencies, we’re installing a tool called nodemon, which watches your JS files and restarts the application whenever they change.
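
One thing to note in passing: the CMD relies on npm start, but the package.json we wrote earlier doesn't define a start script. That's fine for this tutorial, because the Compose file below overrides the command anyway, but if you ever want to run the image with its default command you'll probably want to add one. A minimal sketch (my addition, not part of the original files):

{
    "main": "app.js",
    "scripts": {
        "start": "node app.js"
    },
    "dependencies": {
        "express": "~4.14.0",
        "express-handlebars": "~3.0.0"
    }
}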

Finally, docker-compose.yml:

version: "2"

services:
  web:
    build: .
    command: nodemon --debug=5858
    volumes:
      - .:/code
    ports:
      - "8000:8000"
      - "5858:5858"

A few things are going on here:

  • It defines a service called “web”, which uses the image built from the Dockerfile in the current directory.
  • It overrides the command specified in the Dockerfile to enable the remote debugging feature built into Node.js. We do that here because when you ship this application’s container image to production, you don’t want the debugger enabled – it’s a development-only override (one way of keeping that split explicit is sketched just after this list).
  • It overwrites the application code in the container by mounting the current directory as a volume. This means that the code inside the running container will update whenever you update the local files on your hard drive. This is very useful, as it means you don’t have to rebuild the image every time you make a change to the application.
  • It maps port 8000 inside the container to port 8000 on localhost, so you can actually visit the application.
  • Finally, it maps port 5858 inside the container to the same port on localhost, so you can connect to the remote debugger.
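
To make the development-only nature of the command, the code mount and the debug port explicit, one option – a sketch of my own, not what the tutorial ships – is to move them into Compose's standard override file, docker-compose.override.yml, which docker-compose up reads and merges automatically:

# docker-compose.yml – production-safe base (hypothetical split)
version: "2"
services:
  web:
    build: .
    ports:
      - "8000:8000"

# docker-compose.override.yml – development-only extras, merged in by `docker-compose up`
version: "2"
services:
  web:
    command: nodemon --debug=5858
    volumes:
      - .:/code
    ports:
      - "5858:5858"

In production you would then deploy with just the base file (for example docker-compose -f docker-compose.yml up -d), so neither the debugger nor the code mount leaves your laptop. For this tutorial, though, the single docker-compose.yml above is all you need.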

It’s time to start the app:

$ docker-compose up

Docker Compose will build the image and start a container for the app. You should see this output:

Creating network "nodeexample_default" with the default driver
Creating nodeexample_web_1
Attaching to nodeexample_web_1
web_1  | [nodemon] 1.9.2
web_1  | [nodemon] to restart at any time, enter `rs`
web_1  | [nodemon] watching: *.*
web_1  | [nodemon] starting `node --debug=5858 app.js`
web_1  | Debugger listening on port 5858
web_1  | HTTP server listening on port 8000

The app is now running. Open up http://localhost:8000/ to see it in action, and take a moment to appreciate the poetry.
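
If you'd like to watch the cycling from the terminal as well, a few repeated requests will do it (this assumes curl is available; note that an open, auto-refreshing browser tab will interleave its own requests with yours):

$ for i in 1 2 3 4 5; do curl -s localhost:8000/ | grep '<body>'; done

Each request should come back with a different line of the song.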


It’s undoubtedly beautiful, but the problem is obvious: we’re outputting a blank message at the end before cycling back to the first line. It’s time to debug.


Remote Debugging

Open up the app directory in VSCode. Head over to the debugger by clicking the bug icon in the left-hand sidebar.

Create a boilerplate debugger config by clicking the gear icon and selecting “Node.js” in the dropdown.


A JSON file will be created and displayed. Replace its contents with the following:

{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Attach",
            "type": "node",
            "request": "attach",
            "port": 5858,
            "address": "localhost",
            "restart": true,
            "sourceMaps": false,
            "outDir": null,
            "localRoot": "${workspaceRoot}",
            "remoteRoot": "/code"
        }
    ]
}

There are three important changes here:

  • The whole “Launch” config has been deleted – you’re using Compose to launch the app, not VSCode, so it’s unnecessary.
  • restart is set to true, so that the debugger re-attaches when the app restarts.
  • remoteRoot is set to the path of the code directory inside the container, because it’s almost certainly different than the path to the code on your machine.

With the “Attach” config selected, click the “play” button to start the debugger.


Now go back to app.js and find the line that reads lineIndex += 1, just after we initialize the message variable. Set a breakpoint by clicking in the gutter, just to the left of the line number.


If your browser window is still open and refreshing, in a second or two you should see it hit the breakpoint. If not, go back and refresh it – VSCode will pop back to the front as soon as the debugger hits it.

Hit the Play button at the top to resume code execution. It’ll hit the breakpoint every time the browser refreshes, which is every 2 seconds. You can see it cycling through the lines, and then the bug shows up – right after the last line, message gets set to undefined.


The reason becomes clear if you open up the “Closure” section under “VARIABLES”: lineIndex has incremented to 4 – the length of the LINES array – when it should have been reset after getting to 3. We’ve got an off-by-one error.


Fixing the bug

Replace the > with >= in the conditional on the next line:
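
After the change, the cycling logic in app.js should read:

    var message = LINES[lineIndex];

    lineIndex += 1;
    if (lineIndex >= LINES.length) {
        lineIndex = 0;
    }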


Now save the file. A second or two later, you should see the debugger detach and then reattach (the yellow line highlighting the breakpoint will disappear and reappear). This is because several things have just happened:

  • Upon saving the file, Docker detected the filesystem change event and proxied it through to the container.
  • nodemon detected the event and restarted the application. You can confirm this by looking at your terminal: there should be a line that reads “restarting due to changes…” (see the sample output after this list).
  • Finally, VSCode detected that the remote debugger had gone away and reattached.
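
In the Compose terminal, that restart looks something like this (timing and exact output may vary):

web_1  | [nodemon] restarting due to changes...
web_1  | [nodemon] starting `node --debug=5858 app.js`
web_1  | Debugger listening on port 5858
web_1  | HTTP server listening on port 8000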

The debugger is now attached again. However, your browser tab might have errored out – go refresh it if so.

You can now step through the debugger once again and see that the lines cycle properly – no more undefined.


Remove the breakpoint and detach the debugger by clicking the stop button. Go back to the browser window and enjoy the updated experience.


You’re all done – congratulations! You can also watch how I did this on stage during the DockerCon 2016 keynote.


18 Responses to “Live Debugging with Docker”

  1. Vladimir Varankin

    Thank you for a great post. Looks very helpful to me.

    The only question I have is about your Dockerfile. What is the purpose of moving the installed node_modules from the code directory to the filesystem root? My guess is that it's to let the project's own node_modules from the host override the newly installed modules, but I still can't see the point of that.

    • Ganga

      Hi, I keep getting an error that it cannot find the express module when I run docker-compose up.

      • Matt Freeman

        I'm guessing you can work around this by using an entrypoint so WORKDIR is applicable.

        Untested, but something like this:

        WORKDIR /code
        ENTRYPOINT ["./entrypoint.sh"]
        CMD ["run-prod"]

        Then in entrypoint.sh

        #!/bin/bash

        if [ "$1" = "run-prod" ]; then
            exec npm start
        fi
        exec "$@"

        So the entrypoint script basically uses your command if not run-prod.

    • Christian

      Hi Vladimir Varankin,

      I have the same concern regarding why node_modules are being moved from the /code folder up to the root folder. It will work anyway I guess as node will try to get the modules from the parent folder but I don't get the reason why it's being done.

      I don't think it would be OK to debug using the host's node_modules. You might be debugging from a Windows host, and the node_modules need to be compatible with the actual OS running the app (in this case Docker, i.e. some Unix distro). You might have native modules that get compiled for a specific OS when doing npm install, and in order for the app to work, they should be the ones built during npm install inside the container.

      • Christian

        Ohh BTW, great post Aanand Prasad thanks for doing it! 🙂

      • Christian

        Vladimir Varankin, I think I finally figured out why the node_modules folder is being moved from the /code folder to the root folder. When you mount a volume from a host folder, that folder “overlays” any pre-existing content if the destination folder already existed inside the container, so the former content is not accessible until you unmount the volume (more info here: https://docs.docker.com/engine/tutorials/dockervolumes/#mount-a-host-directory-as-a-data-volume). Since the magic of this approach is being able to debug in real time, changing the code directly from the host, you don't want that volume to be unmounted – and that's why the node_modules folder is moved up a level, where Node will still be able to find it (more info on how Node resolves modules: https://nodejs.org/api/modules.html#modules_loading_from_node_modules_folders). Once again, note that it's important to use THAT node_modules folder and not another, as node_modules can be platform-specific (if you're using Windows on the host machine, the node_modules built on the host might not work correctly inside the container). To sum up, your /code folder on the host should not contain a node_modules folder, so that the running code inside the container uses the one in the parent folder – which does correspond to the “platform” where your code will finally run: inside a Docker container, i.e. some Unix-based OS.

  2. ljack

    Hi,
    saving the app in VSCode did not change app.js inside the container. Running on Windows….

    • Rich R

      Try changing the nodemon command in the docker-compose.yml file to include the -L option. This turns on the legacy file watching mode for nodemon. This worked for me using a windows host -> VM -> Docker setup.

      command: nodemon --debug=5858 -L

  3. Rafał Sztwiorok

    On Windows you have to share your drives with Docker. Go to the Docker settings and select the local drives you want to be available to your containers, otherwise it will not work.

  4. Augustine Correa

    Hi Aanand,

    Running your docker compose as-is results in the following

    Creating nodeexamples_web_1
    Attaching to nodeexamples_web_1
    web_1  | Usage: nodemon [nodemon options] [script.js] [args]
    web_1  |
    web_1  | See "nodemon --help" for more.
    web_1  |
    nodeexamples_web_1 exited with code 0

    • Augustine Correa

      The issue I had is related to Docker for Windows. Should be applicable to Docker for Mac too. The user has to go to the Settings and tick the drive on which the source code is stored.

  5. Stenio Ferreora

    Thanks for the walkthrough!

    I am running this on a Mac using Docker-machine. Two changes I had to make:

    1. add the "-L" parameter to the docker-compose command:
    " command: nodemon -L –debug=5858"

    2. change from 'localhost' to the IP of the docker-machine. This is shown when the machine is started.

    Thanks!

  6. John Rz.

    Awesome walkthrough, LOVE IT!

  7. Garth Boyst

    I haven't had any luck attaching to an Alpine container this way, only Slim.

    Anyone else try?

  8. Marcos

    Why are you repeating the WORKDIR path after setting it?

    Aren't you ending up placing your files in /code/code?

    • Marcos

      Update: I figured out that the dest paths are relative to the WORKDIR except when you use absolute paths, which is the case here.

