Building a Docker/Gulp-based development environment: Introducing DevPail

This is the first in a series of articles about building DevPail, my general-purpose development environment tooling.

Days gone by

Oh, how I long for the days of yore. Especially Tuesdays. Tuesdays of yore were the best. Back then, building software was super simple. You could just fire up an editor, write some code, and build your software like this:

$ vimacs main.c
$ gcc main.c
$ ./a.out

Every. Single. Time.

But then you wake up (or move on to 200-level classes), and the world is much more complex. Now, because your project has grown to multiple source files, in multiple languages, using multiple technologies, you need... (insert ominous music) Tooling.

What is this "tooling" of which you speak?

The short version (and that's all you're getting, folks; Google's got you for the long version) is: the software you use to write and manage your software. In the example above, this is the editor with which you write the code, and the compiler/linker package you use to translate the code into an executable. But we're way beyond that in today's world. Now, we also have:

  • linters to validate our code is syntactically correct and properly formed
  • testing frameworks to make sure our code does what we expect it to
  • development servers to preview our code's operation while we work
  • packagers to wrap up our code into neat little bundles we can send to others
  • task runners to manage all of that stuff
  • so much more, just waiting to guide you down another rabbit hole

And not just one of each of those, but one or more for each and every language we use. Oh, you say, you're only using one language. I doubt it. For example, for web work you're likely using at least three of HTML, CSS, JavaScript, Python, Ruby, and Go... well, you get the idea.

Please don't get pedantic about the definition of a language here; let's agree that these are at least dialects whose features and rules we have to learn.

What's the problem with tooling?

The issue, of course, is that each of these tools is itself software, and as such, undergoes the same churn as all software projects. We find and fix bugs. We add, revise and remove features. We replace some tools entirely in favor of something newer or more well-suited to what we're doing. And some fall into disrepair out of neglect or abandonment.

In a project started just a year ago, the tooling may have become unusable because it's not compatible with the latest version of some other tool, or the underlying platform. In projects even older, this is nearly guaranteed.

I've had the pleasure of having to tell a client that, yes, I can add that relatively small feature to their website, but revising the tooling will cost more than the feature itself. And, no, their previous developer didn't keep the tooling specs pinned to the versions which worked back then. This is NOT a happy conversation, for either of us.

In addition, every new platform, library and framework brings with it more tooling. So while the tooling from your last project may work, it'll need, at a minimum, some tweaks to work with the new stuff. And when you go back to a previous project and the tool flow you're used to isn't there, you either end up fighting your own tools or "investing" time retrofitting the new tools into the old workflow.

We need a bucket of tools we can carry with us

You know those 5-gallon bucket and apron combos people use to carry their tools from job to job? (Hint: check out the cover image for this article.) Everything's in there: tools, spare parts, old candy wrappers, everything! And the entire bucket, unaltered, moves from job to job with ease.

I want one of those for software tooling. So I built one, and I call it DevPail (yes, it's on GitHub). It's not a silver bullet for every possible werewolf, but it makes me happy.

Please note, the version available on GitHub, as of this writing, is very young, not feature-complete, and not well documented. I'm actively working on it. Patience is a virtue.

Let's walk through building it together!

Designing DevPail

The general philosophy is that we have three concerns in building our development environments, which we want to keep separate:

  1. All the infrastructure (e.g. NodeJS or Python) lives in the bucket, which can be moved easily from job to job -- and this is important -- without alteration. One ring to rule them all, so to speak.

  2. The project-specific tooling (e.g. node_modules or site-packages) is carried around with the bucket, like the apron. Separate, but connected.

  3. The project's code lives on the host system (well, probably in a source control repo somewhere, but that's picking nits).

The bucket: Docker

I tried using VMs to contain all my tooling for a project, but for my needs at least, that's like swatting a fly with a sledgehammer. I need a much lighter tool, which doesn't take two minutes to boot. Docker provides just what we need.

The apron: Gulp

I've been using Gulp for years, but I'm really tired of needing to copy/paste/tweak/cuss/tweak the gulpfile every time I start a new project. While I like Gulp in general, I want a pluggable solution which can be configuration-driven rather than strictly code-driven. Gulp, in its infinite coolness, can be made to meet that objective.

The tools: Everything else, on demand, as needed

Each project, naturally, has its own requirements for platform and tooling, so those should be easily configurable, ideally by setting a couple of entries in a package.json file.

Enough exposition, let's build

Let's start by creating a directory for our project, along with some subdirectories. Next, let's build a Dockerfile:

$ mkdir -p devpail/imagesrc/homedir
$ cd devpail
$ vimacs imagesrc/Dockerfile

We'll talk about that homedir entry below, and in the following installments of this series.

Choose a base image

DockerHub contains (no pun intended) images to meet needs both sublime and obscene... ok, wait, that's over the top. There are lots of images there, and you'll surely find one you like.

I've chosen one which gives us Python 3.8 and NodeJS 14 because that matches my most common deployment environments. This image includes a non-root user called pn whose home directory is, of course, /home/pn.

# syntax=docker/dockerfile:1
FROM nikolaik/python-nodejs:python3.8-nodejs14

The Dockerfile we're building here is a somewhat simplified version of what you'll see in the GitHub repo, for the sake of clarity.
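
If you want to double-check what that base image actually gives us before layering anything on top, a quick throwaway run does the trick. This assumes the base image doesn't define its own entrypoint; if it does, you can always override it with --entrypoint.

# confirm the Python and Node versions, and that the pn user exists
$ docker run --rm nikolaik/python-nodejs:python3.8-nodejs14 \
    bash -c 'python --version && node --version && id pn'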

Add Gulp to the image

We'll need the Gulp CLI, so we add that to the image. Not much to see here.

RUN npm install --global gulp-cli

Setup our container environment

We want to use bash rather than Docker's default, sh, for reasons which will become apparent later on.

We also want to be able to run any tools we've added via pip/poetry and npm/yarn, so we set up some environment variables which we'll leverage... later on, and update our PATH.

SHELL ["/bin/bash", "-c"]

ENV PYTHONROOT="/home/pn/app/site-packages"
ENV PYTHONPATH="${PYTHONROOT}/lib/python3.8/site-packages"
ENV NODE_PATH="/home/pn/app/node_modules/.bin"
ENV PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:${PYTHONROOT}/bin:${NODE_PATH}"
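
To make the intent of those variables concrete, here's roughly how per-project tools would land in, and be found from, those locations once we're inside a running container. The package names below (black and sass) are just placeholders to show where things end up, not a statement about how DevPail will ultimately manage installs:

$ pip install --prefix="$PYTHONROOT" black   # libraries land under $PYTHONPATH, scripts under $PYTHONROOT/bin
$ npm install --prefix /home/pn/app sass     # the sass CLI lands in /home/pn/app/node_modules/.bin
$ which black sass                           # both resolve via the PATH we just built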

Create our container directory structure

As stated above, our base image gives us a pn user, so we'll make our tools run in the container as that user. It's a best practice to avoid running anything as root, even inside a container.

We'll copy a few files we need from the homedir directory we made earlier into the pn user's home directory in the image, and ensure those files are owned by pn.

We will mount a volume to hold all the per-project tooling under /home/pn/app, and ensure that directory is writable by pn.

USER pn
COPY --chown=1000:1000 homedir/* /home/pn/
WORKDIR /home/pn/app
RUN chown 1000:1000 /home/pn/app

Make our container's server accessible

Our Gulp process will spin up one or more server processes, on ports 3000 through 3009. Yes, it's more than we likely need, but the cost is near zero, and it gives us lots of flexibility, so why not? Let's declare those ports so we can publish them to our host.

EXPOSE 3000 3001 3002 3003 3004 3005 3006 3007 3008 3009
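
One caveat worth knowing: EXPOSE on its own is really just documentation (plus a hint to Docker's tooling); it doesn't publish anything. To actually reach those servers from a browser on the host, we'll also publish the ports at run time with -p, something like:

$ docker run -it --rm -p 3000-3009:3000-3009 devpail   # plus the volume mounts we'll add shortly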

Run our development tooling

Finally, when we run the container, we (usually) want to run our default Gulp task.

ENTRYPOINT [ "/bin/bash", "-c", "gulp" ]
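
Since the entrypoint is baked into the image, it's worth knowing that docker run lets you override it when you just want a shell inside the container to poke around (we'll lean on this in a moment):

$ docker run -it --rm --entrypoint bash devpail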

Build and run the image

Building the image is pretty straightforward. We ask Docker to build the image described by the Dockerfile in imagesrc, and we tag (-t) that image as devpail.

$ docker build -t devpail imagesrc
... (lots of docker output) ...
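
Once the build finishes, you can confirm the image is available locally:

$ docker image ls devpail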

Running it will take a bit more work, so we'll walk through the command line bit by bit. Here it is:

$ docker run -it --rm -v myproject-tooling:/home/pn/app -v ~/myproject:/home/pn/app/src devpail
  • docker run tells docker to... um... run an image.
  • -i (interactive) and -t (tty), which we've abbreviated as -it, allow us to interact with the container. Basically, we're "shelling" into the container.
  • --rm (note the double-dash here) tells Docker to remove the container when it exits.
  • -v myproject-tooling:/home/pn/app mounts a named volume at /home/pn/app (~/app inside the container). This will hold our per-project tooling, keeping it nicely wrapped up.
  • -v ~/myproject:/home/pn/app/src mounts our project's directory. Replace this with your actual project directory.
  • devpail is the image we want to run.
$ docker run ...(as above)
[01:02:34] Local gulp not found in ~
[01:02:34] Try running: npm install gulp

We've built an image, and then run that image in a container. It didn't DO anything, but it's a good start.
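
If you'd like to quiet that "Local gulp not found" message right away, one manual stopgap (until this gets automated in a later installment) is to shell into the container, using the entrypoint override from earlier, and install a local Gulp into the tooling volume:

$ docker run -it --rm -v myproject-tooling:/home/pn/app --entrypoint bash devpail
$ npm install gulp   # run inside the container; this lands in the myproject-tooling volume
$ exit

The next run will then find a local Gulp (and complain about a missing gulpfile instead), which is exactly where we pick up next.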

In upcoming articles, we'll build our Gulp tasks, simplify running all those docker commands, and make DevPail a bit smarter. Not necessarily in that order. Stay tuned!
