Containers, huh, good god
What is it good for?
Probably something?
- Edwin Starr (disputed)

Here at Headspring, we're seeing more and more usage of Docker for local development. Having not really touched Docker or containers, I wanted to understand how Docker could make our lives easier, whether for local development, our CI/CD pipeline, production, or anything else.

I hadn't touched containers mainly because I didn't really have the problems I thought they solved. I've done automated builds and deployments and scripted local dev setup for over a decade now, so I wasn't sure how containers could streamline what I saw as an already fairly streamlined process. At my previous company, now over 10 years ago, we were doing blue-green deployments well before they had that name, for a large (billions of dollars in yearly revenue) e-commerce website, scripting out VM deployments before PowerShell was even around to help. That, and I don't do any Linux work, nor do any of my clients.

That being said, I wanted to see what containers could do for me.

Learning containers

First and foremost was really understanding what containers are. I'm not a Linux person, so when I hear "containers are just cgroups and namespaces" I'm already behind because I don't know what cgroups or namespaces are. It also doesn't help that evidently containers aren't actually a thing. So...yeah.

And on Windows, there are things called containers, but like Linux, it's not really a thing either, and there are different flavors.

Luckily, between the Docker website and the Windows website, there are tons of great resources for learning Docker and containers. Looking at our current build/deploy pipeline, I wanted to build a picture of what the Docker terms mean for how we typically build apps.

Semi-not-really-accurate Docker terminology

Docker has quite a few terms, and its world is similar to, but not the same as, what I'm used to. The mapping I had in my head is roughly:

Dockerfile =/= "build.ps1"
Image =/= "foo.exe"
Container =/= "$ foo.exe"
Docker-compose =/= "foo.sln"
Docker Hub =/= NuGet server

Our projects have a build script that produces a runnable instance of the app. It's not quite "F5", but it's more or less equivalent to the build instructions in a Dockerfile. A built image is kind of like our built app, and a container, as a running image, is similar to a running instance of our app.
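As a sketch of that analogy, a minimal Dockerfile for a .NET app might look something like this. The image tags, paths, and the Foo.dll name are illustrative assumptions, not from an actual project:

```dockerfile
# Hypothetical Dockerfile, roughly playing the role of build.ps1:
# restore/build/publish the app, then describe how to run the result.
FROM microsoft/dotnet:2.1-sdk AS build
WORKDIR /src
COPY . .
RUN dotnet publish -c Release -o /app

# The published output baked into this image is the "foo.exe" of the
# analogy; running the image produces a container, the "$ foo.exe".
FROM microsoft/dotnet:2.1-aspnetcore-runtime
WORKDIR /app
COPY --from=build /app .
ENTRYPOINT ["dotnet", "Foo.dll"]
```

Building this with `docker build` yields the image, and `docker run` gives you the running container, which lines up with the image/container rows in the table above.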

A solution pulls together many different projects and applications into a single "run" experience, similar to a docker-compose configuration. And finally, we use Octopus Deploy with NuGet packages as our build artifacts, similar to how Docker uses Docker Hub as an image registry.
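To stretch the solution analogy a bit further, a docker-compose.yml groups services into one "run" experience the way a .sln groups projects. A hedged sketch, where the service names, ports, image, and credentials are all my own placeholders:

```yaml
# Hypothetical docker-compose.yml: runs the web app and its database
# together, the way foo.sln pulls multiple projects into one F5.
version: '3'
services:
  web:
    build: .            # built from the Dockerfile in this directory
    ports:
      - "8080:80"
    depends_on:
      - db
  db:
    # Image name is an assumption; pick whatever SQL Server image
    # matches your host (Windows or Linux containers).
    image: microsoft/mssql-server-windows-developer
    environment:
      SA_PASSWORD: "Pass@word1"
      ACCEPT_EULA: "Y"
```

A single `docker-compose up` then plays the role of opening the solution and hitting F5.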

Semi-quasi-sorta-probably-not-accurate equivalent terminology, but close enough for my mental model.

Pipeline before Docker

I wanted to understand how Docker could help our development, but first, let's look at what my typical CI/CD pipeline looks like today:


In this picture, I'm using some sort of Git source control server, Bitbucket, VSTS, whatever. For local development, it's usually just Visual Studio (or Code) and a database. The database is usually SQL Express, sometimes SQL Server Developer Edition if there are additional things going on.

Because we can't assume everyone has the same instance name, our app would use environment variables for things like instance names/connection strings so that two developers wouldn't need to have the exact same environment.
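A minimal sketch of that pattern, assuming hypothetical variable names (APP_DB_SERVER and APP_DB_NAME are mine, not from a real project): the app reads the pieces from the environment and composes its own connection string, so each developer points at their own local instance.

```shell
# Each developer sets these once for their machine; nothing is
# hard-coded in the app or checked into source control.
export APP_DB_SERVER='.\SQLEXPRESS'
export APP_DB_NAME='AppDb'

# At startup, the app (or a launch script) builds the connection
# string from the environment:
CONNECTION_STRING="Server=${APP_DB_SERVER};Database=${APP_DB_NAME};Integrated Security=true"
echo "$CONNECTION_STRING"
```

On Windows the same idea works with `setx` or user-level environment variables; the point is that the environment, not the code, carries the machine-specific values.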

For continuous integration, we'll use AppVeyor or VSTS for cloud builds, or TeamCity or Jenkins for on-prem builds. Rarely, we'd use TFS on-prem, but it's an option.

The output of our build is an artifact, whose design matches our deployment needs. For Octopus Deploy, this would be one or more versioned NuGet packages, or with VSTS it'd just be the build artifact (typically a zip file).

Finally, for continuous delivery (promoting packages between environments), we default to Octopus Deploy. We're looking at VSTS too, which works somewhat similarly. You promote immutable build artifacts from environment to environment, with environment-specific configuration usually stored in the CD pipeline configuration. Deployments target Azure, AWS, or bespoke on-prem servers.

Everything is automated end-to-end: local setup, builds, and deployments.

Given this current state, how could containers help our process, from local dev to production?

In the next few posts, I'll walk through my journey of adding Docker/containers to each of these stages, to see where it works well for our typical clients, and where it still needs work.