Docker is a revolutionary tool that provides speed and repeatability for an embedded team. While traditionally used in non-firmware environments, there is enormous potential to improve the development efforts of firmware projects by making use of Docker.
The Problem
A few decades ago, you could submit your work as a batch job to a common environment shared by everyone. Everyone used the same tools and the same settings. As technology improved, the shared environment slowly gave way to personalized spaces: first as accounts on remote shared systems, and then on personal computers. From that point, you could customize your environment, change settings to fit your project or development style, and create shortcuts to speed up common tasks. As systems became more complex, there were ever more ways to customize a build environment: login scripts, environment variables, aliases, file extension associations, preferred tools, search path orders, plugins, and so on. You could download the latest version of a tool, or stick with an older version.
That all works great in theory until you consider that now everyone can have their own version of each tool. On a project, every developer should be using the same tool versions, but if a build requires 10 tools, it’s hard to ensure everyone’s environment matches. Mismatched tool versions can result in extra warnings on one developer’s machine but not another, build failures that aren’t consistent, or a program that crashes when built on one machine but runs fine when built on another. This eventually leads to many wasted hours troubleshooting, downloading the proper tool versions, and even setting up a new computer for a developer, since every tool version must match exactly.

The Solution
So, do we go back to mandating a single environment for every developer and project? Yes and no! Enter Docker.
Docker provides the ability to create a system-level image over which you have complete control. An image contains all the system components, tools, settings, libraries, and other dependencies the project needs. However, these images are not massive multi-gigabyte VM images; they’re small enough to be portable. The beauty of the smaller image size is that you can simply share a Docker image with everyone who needs the development environment for the project. The Docker image can be run locally on Windows, Linux, or macOS alongside other applications, or remotely on a CI server. By using a Docker image, the development environment can be the same everywhere, for every developer, at any time. A note on terminology: a Docker “image” is built from a “Dockerfile”, a text file containing the instructions for assembling that image. Once an image is running, it is referred to as a “container”. Since the rest of this post discusses the running image, it will be referred to as a container from here on.
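As a sketch of what this looks like in practice, a minimal Dockerfile for a firmware build environment might look like the following. The base image tag and package names here are illustrative assumptions, not a recommendation for any particular project:

```dockerfile
# Illustrative sketch: a small, reproducible firmware build environment.
# Base image tag and package selection are assumptions for this example.
FROM ubuntu:22.04

# Install the project toolchain in one layer so every developer and the
# CI server get the same tools from the same image.
RUN apt-get update && apt-get install -y --no-install-recommends \
        gcc-arm-none-eabi \
        make \
        cmake \
        cppcheck \
    && rm -rf /var/lib/apt/lists/*

# The project source tree will be mounted here at run time.
WORKDIR /work
```

Committing a Dockerfile like this to version control alongside the firmware source means the build environment is versioned with the code it builds.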
With a Docker container, every developer can still continue to develop within their customized environment, using their favorite tools and settings. When it’s time to build and test, they simply launch the Docker container and build their code inside of it. This works by passing the container the local path where your code is located. The container would ideally contain all your necessary toolchains (e.g. gcc, clang, cppcheck), and would build your code with those toolchains rather than with whatever happens to be installed on your PC. Docker is perfect for projects with longer life cycles. If tools change for newer development but legacy projects still need support, you can be sure the Docker images used on those legacy projects will still build the same as they did on day one. You can also create new images for new development and continue to verify that the code works in both the new and legacy images.
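One common way to wire this up is to bind-mount the project directory into the container and run the build there. The image name, mount point, and build command below are placeholders for this sketch:

```shell
# Build the image once from the project's Dockerfile
# (the tag "fw-build-env" is illustrative).
docker build -t fw-build-env .

# Mount the current source tree at /work and build inside the container;
# the toolchain in the image is used, not whatever is installed locally.
docker run --rm -v "$(pwd)":/work fw-build-env make all
```

Because the `-v` flag mounts the host directory into the container, build artifacts land back in the local source tree, so developers can keep using their own editors and debuggers on the results.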
Bonus Benefits
Maintaining a consistent development environment turns from a “nice to have” into a mandatory process when your product has to meet safety standards. For example, IEC 62304, the medical device software lifecycle standard, requires a formal, documented development process. By using Docker, you can document, track, and prove compliance with the standard. When it comes time to produce a patch for an anomaly reported from the field, it’s easy to launch the exact same development environment, even if the operating system and tools have been upgraded on the active development platform.
Conclusion
By creating a Docker image, a single, standard, preconfigured development environment becomes the basis for all developers. It provides one consistent set of development tools with known versions. Everything is integrated and independent of the computer on which it’s being run. This makes setup for a new or returning developer very easy: instead of the typical hours-long setup routine, it’s merely a short download before they can launch the container and be off to the races.
It’s not often a piece of software meant for developers really impresses. But Docker has been one case where the advantages it offers have a profound influence on improving the development process now and into the future.
And if you have questions about an embedded project you’re working on, Dojo Five can help you with all aspects of your embedded DevOps journey! We are always happy to hear about cool projects or interesting problems to solve, so don’t hesitate to reach out. You can book a call to discuss your project, and you can sign up for our Embedded DevOps platform for free.


