Docker provides a one-step setup. Then you’re up and running with a full-fledged development environment: the same OS, tools, and configuration for each person on the team. A consistent setup provides consistent, repeatable results.
A few decades ago, you could submit your work as a batch job to a common environment, shared by everyone. Every detail you could control was specifically included as part of that task. If you needed a specific setting, there it was in the project itself. Everyone used the same tools, the same settings. As technology improved, the shared environment slowly started moving toward personalized spaces; first as accounts on remote shared systems, and then on personal computers.
You could now customize your environment, change settings that fit your project or development style, and create shortcuts to speed up common tasks. As the systems became more complex and you added more languages to your repertoire, you had more and more ways of customizing your environment. There were login scripts, environment variables, aliases, file extension associations, preferred tools, search path orders, plugins, etc. You could download the latest version of a tool, or stick to that older version you preferred.
This was great. Now you had everything you wanted, and it was perfectly configured. When a new project came along, you only needed to focus on the project. The environment was already in place. In fact, everyone on the project, or anyone who later retrieved your work, could do this, too! Everyone was happily working in their own custom environment, tuned to make them as productive as possible.
And that’s when the trouble started.
For the most part, everyone seemed to generate the same results. Mostly. Maybe an output file was a little bit longer. Or a project generated a warning for one developer but not anyone else. But then there were the outright failures. Someone included a reference to a tool that wasn’t installed on anyone else’s machine. Or an old project that just needed some updating required an older version of a library, and that version wouldn’t run on your recently updated operating system. Countless hours were spent tracking down differences between development environments to explain inconsistencies that showed up during testing and deployment, downgrading tools due to incompatibilities, and configuring brand-new computers from a long list of installation software just to set them up as development machines.
So, do we go back to mandating a single environment for every developer and project? Yes! But no! Enter Docker.
Docker provides the ability to create a system-level image over which you have complete control. An image contains all the system components, tools, settings, libraries, and other dependencies the project needs. However, these images are not the massive multi-gigabyte VM images of yore; they’re small enough to be portable. The beauty of this feature is that you can simply share a Docker image with everyone who needs to use the development environment for the project. The image can be run locally on Windows, Linux, or macOS alongside other applications, and it can be run remotely on a CI server. Now the development environment can be the same everywhere, for every developer, at any time.
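As a minimal sketch of what such an image might look like, here is a small Dockerfile that pins a build environment. The base image, package names, and layout are illustrative assumptions for an embedded project, not details from any particular team’s setup:

```dockerfile
# Illustrative sketch: base image and package choices are assumptions.
FROM ubuntu:22.04

# Install a fixed, known toolchain so every developer builds with the
# same compiler and build tools, regardless of what's on their host.
RUN apt-get update && apt-get install -y --no-install-recommends \
        gcc-arm-none-eabi \
        cmake \
        make \
    && rm -rf /var/lib/apt/lists/*

# All builds happen in a predictable location inside the container.
WORKDIR /workspace
```

Because the toolchain is baked into the image rather than installed on each machine, "works on my machine" differences largely disappear: the image itself becomes the documented, shareable definition of the environment.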
Every user can continue to develop within their own customized environment, using their favorite tools and settings. When it’s time to build and test, they simply launch the Docker image and build against it to confirm that their changes work in the agreed-upon standard environment. And it gets better still. Docker is perfect for projects with longer life cycles. If tools change for newer development but legacy projects still need to be supported, you can be sure that the Docker images used on those projects won’t have changed. You can also create new images with the newer environment for new development, and continue to confirm that code works in the legacy environment as well as the new one.
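The edit-on-the-host, build-in-the-container workflow described above might look like this in practice. The image name, mount path, and build command are illustrative assumptions:

```shell
# Build the shared image once (or pull it from a team registry).
# "project-buildenv" is a hypothetical image name.
docker build -t project-buildenv .

# Mount the local source tree into the container and run the build
# there, so results come from the agreed-upon environment, not the
# host. --rm discards the container afterward; the image is unchanged.
docker run --rm -v "$(pwd)":/workspace project-buildenv make all
```

Because the container is discarded after each run, every build starts from the same clean, versioned environment, which is exactly the repeatability the shared batch systems of old provided.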
Maintaining a consistent development environment turns from a “nice to have” into a mandatory process when your product has to meet standards to be approved for sale. For example, IEC 62304, the medical device software life cycle standard, requires a formal, documented development process. By using Docker, you can document, track, and prove compliance with the standard. When it comes time to produce a patch for an anomaly reported from the field, it’s easy to launch the exact same development environment, even if the operating system and tools have been upgraded on the active development platform.
By creating a Docker image, a single, standard, preconfigured development environment becomes the basis for the whole team. It provides one consistent set of development tools with known versions. Everything is integrated. Everything about it is independent of the computer on which it’s being run. Everything is ready to go for a new or returning developer. Instead of the typical hours-long setup routine, a new developer needs only a short download before launching the container and being off to the races.
It’s not often that a piece of software meant for developers really impresses. But Docker is one case where the advantages are profound: it improves on what we had available before, and it points the way forward.
And if you have questions about an embedded project you’re working on, Dojo Five can help you with all aspects of your devops for embedded journey! We are always happy to hear about cool projects or interesting problems to solve, so don’t hesitate to reach out and chat with us on LinkedIn or through email!