In the traditional workflow, when developers create software or an application, say a webshop or a new function for a banking back-end system, the program is handed over to operators, who follow the guidelines and carry out the installation on another server.
Operators and developers work independently: developers test the software on their own machines, and operators then reproduce that setup. For instance, a mail-sending system will initially be configured to send mail through a test mail server; later, when switching to the "real" mail server, the whole setup process has to be repeated.
In many industries, the scope of these duties is separated by strict rules. For example, in the financial sector it is especially important that developers do not have access to live systems. The two teams often communicate only through written descriptions because of geographical distance and differing work shifts.
This, however, can cause complications due to inaccurate or out-of-date descriptions, not to mention last-minute changes that may have been inadvertently left out of the text.
But there is a solution to this problem: a platform called Docker.
Essentially, the setup steps are recorded not on paper or in shared documents but in a file that can be read by computers. The operator simply has to run the application and needs far less knowledge of its internal structure. The running copy is called a Docker container.
The files that describe how to build the environment are called Dockerfiles.
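As a minimal sketch of what such a file looks like (the base image, package, and file names here are purely illustrative):

```dockerfile
# Start from a known base operating system image
FROM ubuntu:20.04

# Install the packages the application needs
RUN apt-get update && apt-get install -y openjdk-11-jre

# Copy the application into the image
COPY app.jar /opt/app/app.jar

# Command to run when the container starts
CMD ["java", "-jar", "/opt/app/app.jar"]
```

Every step that would otherwise live in a written installation guide becomes a line in this file, executed the same way on every machine.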
These files can also be kept in the version control system alongside the source code, which makes restoring an older version of the complete system much easier. The files describe only commands; sensitive data (such as database passwords) is supplied by the operators in a separate location.
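In practice this means secrets never appear in the Dockerfile itself; the operator can inject them when starting the container, for example from a file kept only on the server (the file path, image name, and variable name below are hypothetical):

```shell
# /etc/webshop/secrets.env lives only on the operator's server,
# outside version control, and might contain e.g.:
#   DB_PASSWORD=<the live database password>

# Start the container with the secrets supplied as environment variables
docker run --env-file /etc/webshop/secrets.env webshop:1.2.0
```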
This greatly reduces the risk: from development to the live system, everything runs through Docker, so a defect in the environment would already make the software fail on the developer's machine rather than surfacing only in production.
How does this all work in practice?
Let’s stick to the mail example.
Let’s say we’ve written a webshop application that sends customers an email. To run this application we need a runtime environment, which we define as a Docker environment.
In the first line of the file we specify which operating system image to start from, and on the following lines which packages to install. We also specify the other settings the server requires, such as database definitions and external connections. Some of these settings can be adjusted through environment variables at run time.
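Following that description, the webshop's Dockerfile might look like this (the base image, packages, paths, and variable names are assumptions for illustration, not a definitive setup):

```dockerfile
# First line: which operating system image to start from
FROM debian:bullseye

# Next: which packages to install
RUN apt-get update && apt-get install -y python3 python3-pip

# Install the webshop application itself
COPY webshop/ /opt/webshop/
RUN pip3 install -r /opt/webshop/requirements.txt

# Settings that differ between test and live systems are environment
# variables; the defaults point at the test mail server
ENV MAIL_SERVER=smtp.test.example.com
ENV MAIL_PORT=25

CMD ["python3", "/opt/webshop/main.py"]
```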
What is the operator’s job? On receiving the above specification, they simply replace the address of the test email server with the live system’s server, and the environment in which the application runs is created.
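The operator does not even need to edit the file: the same image can be started with the live mail server's address substituted through an environment variable (host names and the image tag are illustrative):

```shell
# Build the image once, from the Dockerfile the developers committed
docker build -t webshop:1.2.0 .

# Run it with the live mail server instead of the test default
docker run -e MAIL_SERVER=smtp.live.example.com webshop:1.2.0
```

The same image runs in testing and in production; only this one variable changes between the two.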
What does this mean for customers?
First and foremost, speed and accuracy. Endless hours are no longer spent hunting for errors or omissions in text-based documents, mistakes which are usually discovered only on the very last day before a system goes live.
This solution is much simpler from the operations side, too. Detailed knowledge of each server and application is no longer required. Just as the shipping container was a big breakthrough for the transportation of goods (different sizes and types of packages ceased to be a problem), Docker provides a similar benefit: different applications can be managed in a unified way.
Of course, this solution is not a cure-all. It is not worth setting up for simple projects. Moreover, there are cases where using Dockerfiles is not even possible. Since the essence of this technology is describing how to rebuild a server in a Linux environment, the Docker method cannot be used if the solution you have prepared cannot run on a Linux platform.
However, we definitely recommend it whenever it is complicated to create the environment.
It is also a good idea to use this technology for Java development when the installation targets an application server. Likewise, Docker is a great fit for micro-services: each service with its own life cycle should run as a separate Docker container.
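For micro-services, this one-container-per-service layout is often described in a Docker Compose file; a sketch (service names, images, and versions are invented for illustration):

```yaml
# docker-compose.yml: each service runs in its own container
# and can be updated and restarted independently
services:
  orders:
    image: shop/orders:2.1.0
  payments:
    image: shop/payments:1.4.3
  mailer:
    image: shop/mailer:0.9.2
    environment:
      - MAIL_SERVER=smtp.test.example.com
```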
A speciality of this approach is that although the initial phase of development and the integration of new infrastructure elements may take a little longer due to the continuous maintenance of the environment definition, this overhead is negligible compared to the whole process. Overall, the time spent on communication and debugging before installation is much shorter.
What’s more, the technology and the software which Docker provides is free.
In light of all this, I believe it is time to end the era of descriptions shared in Word documents when running big, complicated projects.