Containers? Docker? Kubernetes?

Egon Fiedler
6 min read · May 17, 2021

No great solution is born unless there is a problem; a solution without a problem is pointless.

Not the shipping kind of container; if that is what you were looking for, this is not your article. But if you get the metaphorical meaning, then yes, this article was meant for you.

So there is a problem inherent to the computer revolution and its constant change. Ever more powerful chips keep being born, the number of devices interconnected by the internet keeps growing, innovation happens at every corner of the economy in every possible industry, and the way we live keeps changing every day. All this change is impossible to ignore at the micro level either, the software engineer level, since engineers are the ones fuelling this transcendent change in how the world works: constantly creating tools, software, and new devices, updating programming languages, discontinuing hardware, letting old languages die and creating new ones, deprecating software, upgrading hardware, changing jobs, scaling to new business needs, changing databases, creating new jobs and disposing of old ones. And while all of this happens, you need whatever you made to run smoothly anywhere it's needed, or else, well, you can imagine. Sadly, it's not like you can say your dog ate the computer.

Well, it seems that while technology evolved, dogs' abilities did too. Regardless, that excuse would be even worse, since there are GitHub and other version control alternatives where you can store your code.

So the solution was born: virtualization, the process of running a virtual instance of a computer system in a layer abstracted from the actual hardware. It is a dedicated machine where the operating system, libraries, and other programs are unique to the guest virtualized system and unconnected to the host operating system that sits below it.

Now I will explain how this concept is used in a container, what a container is worth, and how Docker made containers so popular.

A container is an isolated process that shares the same Linux kernel as the host operating system, along with the libraries and other files needed to run the program inside the container. This allows a developer to package up an application with all the parts it needs, such as libraries and other dependencies, and ship it all out as one package.
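A quick way to see the "shared kernel" point for yourself: the kernel release reported inside a container matches the host's, because a container is just an isolated process, not a separate machine. This is a minimal sketch; the `alpine` image is only an example, and the Docker step is skipped when Docker is not installed.

```shell
#!/bin/sh
# A container shares the host's Linux kernel, so `uname -r` inside a
# container reports the same kernel release as the host itself.
echo "Host kernel: $(uname -r)"

# Guarded so the script still runs on machines without Docker.
if command -v docker >/dev/null 2>&1; then
    # Run `uname -r` inside a throwaway Alpine container.
    echo "Container kernel: $(docker run --rm alpine uname -r)"
else
    echo "Docker not installed; skipping the container check."
fi
```

On a Linux host with Docker, both lines print the same kernel release, which is exactly why containers are so much lighter than full virtual machines.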

This allows for complex multi-system environments to be fully modeled on a laptop. Multiple machines can now be a safe default because these multiple, separate “machines” can all be trivially instantiated on a laptop.

They also run much faster, since unlike traditional virtualization, the process is essentially running natively on its host, just with an additional layer of protection around it.

So the development environment for a complex, multi-system application stack can now be reliably and repeatedly installed on a single computer, and any changes to any part of the environment, or all of it, can be easily shared among the whole team so that everyone can rebuild identical environments quickly.

So then, where does Docker fit into the picture?

Docker is an open-source project that made the concept of containers popular. In the end, for an idea to be anything, it must be built, and for it to change the world, it must be easy to use, easy to understand, and better than what is available for its particular task. Hence, Docker!

The revolution started in 2013!

It is a command-line tool for programmatically defining the contents of a Linux container in code, which can then be versioned, reproduced, shared, and modified as easily as if it were the source code to a program.
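Here is a minimal sketch of what "a container defined in code" looks like. A `Dockerfile` is just a text file that lives in version control next to your source; every name in it below (the `python:3.9-slim` base image, `app.py`, port 8000) is an assumption for illustration, not a prescription.

```shell
#!/bin/sh
# Write out a tiny example Dockerfile.
cat > Dockerfile <<'EOF'
# Start from a small, hypothetical Python base image.
FROM python:3.9-slim
# Set the working directory inside the container.
WORKDIR /app
# Copy the (assumed) application code into the image.
COPY app.py .
# Document the port the app is assumed to listen on.
EXPOSE 8000
# Command executed when the container starts.
CMD ["python", "app.py"]
EOF

# With Docker installed, the image would be built and run like so:
#   docker build -t myapp:1.0 .
#   docker run --rm -p 8000:8000 myapp:1.0
```

Because the whole environment is captured in this one file, rebuilding the exact same image on a teammate's laptop is a single `docker build` away.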

This has sparked interest in microservice architecture, a design pattern for developing applications in which complex applications are broken down into smaller, composable pieces that work together. Each component is developed separately, and the application is simply the sum of its constituent components. Each piece can live inside a container and can be scaled independently of the rest of the application as the need arises.
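To make that concrete, here is a hedged sketch of a Docker Compose file wiring two made-up services together (the service names, images, and port are all assumptions for illustration). Each service runs in its own container, and each can be scaled on its own.

```shell
#!/bin/sh
# Write an illustrative docker-compose.yml describing two small services.
cat > docker-compose.yml <<'EOF'
services:
  web:                       # front-end service, one container
    image: myorg/web:1.0     # hypothetical image name
    ports:
      - "8080:8080"
    depends_on:
      - api
  api:                       # back-end service, scalable on its own
    image: myorg/api:1.0     # hypothetical image name
EOF

# With Docker Compose installed, the whole stack starts with:
#   docker compose up -d
# and the api service alone can be scaled independently:
#   docker compose up -d --scale api=3
```

The application really is "the sum of its components" here: the Compose file is nothing but a list of the pieces and how they connect.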

"Great, this sounds amazing," I'm guessing you are thinking. But like any great solution, it also creates room for better solutions, and while Docker has its own answer in the form of Docker Swarm, it is not as good as the last topic in this article: Kubernetes.

So now it's time to return to the freight-cargo metaphor. Yes, there was actual metaphorical worth in it, beyond being the inspiration for Docker's logo. For cargo to be useful, things must be transported: something must be in the container, and that something must reach somewhere, and all of it must work perfectly, like a well-conducted orchestra.


You want your microservices to be working smoothly!


Full disclosure: I haven't used either Docker Swarm or Kubernetes, so by no means take my opinion as anything other than what seemed like the most reliable pieces of information, glued together to make sense of it all. That said, Kubernetes seems to be the better solution for orchestrating multiple containers.


After doing my research, my approach became gluing multiple opinions together into a more complete one, which hopefully brings clarity to whoever is reading this and wants to understand what's up with containers, from the underlying reasons all the way to the highest level of implementation. Thanks to Microsoft Azure and OpenShift for the information I needed to understand Kubernetes.

This is what Kubernetes is!

As applications grow to span multiple containers deployed across multiple servers, operating them becomes more complex. Kubernetes is open-source orchestration software that provides an API to control how and where those containers will run.

It allows you to run Docker containers and workloads, helping you tackle some of the operational complexity of scaling multiple containers deployed across multiple servers.

Kubernetes runs on-premises in your own datacenter, in a public cloud, or in a hybrid cloud configuration, deploying containers the same way every time. It orchestrates a cluster of virtual machines and schedules containers to run on those virtual machines based on their available computing resources and the resource requirements of each container.

Containers are grouped into pods, the basic operational unit for Kubernetes. These pods can be scaled to your desired state and you are able to manage their lifecycle to keep your apps up and running.
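A hedged sketch of what "scaled to your desired state" looks like in practice: the Deployment manifest below (the name `myapp`, the image, and the resource numbers are all invented for illustration) asks Kubernetes to keep three replicas of a pod running, and the resource requests are what the scheduler uses to place each pod on a machine with room for it.

```shell
#!/bin/sh
# Write an illustrative Kubernetes Deployment manifest.
cat > deployment.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp
spec:
  replicas: 3                  # desired state: three identical pods
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
      - name: myapp
        image: myorg/myapp:1.0 # hypothetical image
        resources:
          requests:            # used by the scheduler to place the pod
            cpu: "100m"
            memory: "128Mi"
EOF

# With access to a cluster, the desired state is applied with:
#   kubectl apply -f deployment.yaml
# and changed later with, for example:
#   kubectl scale deployment myapp --replicas=5
```

Note that you declare the state you want, not the steps to get there; Kubernetes continuously works to make reality match the manifest.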

Kubernetes also automatically manages service discovery, incorporates load balancing, tracks resource allocation, and scales based on compute utilization. It also checks the health of individual resources and enables apps to self-heal by automatically restarting or replicating containers.
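The self-healing part can be sketched with a liveness probe (the `/healthz` path and port 8000 are assumptions for illustration): if the probe keeps failing, Kubernetes restarts the container on its own, with no operator involved.

```shell
#!/bin/sh
# Write an illustrative pod spec with a liveness probe.
cat > probe.yaml <<'EOF'
apiVersion: v1
kind: Pod
metadata:
  name: myapp-pod
spec:
  containers:
  - name: myapp
    image: myorg/myapp:1.0   # hypothetical image
    livenessProbe:           # container is restarted if this check fails
      httpGet:
        path: /healthz       # assumed health endpoint
        port: 8000
      initialDelaySeconds: 5
      periodSeconds: 10
EOF

# Applied to a cluster with:
#   kubectl apply -f probe.yaml
```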

Hope you learned something and that this article was instructive to you! If you liked it, clap for my article and share it.



Join my journey to grow as a writer and software engineer. If you like my content follow this link https://egonfiedler.medium.com/membership