On the importance of DevOps

Andrei Dascalu
4 min read · Sep 17, 2021


too many silos

Initially published on Dev.To

Before we get down to the subject, I need to clarify that my take on “DevOps” is not about DevOps engineers or DevOps as a role, but the “original” meaning of the term when it was coined: a practice that brings ops and development together so that development teams can own the full lifecycle of an application, from inception to deployment.

## The Problem ##

I will (once again) use PHP as an example, mostly because it poses some interesting challenges from this perspective. It’s also related to a recent experience on a project I’ve been involved with.

The challenge is fairly straightforward.

You’re working on an application. You write your code, unit tests and so on. Once your code passes code review and your coverage (unit tests, integration and so on) is good enough, you press a button: the application is deployed to a staging environment, subjected to a set of automated tests and, on success, it’s already in production!

It’s pretty great, and it’s how roughly half of the projects I’ve worked on in the past few years operate.

Then, say, you’re developing a feature that requires interaction with RabbitMQ. You need the amqp extension made available in your development stack (which is managed by a team much like the one managing the production infrastructure) as well as in production. You can’t just click to deploy anymore, since your code will break without the extension. Or perhaps you’re fixing something that also requires changes to OpCache, memory allocation or any other PHP-related configuration.
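To make that hard dependency concrete, here’s a minimal sketch of what such code might look like with the pecl amqp extension. The host, credentials and exchange name are made up for illustration; the point is that without ext-amqp loaded, the AMQP* classes simply don’t exist and this fails at runtime:

```php
<?php
// Minimal sketch (illustrative values): publishing a message via the pecl
// amqp extension. If ext-amqp isn't loaded, these classes are undefined.
$connection = new AMQPConnection([
    'host'     => 'rabbitmq', // hypothetical hostname
    'port'     => 5672,
    'login'    => 'guest',
    'password' => 'guest',
]);
$connection->connect();

$channel  = new AMQPChannel($connection);
$exchange = new AMQPExchange($channel);
$exchange->setName('orders');            // hypothetical exchange name
$exchange->setType(AMQP_EX_TYPE_DIRECT);
$exchange->declareExchange();

$exchange->publish('order created', 'orders.created');
```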

What do you do? Well, in a non-devops-ish practice, you’d ask a team (or two, in the case of my project) to do that for you and let you know when it’s done so you can proceed with merging. Of course, the configuration changes also need to be more or less backward compatible, in the sense that they won’t break anything for people who pick up the updated dev environment before your changes are merged in.

Wouldn’t it be great if there was a way to simply make things work as intended, in a way that no change pertaining to the application would depend on work done by a different team (which might block your delivery or have other repercussions)?

## The Solution ##

DevOps. It means that your team should have sufficient knowledge and the ability to do this work on their own (either by having specialised people or by spreading the knowledge).

Personally I prefer the latter, because the vast majority of changes are limited in scope, and it’s still fine to call on more specialised knowledge in those rare cases involving large-scale changes (or changes to the infrastructure itself).

Basically, it boils down to redefining configuration that belongs to the platform and/or application as part of the application rather than part of the infrastructure.

Docker is a pretty great tool for this. In the example above, you would have a Dockerfile defining the runtime of your application with its dependencies (as well as php ini files per environment). When I need to add the amqp extension, I do it in the Dockerfile in the same merge request containing my other changes. It reaches deployment, and the other devs get it as soon as they pull the changes. My requirements and configuration travel together with the rest of my changes. **I can click to deploy safely**.
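As a rough illustration (not my project’s actual setup), a Dockerfile based on the official PHP image could carry both the extension and the environment-specific configuration. The image tag, paths and file names below are assumptions:

```dockerfile
# Sketch only: base image tag, paths and file names are illustrative.
FROM php:8.1-fpm

# System library the amqp extension builds against
RUN apt-get update \
    && apt-get install -y --no-install-recommends librabbitmq-dev \
    && rm -rf /var/lib/apt/lists/*

# The change that previously required another team: build and enable ext-amqp
RUN pecl install amqp && docker-php-ext-enable amqp

# PHP configuration (OpCache settings, memory_limit, ...) lives in the repo
# and ships with the code; swap the file per environment at build time
COPY docker/php/php.prod.ini /usr/local/etc/php/conf.d/zz-app.ini

COPY . /var/www/html
```

The point isn’t this particular layout; it’s that the same merge request that introduces the RabbitMQ code also introduces the runtime it needs.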

Of course, if you don’t use this model (in whole or in part), there are changes needed to get there.

1. As an organisation you must not be afraid of people learning. Learning is scary because when people do something even a bit different, it does affect their “productivity”.

2. You must not “silo” — don’t restrict the knowledge. When I proposed the Docker change, I was told that “this is not who we are, we don’t want to maintain this” — missing the point that knowledge is distributed to empower people to make the changes they need when they need them.

Development is a fast-paced domain and silos slow things down. Being even a bit polyvalent is a great advantage.

* Yes, Docker Desktop is now a paid tool for large companies. But no, the *docker daemon* and *docker cli* are still free, just a bit more difficult to install now (most people are used to installing everything through Docker Desktop).
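For example, on a Linux box you can still get the engine and CLI without Desktop; Docker’s convenience script is one route (check your distro’s docs for the supported path):

```sh
# One known route on Linux: Docker's convenience install script sets up
# the engine (dockerd) and CLI without Docker Desktop.
curl -fsSL https://get.docker.com -o get-docker.sh
sh get-docker.sh
```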
