When the modern approach is nothing but hype

Published on 2019-11-07. Modified on 2019-12-22.

Usually we regard the phrase "the modern approach" as something that denotes improvement over time, something that has changed for the better. However, the phrase is often abused, and we must be careful to separate fact from hype.


Introduction

We could ask why it is that, when we are dealing with programming and technology, some people have very strong opinions about how everyone else should conduct their business rather than just minding their own. But I think the better question to ask is why everyone else is listening to these hypesters.

Hype and trends can be dangerous pitfalls and we need to be careful about who we allow to influence our decision making process. One of the early warning indicators is when someone labels something as "the modern approach" in order to make it sound like the right thing to do.

"The modern approach" phrase has authority and it is influential because normally there is a big difference between the modern way of doing things vs the old and outdated way of doing things. We no longer use horses for transportation, and we no longer use telegrams for telecommunication, so who would want to be backwards and not follow the modern approach?

Conformity is a type of social influence involving a change in belief or behavior in order to fit in with a group. This change is in response to real or imagined social norms and expectations. But we all know that just because something has become widely adopted and everyone is doing it, that doesn't necessarily make it good. Not only that, but perhaps that specific "modern thing" hasn't even gained the wide adoption everyone is talking about; we're just having a very hard time filtering out the hype-noise.

Take a look at this small social conformity experiment video:

Object-oriented for the sake of object-oriented

In the PHP community, for example, we have seen numerous examples of how "modern PHP" has been hyped as synonymous with object-oriented programming in PHP, which as a result has turned many projects into spaghetti code.

Abstraction is powerful. What I'm really allergic to, and what I had a reaction to in the '90s, was all the CORBA, COM, DCOM, object-oriented nonsense. Every startup of the day had some crazy thing that would take 200,000 method calls to start up and print "Hello world". That's a travesty! You don't want to be a programmer associated with that sort of thing.

― Brendan Eich in Coders at Work: Reflections on the Craft of Programming

When I discovered the Go programming language it was love at first sight. A lot of the frustration I had been dealing with in the PHP community did not exist in Go, and the best part was that this was by design.

Robert Griesemer says it beautifully in his talk (on page 18 in his slides):

Complex OO code is the modern analog to the unstructured "spaghetti code" of the 1970s.

Object-oriented programming for the sake of object-oriented programming makes everything much more complex than it needs to be, and the usefulness of the object-oriented programming paradigm is highly exaggerated.

Inheritance hierarchies can become extremely brittle and unstable, and then we also have the huge object-relational structure to contend with. Object-oriented programming can bring at least as many problems to the table as it solves.

I have seen many problems caused by excessive, slavish adherence to object-oriented PHP in production applications. It is not that object-oriented programming is bad, but it is not something that automatically improves development. It is like spicing a meal: use a little spice and it improves the taste; add too much and it utterly ruins it.
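To make the contrast concrete, here is a minimal sketch in Go (my own illustration, not from any real project) of how a small interface and a plain function can replace the kind of factory-and-hierarchy scaffolding that excessive object-orientation tends to produce. The Notifier interface and the EmailNotifier type are invented for the example:

    package main

    import "fmt"

    // Notifier is a small interface: one method, no hierarchy.
    // Anything that can send a message satisfies it implicitly.
    type Notifier interface {
        Notify(message string) error
    }

    // EmailNotifier is a plain struct; it doesn't inherit from anything.
    type EmailNotifier struct {
        Address string
    }

    func (e EmailNotifier) Notify(message string) error {
        fmt.Printf("mail to %s: %s\n", e.Address, message)
        return nil
    }

    // Alert works with any Notifier; no base classes, factories,
    // or abstract layers are required.
    func Alert(n Notifier, message string) error {
        return n.Notify(message)
    }

    func main() {
        Alert(EmailNotifier{Address: "ops@example.com"}, "disk almost full")
    }

Adding, say, an SMS notifier later means writing one more small type that satisfies the interface; nothing that already exists has to change.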

It's all about the cloud

You do know that "the cloud" is nothing but managed servers, right? And that when you migrate your data to "the cloud" you're actually just migrating your data to other people's computers?

But no, everyone should migrate to the cloud, and it's all about the cloud, and container deployment in the cloud, and DevOps, and whatnot.

I have warned clients about migrating all their data to the cloud only to be met with frowns and disdain, as though I were some kind of dinosaur trying to hold the company back from moving forward.

So, the hypesters know best right?

Security strategy begins with the attitude: Never trust, always verify. You cannot do that once you migrate all your data to "the cloud"!

A zero trust architecture ensures that data and access across your network are secure and based on parameters like user identity and location. It inspects and logs all traffic, learns and monitors network patterns, and makes use of authentication methods, all with the goal of seeing every user and device connected to the network at any moment.
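As a rough sketch of what "never trust, always verify" means in code (assuming a Go HTTP service; verifyToken is a placeholder for a real check against an identity provider), every single request is authenticated, no matter which network it arrives from:

    package main

    import (
        "fmt"
        "log"
        "net/http"
    )

    // verifyToken is a stand-in for real verification, e.g. validating
    // a signed token against an identity provider.
    func verifyToken(token string) bool {
        return token == "valid-demo-token" // placeholder check only
    }

    // zeroTrust wraps a handler so that no request is trusted based on
    // where it came from; identity is verified on every request.
    func zeroTrust(next http.Handler) http.Handler {
        return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            if !verifyToken(r.Header.Get("Authorization")) {
                http.Error(w, "unauthorized", http.StatusUnauthorized)
                return
            }
            next.ServeHTTP(w, r)
        })
    }

    func main() {
        files := http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
            fmt.Fprintln(w, "sensitive data")
        })
        http.Handle("/files", zeroTrust(files))
        log.Fatal(http.ListenAndServe(":8080", nil))
    }

The point of the sketch is the policy, not the mechanics: there is no "trusted internal network" exemption anywhere in the code path.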

Insider threats are a major source of breaches and concern because they originate internally, from countless devices and applications that can be anywhere. But that threat isn't mitigated in the cloud; your employees still need to access the data. The only thing that is different is that now you know absolutely nothing about your IT infrastructure.

The very nature of public cloud infrastructure allows data to be accessed from anywhere. Even data that normally would only be accessible from within a company's own infrastructure, such as a private Intranet, has to travel exposed to the open Internet, making it vulnerable to theft or infection by malware. Cloud providers also manage data for many clients in the same cloud environment, increasing the likelihood of the wrong people being granted access to what is supposed to be secure data.

Security is one of the main reasons that some companies are finally leaving the public cloud again. With several high-profile data leaks involving cloud providers, e.g. an August 2018 AWS error that exposed business-critical information on over 31,000 systems belonging to the company GoDaddy, their worries are not without merit.

Successful IT security strategies reduce the complexity of the IT environment to something simple, or at least far simpler than what an unaided eye would face when trying to scan the network for anomalies.

Micro-segmentation is the process of placing security perimeters into small, isolated areas or zones to maintain separate access for different parts of the network. With micro-segmentation, files in a network can be placed in separate, secure zones. A user or program with access to one of those zones won't be able to access any of the other zones without separate authorization. This ties security to individual workloads.

One of the benefits of micro-segmentation is control of application security, with built-in policies that define allowed behavior and protection for each individual build. Visibility into application behavior on the devices that access applications also needs to be taken into account, so that anomalous activity can be detected and action can be taken more quickly.

Least privilege is a principle of information security that grants an end user only as much access as is needed for a particular purpose or role.
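A toy sketch in Go (zone and role names invented for illustration) shows how the two ideas combine: every zone has an explicit allow-list, everything else is denied by default, and access to one zone implies nothing about any other:

    package main

    import (
        "errors"
        "fmt"
    )

    // Each zone lists the only roles allowed to touch it; anything
    // not listed is denied by default (least privilege).
    var zonePolicy = map[string]map[string]bool{
        "payroll-db":   {"payroll-service": true},
        "customer-db":  {"web-frontend": true, "support-tool": true},
        "build-server": {"ci-runner": true},
    }

    // access grants entry to a zone only if the policy explicitly
    // allows the role; a grant for one zone says nothing about any
    // other zone (micro-segmentation).
    func access(role, zone string) error {
        if zonePolicy[zone][role] {
            return nil
        }
        return errors.New("denied: " + role + " has no grant for " + zone)
    }

    func main() {
        fmt.Println(access("payroll-service", "payroll-db"))  // <nil>
        fmt.Println(access("payroll-service", "customer-db")) // denied
    }

The important property is the default: nothing is reachable unless a rule explicitly says so.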

All of this becomes impossible to implement and maintain once you migrate your data to "the cloud".

Furthermore, for many organizations, cloud computing has not only added more complexity, it has also failed to deliver the expected cost savings, and as a result many businesses are leaving AWS. The migration away from the cloud has even been termed The Great Cloud Exodus.

Although cloud technologies will continue to be an important part of IT strategy in the future, many companies are moving away from purely public cloud-based networks. Given the variety of cloud computing solutions available today, there is no reason to settle for a strategy that doesn't meet all of an organization's business needs. Colocation data centers may present the most versatile choice for companies, but larger organizations may also want to consider the benefits of a strictly on-premises enterprise solution.

Microservices - the industry's best-practice model

The hype goes that with microservices an application becomes easier to build and maintain when it is broken down into smaller pieces which work together. Each component is continuously developed and separately maintained, and the application then becomes the sum of its components. This is in contrast to a monolithic application, which is developed all in one piece.

Microservices are sometimes compared to the Unix philosophy, which emphasizes building simple, short, clear, modular, and extensible code that can be easily maintained and repurposed by developers other than its creators. The Unix philosophy favors composability over monolithic design.

Furthermore, it is claimed that because the application is built as a set of modular components, it becomes easier to understand, easier to test, and easier to maintain.

However, this isn't necessarily true! It very much depends on what kind of application you're building.

When people compare web application microservices to the Unix operating system, and state that the cp command only knows how to copy files, that the ls command only knows how to list files and directories, and that if you want to add a feature to the cp command you don't need to worry about the ls command, the comparison is flawed.

The comparison is flawed because even though you can at times pipe Unix commands to each other and make them work together, these commands do not depend upon each other at all; they are small, completely separate units. This is not how a web application works.

A much better comparison for a web application is the Linux kernel, which is a monolithic kernel, because the components in a web application are not really separate. A web application is actually monolithic by nature!

In a web application the components cannot function without each other. The ls command doesn't require the cp command, or vice versa, but a web application requires multiple services in order to function properly, and you cannot simply remove a part and still have a partly working application.

Many things are also much easier in a monolith: integration testing, reasoning about how the components interact, refactoring of interfaces, and so on. Requirements also change all the time. With microservices you have to update a service and hope it's backwards compatible, because if it isn't, all the services that interact with it need to be updated, and all the so-called benefits of microservices are lost.
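As a small illustration of that last point (a sketch with invented names, not a claim about any particular codebase): in a monolith the contract between components is a compile-time construct, so when an interface changes, every caller fails to build until it is updated. Across microservice boundaries the same contract lives in a network API, and a mismatch only shows up at runtime:

    package main

    import "fmt"

    // Inventory is the contract between components. Change it, and
    // every caller in the monolith fails to compile until updated;
    // nothing drifts silently.
    type Inventory interface {
        Reserve(item string, qty int) error
    }

    type memoryInventory struct {
        stock map[string]int
    }

    func (m *memoryInventory) Reserve(item string, qty int) error {
        if m.stock[item] < qty {
            return fmt.Errorf("only %d of %s left", m.stock[item], item)
        }
        m.stock[item] -= qty
        return nil
    }

    // checkout calls the inventory component directly. Were this a
    // separate microservice, the same contract would be an HTTP API,
    // and an incompatible change would only surface in production.
    func checkout(inv Inventory, item string, qty int) error {
        return inv.Reserve(item, qty)
    }

    func main() {
        inv := &memoryInventory{stock: map[string]int{"book": 3}}
        fmt.Println(checkout(inv, "book", 2)) // <nil>
        fmt.Println(checkout(inv, "book", 2)) // only 1 of book left
    }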

People like to talk about how awesome the latest thing they have invented is, and the hype often spins out of control. It is important to consider the consequences of a given technology because with every increase something else must decrease.

Whatever you do, don't fall into the trap of blindly following "the modern way" of doing things - always consider the trade-offs!

[Image: people blindly following each other]