March 5, 2018
5 min read time

What role does the butterfly effect play in tech?

It’s pretty well-known that the “butterfly effect” is the term used to describe the theory that a single occurrence, no matter how minuscule, can change the course of everything that comes after it. If these almost invisible shifts make it virtually impossible to predict the future or control the outcome of events, one has to ask whether it’s possible to make any meaningful predictions at all.

To give it a slightly more scientific spin, the butterfly effect, as the term is used in chaos theory, describes a sensitive dependence on initial conditions, in which a small change can produce drastic differences in a later state. The way most people encounter the principle is through the question of whether a butterfly flapping its wings in New Mexico can cause a hurricane in China: seemingly disconnected, distant events that nevertheless connect to form a cause-and-effect chain.

But how did this theory come to be, and how could it be proven? While running weather-model simulations, the meteorologist and mathematician Edward Lorenz discovered that something as simple as rounding off the numbers in his input data produced vastly different outcomes. In a computer model based on 12 variables, such as temperature and wind speed, even slightly rounded values completely transformed the pattern his program produced. A very small change in initial conditions created significant differences in the outcome, and this discovery, stumbled on almost by accident, changed the course of science.
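To see this sensitivity in miniature, here is a minimal Python sketch. It uses not Lorenz’s original 12-variable weather model but the simpler three-variable system he later published, with the conventional parameters σ = 10, ρ = 28, β = 8/3 (my assumptions for illustration, not details from this post). It integrates the same equations from two starting points that differ by one part in a million and prints how far apart the trajectories drift.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the three-variable Lorenz system by one Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

# Two trajectories whose initial conditions differ by one part in a million.
a = np.array([1.0, 1.0, 1.0])
b = np.array([1.0 + 1e-6, 1.0, 1.0])

for step in range(1, 4001):
    a = lorenz_step(a)
    b = lorenz_step(b)
    if step % 1000 == 0:
        # The tiny initial gap grows roughly exponentially over time.
        print(f"t = {step * 0.01:5.1f}  separation = {np.linalg.norm(a - b):.6f}")
```

Run it and the separation grows roughly exponentially until it saturates at the size of the attractor itself, which is exactly the kind of behavior Lorenz stumbled on when he re-entered rounded numbers.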

But outside of science and mathematical systems, do we see similar outcomes in everyday life? Maybe.

Lorenz’s discovery also had implications outside of the hard sciences. Philosophically, his work leveled the idea that everything is deterministic: “Determinism was equated with predictability before Lorenz. After Lorenz, we came to see that determinism might give you short-term predictability, but in the long run, things could be unpredictable. That’s what we associate with the word ‘chaos.’”

Butterfly effect in the real world

In 1907, Thomas W. Lawson, a controversial stockbroker, wrote the novel Friday the Thirteenth. The widespread superstition the book fed has had far-reaching effects that haunt the U.S. economy to this day. On Friday the 13th, the U.S. economy loses an estimated USD 900 million because the superstition lingers, making people afraid to do just about anything. Historically, the stock market shows average gains of just 0.2% or less on that day. Lawson’s work was meant to be “just a novel,” right? But its ripple effects are still felt in a very real way more than a century later.

How the butterfly effect collides with tech

If the butterfly effect essentially guarantees that we can’t predict anything with any real accuracy, why do we keep trying, particularly in tech, where so many rapid changes are underway at any given time? We have seen an astonishing array of wrong predictions, all of which could have come true… but then a butterfly fluttered by, and suddenly those predictions looked short-sighted at best and completely irrelevant at worst.

Among the most famously short-sighted: in 1943, Thomas Watson, president of IBM, reportedly said, “I think there is a world market for maybe five computers.” I don’t need to expand on this one except to say that maybe Watson was short-sighted, as IBM has had a tendency to be ever since (failing to see the OS- and software-driven future, for example). Or maybe the imperceptible changes of the butterfly effect reshaped the landscape so thoroughly that what was true in 1943, and might have stayed true were it not for that pesky butterfly, simply became obsolete.

Given the multiplying nature of wrong predictions, does it really make sense to forecast the future of tech, or even to make plans at all?

I have my own ideas about this. As a tech evangelist, I’m out there talking about technology as it stands today and staking a claim on a variety of predictions, most of them centered on content delivery and web performance, based on my observations and experience. Even though I think it’s possible to make informed guesses about what’s to come and which trends will leap into the mainstream, I don’t fool myself into thinking prediction can be a science, or even that I can overcome my own skepticism about some trends. A case in point: it may be surprising to hear me say that I don’t believe 100% in the hype surrounding edge computing. Yes, it is probably going to happen (it already is, really), but will it live up to the hype? Will it ever match the size and scope of the cloud revolution? I don’t think so, but I could be wrong.

These are the kinds of questions that interest me and that drive the tech world forward. In keeping with these curiosities, I will talk about these topics in more depth at the upcoming LDNWebPerf meetup in London on 7 March 2018. I hope to see you there, assuming the Beast from the East, the Siberian cold snap that has… “butterfly-effected” its way into Europe, including an ill-prepared London, doesn’t keep us from those plans!