Global tech outage reveals our digital dependency

Imagine a day when everything goes haywire. That was Friday.

It was not quite a global catastrophe, since it was mostly just a lot of devices, gadgets, computers and machines failing to work right. But it was revelatory — and ominous.

In today’s world, a single bad piece of software can wreak havoc on a global scale. And there’s more of this to come, according to experts who study and fret about our increasingly complex technological systems.

“We have, as this shows, lots of infrastructure relying on single points of failure,” said Gary Marcus, a professor emeritus at New York University and author of the forthcoming book “Taming Silicon Valley,” on Friday. “Absolutely nothing guarantees that we won’t have another similar incident either accidentally or maliciously.”

As more information emerged about the cause of the outage, it seemed clear it was nothing more than an accident, one caused by faulty software in an automated update from an Austin-based company called CrowdStrike. The big headline was the vulnerability of major industries, such as aviation and banking. But it was a rough time for anyone with a computer that on Friday morning announced blandly and without further explanation that it was not working.

Consumers of technology expect software to perform, and it usually does. But that invites complacency and digital illiteracy: We don’t remember anyone’s phone number because on a smartphone you just tap the name and the call goes through. We don’t carry cash because everyone takes plastic.

Life in the 21st century is pretty magical — until it’s not.

Marcus fears that society will become even more vulnerable as we rely increasingly on artificial intelligence. On X, he wrote: “The world needs to up its software game massively. We need to invest in improving software reliability and methodology, not rushing out half-baked chatbots. An unregulated AI industry is a recipe for disaster.”

The AI revolution — which did not come up a single time during the June presidential debate between President Biden and former president Donald Trump — is poised to make these systems even more interdependent and opaque, making human society more vulnerable in ways no one can fully predict.

Political leaders have been slow to react to these changes in part because few of them understand the technology. Even technologists can’t fully understand the complexities of our globally networked systems.

“It’s becoming clear that the nerve center of the world’s IT systems is a giant black box of interconnected software fully intelligible to no one,” Edward Tenner, a scholar of technology and author of the book “Why Things Bite Back,” said in an email Friday. “You could even say that it’s a black box full of undocumented booby traps.”

What happened Friday brought to mind a threat that never fully materialized: Y2K. Twenty-five years ago, as we approached the turn of the century, some computer experts feared that a software bug would cause airplanes to fall out of the sky — along with all sorts of other calamities — the moment 1999 turned into 2000. Governments and private industry spent billions of dollars trying to patch up the computer problems in advance, and the big moment arrived with minimal disruption.

But the question of how vulnerable — or resilient — the global information networks of 2024 are cannot be easily answered. The systems are too numerous, too interconnected, for anyone to have full battlefield awareness.

Friday’s tech outage served as a fleeting reminder of the fragility of that invisible world, especially for those trying to catch planes, book surgeries or power up personal computers that had gone into a mysterious failure mode. Trending online all day was “Blue Screen of Death,” the nickname for the error message that appears when Microsoft Windows ceases operating safely. The Blue Screen of Death, people discovered, has in recent times taken on a gentler, less alarming shade of blue, as if someone had consulted a color theorist.

It did not go unnoticed that CrowdStrike, a company that provides software to ward off cyberattacks, was responsible for the outage. Tenner pointed out that in the history of disasters, technologies meant to improve safety have often introduced new risks.

“Lifeboats and their deck reinforcements installed after the Titanic destabilized a Lake Michigan excursion ship, the SS Eastland, in 1915. Over 840 people died in Chicago Harbor when it capsized during loading,” Tenner said.

And then there’s the safety pin: It was swallowed, open, by so many children that a surgeon developed a special tool to extract it, Tenner said.

Brian Klaas, author of “Fluke: Chance, Chaos, and Why Everything We Do Matters,” wrote on X after the outage that “we’ve engineered social systems that are extremely prone to catastrophic risk because we have optimised to the limit, with no slack, in hyper-connected systems. A tiny failure is now an enormous one.”

Technological disasters can also be triggered by natural causes. Prominent on the minds of many national security experts is the risk of a powerful solar storm knocking out the electrical grid, or damaging satellites crucial to communication, navigation, weather prediction and military surveillance.

Such satellites also could be targeted by a hostile adversary. U.S. officials have expressed concern about the possibility that Russia could be developing the capability to deploy a nuclear weapon in space that would pose a threat to our satellites — and potentially create an exponential increase in space debris with catastrophic consequences.

Friday’s outage emerged without any geopolitical machinations, or anything as dramatic as a thermonuclear explosion. It was just the result of some bad code, a bug — a glitch in the system.

Margaret O’Mara, a historian at the University of Washington and author of “The Code: Silicon Valley and the Remaking of America,” pointed out that the interconnected technologies of today still have human beings in the mix.

“The digital economy is, at the end of the day, human,” she said, “made up of code and machinery designed, directed, and occasionally drastically disrupted by human decisions and imperfections.”

