We were warned that our embrace of computer technology would lead to disaster.
By incorporating computers into ever more areas of our lives, we were told, we had created a scenario in which a mundane glitch could bring everything crashing down. Air travel would be snarled, bank accounts would become inaccessible, essential services would be seriously disrupted, and people would stare in horror as the computers they relied on for so much simply stopped working.
These warnings applied not to last week’s CrowdStrike IT outages, but to the year 2000 computer problem (Y2K), when experts warned that a disaster would unfold when 1999 became 2000 unless precautions were taken.
The events that played out on July 19, 2024, when a faulty software update from the cybersecurity firm CrowdStrike caused widespread outages for users of Microsoft Windows, seemed to be a replay of the failures anticipated in the year 2000. Even as CrowdStrike rushed to fix the problem, air travel was grounded, people struggled to access their bank accounts, essential services genuinely were disrupted, and many users felt themselves turning red as they stared at their computers’ “blue screen of death.”
There are key differences between today’s CrowdStrike outage and the scenario warned about in Y2K. But there are significant parallels too. The most important may be what the CrowdStrike outage reveals about what we have failed to learn from Y2K itself: the computer systems on which we depend are fragile and error-prone, and these systems are so interwoven in our daily lives that, when disaster strikes, it can hit us everywhere at once.
The origins of the Y2K problem are now nearly ancient history, at least in computing terms. In the 1950s and '60s, computer memory was expensive, and computer professionals were under pressure to save money. One solution they hit on was to truncate dates, lopping off the century digits so that 1939 would be coded as 39. To be clear, this worked. It saved memory, it saved money, and it did not affect the calculations computers performed with dates. Simply put: 1999 minus 1939 equals 60, and 99 minus 39 also equals 60. However, 2000 minus 1939 equals 61, while 00 minus 39 equals -39. And when computers encountered these incorrect results, some systems and programs would start churning out garbage data, while others would fail entirely.
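To make the arithmetic concrete, here is a minimal sketch of the two-digit shortcut, written in Python for readability rather than the COBOL or assembly of the era, and purely illustrative rather than a rendering of any specific legacy system:

```python
# Illustrative sketch: how two-digit year storage breaks arithmetic
# across the century boundary. (Not how any real legacy system was
# written; the shortcut itself is what matters.)

def age_two_digit(birth_yy: int, current_yy: int) -> int:
    """Compute an age using only the last two digits of each year,
    as memory-constrained programs of the 1950s and '60s often did."""
    return current_yy - birth_yy

# Within the 20th century, the shortcut is indistinguishable from
# correct arithmetic:
print(age_two_digit(39, 99))  # 60 -- matches 1999 - 1939

# At the rollover, it silently produces garbage:
print(age_two_digit(39, 0))   # -39 -- should be 61 (2000 - 1939)
```

Within the century the savings were free; only at the rollover did the shortcut turn a valid age into a negative number, the kind of impossible value that could corrupt downstream data or crash a program outright.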
It turned out that straightforward programming decisions taken in the moment could have long-lasting, potentially calamitous implications—especially when so many people and systems came to rely on those underlying programs.
In the 1990s, computers were still in the process of becoming a common feature in people’s homes, but they had already taken on important functions for businesses and governments and had become closely intertwined with other essential infrastructure. Congressional hearings with titles like “Y2K: Will the Lights Go Out?” “Y2K and Nuclear Power: Will the Reactors React Responsibly?” “Year 2000 and Oil Imports: Can Y2K Bring Back the Gas Lines?” and “McDonald’s Corporation: Is World’s largest ‘small business’ Y2K ready?” all attested to how closely bound up with vulnerable computer systems daily life had become. There was not yet a computer in every home (or in every pocket), but computers were already integral to keeping the lights on in those homes.
Congress held its first Y2K hearing in 1996, under the foreboding title “Is January 1, 2000, the Date for Computer Disaster?” There, Kevin Schick, then research director of the technology research and consulting company Gartner Group, stated: “We use computers for everything—we have written programs to handle everything.” In speaking of “everything,” Schick emphasized to the committee that he was not just talking about the dangers to industry and individual companies should their systems fail. Rather, he was drawing attention to the fact that so much of the nation’s (and the world’s) critical infrastructure was now bound up with computer systems. It was a point Senator Robert F. Bennett (R-Utah) drove home at a Y2K hearing on “Utilities and the National Power Grid,” where he cited a survey he had conducted on the Y2K preparedness of the ten largest electric, oil, and gas utility firms. The results led him to state ominously, “I cannot be optimistic...I am genuinely concerned about the prospects of power shortages as a consequence of the millennial date change.”
In its initial report, the Senate Special Committee on the Year 2000 Problem echoed Schick, describing Y2K as “the first wide-spread challenge of the information age,” one that was delivering “a crash course in the fragile mechanics of information technology.”
The Special Committee highlighted how advances in computer technology had been hugely beneficial, but with those benefits had come new dangers. And though the Special Committee was not encouraging anyone to stock up on canned goods and head for the hinterlands, or arguing that computer networks needed to be dismantled, it emphasized that Y2K was “an opportunity to educate ourselves firsthand about the nature of 21st century threats” and an occasion “to consider carefully our reliance on information technology and the consequences of interconnectivity, and work to protect that which we have so long taken for granted.”
As a result, Y2K helped drive broader awareness of our shared reliance on computers for everything from banking to keeping the lights on.
But while the scope of the problem seemed enormous, the predicted disaster did not come to pass.
The reason, however, wasn’t that we lucked out, or that the problems were overblown. Instead, those in and around the IT community, with the coordination and support of the federal government, took the problem seriously and marshaled the attention and resources necessary to fix Y2K before their fears came true.
Considering that Y2K did not, in the end, snarl air travel, block access to banks, or disrupt emergency services, it is easy to look back at it and laugh. And yet the CrowdStrike outage is a reminder that if we do not take seriously “our reliance on information technology,” ultimately the joke winds up being on us.
When we look back at Y2K today, from the midst of our myriad computer-exacerbated problems, we should remember the work that went into fixing Y2K. But we should also remember that part of what Y2K revealed was that, as Kevin Schick put it at Congress’s first Y2K hearing, “we use computers for everything—we have written programs to handle everything.” And those computers, which we use for everything, are fragile and error-prone. Furthermore, as we consider the costly headaches from the CrowdStrike outage, it is worth remembering something else Schick stated at that hearing: “it is very expensive to fix something once it is broken versus making sure that you have resolved the issue prior to.”
Speaking in the early weeks of the year 2000, Representative Constance Morella (R-Md.) asked, “Will Y2K inspire a conscious effort for greater long-term planning and more reliable and secure technology, or will it just prolong the shortsighted thinking that made Y2K so costly?” Unfortunately, the derision with which it is often treated means Y2K did not really inspire such a conscious effort. We too often fail to apply the lessons when things go right, or move on too quickly once other events start to dominate the headlines. The CrowdStrike outage makes clear that the “more reliable and secure technology” Morella hoped for remains out of reach, and that the “shortsighted thinking” she warned about remains a major problem nearly 25 years later. Thus, revisiting Morella’s provocation today leaves us not with an answer but with another question: if Y2K didn’t, will the CrowdStrike outage inspire that effort?
It's too early to answer that question. But nearly 25 years after Y2K, we can’t say we weren’t warned.
Zachary Loeb is an assistant professor in the history department at Purdue University. He works on the history of technology, the history of disasters, and the history of technological disasters. He is currently working on a book about the year 2000 computer problem (Y2K).