The computer revolution: how it's changed our world over 60 years
The BlueGene/L supercomputer can perform 280.6 trillion operations per second. Image: REUTERS/Kimberly White
It is a truism that computing continues to change our world. It shapes how objects are designed, what information we receive, how and where we work, and who we meet and do business with. And computing changes our understanding of the world around us and the universe beyond.
For example, while computers were initially used in weather forecasting as no more than an efficient way to assemble observations and do calculations, today our understanding of weather is almost entirely mediated by computational models.
Another example is biology. Where once research was done entirely in the lab (or in the wild) and then captured in a model, it often now begins in a predictive model, which then determines what might be explored in the real world.
The transformation that is due to computation is often described as digital disruption. But an aspect of this transformation that can easily be overlooked is that computing has been disrupting itself.
Evolution and revolution
Each wave of new computational technology has tended to lead to new kinds of systems, new ways of creating tools, new forms of data, and so on, which have often overturned their predecessors. What has seemed to be evolution is, in some ways, a series of revolutions.
But the development of computing technologies is more than a chain of innovation – a process that’s been a hallmark of the physical technologies that shape our world.
For example, there is a chain of inspiration from waterwheel, to steam engine, to internal combustion engine. Underlying this is a process of enablement. The industry of steam engine construction yielded the skills, materials and tools used in construction of the first internal combustion engines.
In computing, something richer is happening where new technologies emerge, not only by replacing predecessors, but also by enveloping them. Computing is creating platforms on which it reinvents itself, reaching up to the next platform.
Getting connected
Arguably, the most dramatic of these innovations is the web. During the 1970s and 1980s, there were independent advances in the availability of cheap, fast computing, of affordable disk storage and of networking.
Compute and storage were taken up in personal computers, which at that stage were standalone, used almost entirely for gaming and word processing. At the same time, networking technologies became pervasive in university computer science departments, where they enabled, for the first time, the collaborative development of software.
This was the emergence of a culture of open-source development, in which widely spread communities not only used common operating systems, programming languages and tools, but collaboratively contributed to them.
As networks spread, tools developed in one place could be rapidly promoted, shared and deployed elsewhere. This dramatically changed the notion of software ownership, of how software was designed and created, and of who controlled the environments we use.
The networks themselves became more uniform and interlinked, creating the global internet, a digital traffic infrastructure. Increases in computing power meant there was spare capacity for providing services remotely.
The falling cost of disk storage meant that system administrators could set aside space to host repositories that could be accessed globally. The internet was thus used not just for email and chat forums (known then as newsgroups) but, increasingly, as an exchange mechanism for data and code.
This was in strong contrast to the systems used in business at that time, which were customised, isolated, and rigid.
With hindsight, the confluence of networking, compute and storage at the start of the 1990s, coupled with the open-source culture of sharing, seems almost miraculous: an environment ready for something remarkable, but without even a hint of what that thing might be.
The ‘superhighway’
It was to enhance this environment that Al Gore, then a US senator and soon to become vice president, proposed in 1992 the “information superhighway”, before any major commercial or social uses of the internet had appeared.
Meanwhile, in 1990, researchers at CERN, including Tim Berners-Lee, created a system for storing documents and publishing them to the internet, which they called the world wide web.
As knowledge of this system spread on the internet (transmitted by the new model of open-source software systems), people began using it via increasingly sophisticated browsers. They also began to write documents specifically for online publication – that is, web pages.
As web pages became interactive and resources moved online, the web became a platform that has transformed society. But it also transformed computing.
With the emergence of the web, the standalone computer, dependent on local storage, declined in importance.
We all connect
The value of these web-based systems is due to another confluence: the arrival on the web of vast numbers of users. For example, without behaviours to learn from, search engines would not work well, so human actions have become part of the system.
There are (contentious) narratives of ever-improving technology, but also an entirely unarguable narrative of computing itself being transformed by becoming so deeply embedded in our daily lives.
This is, in many ways, the essence of big data. Computing is being fed by human data streams: traffic data, airline trips, banking transactions, social media and so on.
The challenges of the discipline have been dramatically changed by this data, and also by the fact that the products of the data (such as traffic control and targeted marketing) have immediate impacts on people.
Software that runs robustly on a single computer is very different from software that interacts rapidly and continuously with the human world. This has given rise to the need for new kinds of technologies and experts, in ways not even remotely anticipated by the researchers who created the technologies that led to this transformation.
Decisions that were once made by hand-coded algorithms are now made entirely by learning from data. Whole fields of study may become obsolete.
The discipline does indeed disrupt itself. And as the next wave of technology arrives (immersive environments? digital implants? aware homes?), it will happen again.