Why the internet is outgrowing Moore's Law
It’s been 50 years since Gordon Moore came up with what’s now known as “Moore’s Law”, and in that time we’ve seen some of the greatest technological achievements in history. But we’ve hit an inflection point. Demand for communications bandwidth is growing faster every year, and we’re entering a stage where Moore’s Law and ever faster, cheaper computing power alone will simply not be enough.
Moore’s Law is based on Gordon Moore’s 1965 prediction that the number of components on an integrated circuit could be expected to double roughly every two years without increasing in cost. That idea has set the trajectory of silicon integrated-circuit technology, computer processing speed and computing power for decades. For many of us in technology, Moore’s Law is a prediction that has held true for longer than we have been alive.
It explains why, today, we can hold something in our hands with more computing power than the computers that guided Apollo 11’s astronauts to the moon, and why our desktop machines now contain more processing power than what we used to call a “supercomputer”. It was Moore’s Law that predicted digital signal processing would eventually catch up with what was needed for high-capacity coherent optical communication, allowing 100G to become the unit of currency of present-day networks.
The problem now lies in our continued reliance on primarily hardware-based improvements to deliver additional bandwidth. Hardware-based solutions generally improve at the pace Moore predicted, yet demand is growing much faster: Gartner has predicted that global mobile data traffic will reach 52 million terabytes in 2015, an increase of nearly 60 percent over 2014.
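A quick back-of-the-envelope calculation shows how fast that gap compounds. The sketch below (in Python, using the roughly 41 percent annual rate implied by a two-year doubling and the roughly 59 percent traffic growth figure above) is illustrative only:

```python
# Back-of-the-envelope comparison: hardware capability doubling every two
# years (about 41% per year) versus traffic growing roughly 59% per year.
hardware_growth = 2 ** 0.5 - 1   # annualized rate of a two-year doubling, ~0.41
traffic_growth = 0.59            # Gartner's 2014-to-2015 mobile data traffic growth

hardware = traffic = 1.0
for year in range(1, 6):
    hardware *= 1 + hardware_growth
    traffic *= 1 + traffic_growth
    print(f"year {year}: hardware x{hardware:.1f}, traffic x{traffic:.1f}")

# After five years traffic has grown roughly 10x while hardware has grown
# roughly 5.7x, and the gap keeps widening every year.
```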
The value of a network scales with the square of the number of connected users, according to Metcalfe’s Law (named for networking pioneer Bob Metcalfe), and we are all readying ourselves for the Internet of Things (IoT), where almost everything that can be controlled or observed will be connected to a network. We have some early sense of what future bandwidth demands may be, but what awaits us in the IoT era will likely blow many of our projections out of the water.
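To see why that matters, here is a minimal sketch assuming the common formulation of Metcalfe’s Law, where a network’s value tracks the number of possible pairwise connections:

```python
# Metcalfe's Law sketch: value is proportional to the square of the number of
# connected users, equivalently to the n*(n-1)/2 possible pairwise links.
def possible_connections(users: int) -> int:
    return users * (users - 1) // 2

for users in (10, 100, 1_000, 10_000):
    print(f"{users:>6} users -> {possible_connections(users):>12,} possible connections")

# Doubling the number of connected users roughly quadruples the possible
# connections, so IoT-scale connectivity drives demand far faster than any
# linear improvement in hardware can supply.
```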
So if our hardware capabilities can only double every two years, we need to find new tools that enable us to keep up with growth in demand.
The path forward requires that the bandwidth our networks provide becomes smarter. The networks themselves need to become programmable platforms. The infrastructure needs to be as real-time, flexible and dynamic as our smartphones have become. The answer to the problem of increased demand on the network is to flip that phrase around and evolve to what can be called network-on-demand. Network topology, connectivity, service class and quality of service all need to be on-demand services that can be customized to suit the needs of the end user.
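Purely as an illustration (the field names below are hypothetical, not any vendor’s interface), a network-on-demand request might look less like a circuit order and more like a short, declarative specification:

```python
# Hypothetical network-on-demand request: topology, connectivity, service
# class and quality of service are parameters the user asks for on demand,
# not fixed properties of installed equipment. Field names are illustrative.
service_request = {
    "topology": "point-to-point",                  # could be "mesh", "ring", ...
    "endpoints": ["datacentre-a", "datacentre-b"],
    "bandwidth_gbps": 100,
    "service_class": "premium",
    "qos": {"max_latency_ms": 10, "availability_pct": 99.999},
    "duration_hours": 4,                           # capacity only while needed
}
print(service_request)
```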
The networking paradigm must change from one where we simply provision a connection and then sit back and monitor it, to one where we orchestrate and control the network and its associated applications and services in real time.
A greater use of software is the key, and we can take a lesson from the computer industry. As the limits of processing speed were reached and individual processors saturated, we began doing more things in parallel, building machines that deploy arrays of chips working together on the same problem. Software, in turn, has evolved to run across these multi-core architectures to achieve better performance. The same idea, leveraging software alongside hardware, now needs to be applied to our network architectures.
This is an exciting time to be in networking. With the advent of software-defined networking (SDN) and network functions virtualization (NFV), the network is no longer just a series of siloed pieces of hardware; the software is now coming together to enable programmable solutions that let us leap ahead of the pace of Moore’s Law.
Today’s increasingly software-enabled solutions, when properly combined with orchestration, become readily scalable, easily upgradable and much less expensive than the hardware-limited methods we currently rely on. We can now see a path to virtualizing many of our network functions, replacing dedicated hardware-centric appliances with software equivalents that can be deployed at the click of a mouse, and we can program that software to scale networks up or down to meet the demands of the end user.
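As a minimal sketch of that last point (the orchestrator endpoint and payload here are hypothetical, not a real product’s API), scaling a virtualized network function could reduce to a single programmatic call:

```python
import json
from urllib import request

# Hypothetical orchestrator endpoint; illustrative only, not a real API.
ORCHESTRATOR_URL = "https://orchestrator.example.net/api/v1/vnf/firewall-01/scale"

def scale_vnf(instances: int) -> None:
    """Ask the (hypothetical) orchestrator to run `instances` copies of a VNF."""
    payload = json.dumps({"desired_instances": instances}).encode("utf-8")
    req = request.Request(
        ORCHESTRATOR_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:  # needs a real orchestrator to respond
        print(resp.status, resp.read().decode())

# Scale up ahead of peak demand, then back down once it passes.
scale_vnf(instances=8)
scale_vnf(instances=2)
```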
Hardware can only realistically move at the pace Moore’s Law predicted, but our networks must now move even faster. The growth in connectivity and the need for bandwidth won’t allow us to wait for Moore’s Law to deliver. Bandwidth must become smart, and SDN, NFV and network orchestration are the keys.
Author: Steve Alexander is CTO of Ciena Corporation