Emerging Technologies

7 innovations that could shape the future of computing

A semiconductor chip designer works on a computer component at the Indian unit of Texas Instruments Inc in Bangalore. Texas Instruments was the first computer software company to set up operations in the city, in 1985. Bangalore today is home to dozens of global and domestic technology companies employing an estimated 75,000 software professionals and is popularly referred to as Asia's 'Silicon Valley'. Image: REUTERS/Savita Kirloskar

Daniel Wellers

Moore’s Law posits that the number of transistors on a microprocessor — and therefore their computing power — will double every two years. It’s held true since Gordon Moore came up with it in 1965, but its imminent end has been predicted for years. As long ago as 2000, the MIT Technology Review raised a warning about the limits of how small and fast silicon technology can get.
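Moore's observation is just an exponential doubling, which a few lines of code can make concrete. This is an illustrative sketch only; the 1971 Intel 4004 and its 2,300 transistors are used here as a convenient historical baseline, and real chip counts scatter widely around the trend.

```python
# Illustrative only: Moore's Law as a simple doubling formula.
# Baseline: the Intel 4004 (1971), roughly 2,300 transistors.

def projected_transistors(year, base_year=1971, base_count=2_300):
    """Project a transistor count assuming a doubling every two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

# 25 doublings between 1971 and 2021 lands in the tens of billions,
# which is roughly where today's largest chips actually sit.
print(f"{projected_transistors(2021):,.0f}")
```

The striking part is not the formula but that the industry kept matching it for five decades.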

The thing is, Moore’s Law isn't really a law. It’s more of a self-fulfilling prophecy. Moore didn’t describe an immutable truth, like gravity or the conservation of momentum. He simply set our expectations, and lo, the chip makers delivered accordingly.

In fact, the industry keeps finding new ways to pack more power onto tinier chips. Unfortunately, it hasn't found ways to cut costs on the same exponential curve. As Fast Company reported in February 2016, the worldwide semiconductor industry is no longer planning its silicon R&D around the notion of doubling chip power every two years, because it simply cannot afford the increasingly complex manufacturing tools and processes that pace requires. Besides, current manufacturing technology may not be able to shrink silicon transistors much further. And in any event, transistors have become so tiny that quantum effects can make their behavior unpredictable, which raises questions about how much longer we'll dare to use them in medical devices or nuclear plants.

So does that mean the era of exponential tech-driven change is about to come to a screeching halt?

Not at all.

Even if silicon chips are approaching their physical and economic limits, there are other ways to continue the exponential growth of computing performance, from new materials for chips to new ways to define computing itself. We’re already seeing technological advances that have nothing to do with transistor speed, like more clever software driven by deep learning and the ability to achieve greater computing power by leveraging cloud resources. And that’s only the tiniest hint of what’s coming next.

Here are a few of the emerging technologies that promise to keep computing performance rocketing ahead:

In-memory computing. Throughout computing history, the slowest part of processing has been getting the data from the hard disks where it’s stored to random access memory (RAM), where it can be used. A lot of processor power is wasted simply waiting for data to arrive. By contrast, in-memory computing puts massive amounts of data into RAM where it can be processed immediately. Combined with new database, analytics, and systems designs, it can dramatically improve both performance and overall costs.
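The idea can be sketched with Python's built-in SQLite, which supports both an on-disk database and an identical one held entirely in RAM. This is only a toy illustration of the storage-tier difference, not a model of any production in-memory platform, and the measured times will vary with OS caching.

```python
import os
import sqlite3
import tempfile
import time

def time_query(conn, n=20_000):
    """Insert n rows, then time a full-table aggregate query."""
    conn.execute("CREATE TABLE t (id INTEGER, val REAL)")
    conn.executemany("INSERT INTO t VALUES (?, ?)",
                     ((i, i * 0.5) for i in range(n)))
    conn.commit()
    start = time.perf_counter()
    total = conn.execute("SELECT SUM(val) FROM t").fetchone()[0]
    return total, time.perf_counter() - start

# On-disk database: reads may have to reach storage.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
disk_total, disk_time = time_query(sqlite3.connect(path))

# In-memory database: the same data lives entirely in RAM.
mem_total, mem_time = time_query(sqlite3.connect(":memory:"))

assert disk_total == mem_total  # identical results, different storage tiers
print(f"disk: {disk_time:.4f}s  memory: {mem_time:.4f}s")
```

The results are identical; only where the data lives, and therefore how long the processor waits for it, changes.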

Graphene-based microchips. Graphene — one atom thick and more conductive than any other known material (see The Super Materials Revolution) — can be rolled up into tiny tubes or combined with other materials to move electrons faster, in less space, than even the smallest silicon transistor. That could extend Moore's Law for microprocessors by a few more years.

Quantum computing. Even the most sophisticated conventional computer can only assign a one or a zero to each bit. Quantum computing, by contrast, uses quantum bits, or qubits, which can exist as a zero, a one, or a superposition of both at the same time. (See this explainer video from The Verge for a surprisingly understandable overview.) Theoretically, a quantum computer will be able to solve highly complex problems, like analyzing genetic data or testing aircraft systems, millions of times faster than is currently possible. Google researchers announced in 2015 that they had developed a new way for qubits to detect and protect against errors, but that's as close as we've come so far.
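A single qubit is easy to simulate classically, which makes superposition less mysterious than it sounds. The sketch below is the standard textbook abstraction, not a model of any real quantum hardware: a qubit's state is a pair of complex amplitudes, and the Hadamard gate turns a definite zero into an equal superposition.

```python
import math

# Toy single-qubit simulator: the state is two complex amplitudes
# (alpha, beta) with |alpha|^2 + |beta|^2 = 1. A classical bit is
# restricted to (1, 0) or (0, 1); a qubit can be any such pair.

def hadamard(state):
    """Apply the Hadamard gate, which puts |0> into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Probabilities of measuring 0 or 1 in the given state."""
    a, b = state
    return abs(a) ** 2, abs(b) ** 2

zero = (1 + 0j, 0 + 0j)        # the definite |0> state
superposed = hadamard(zero)     # now "both at once"
p0, p1 = probabilities(superposed)
print(f"P(0)={p0:.2f}  P(1)={p1:.2f}")  # each outcome is equally likely
```

The power of real quantum computers comes from entangling many such qubits, whose joint state grows exponentially and quickly becomes impossible to simulate this way.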

Molecular electronics. Researchers at Sweden's Lund University have used nanotechnology to build a "biocomputer" that can perform parallel calculations by moving multiple protein filaments simultaneously along nanoscopic artificial pathways. This biocomputer is faster than conventional electrical computers that operate sequentially, approximately 99 percent more energy-efficient, and cheaper than both conventional and quantum computers to produce and use. It's also more likely to be commercialized soon than quantum computing is.

DNA data storage. Convert data to base 4 and you can encode it on synthetic DNA. Why would we want to do that? Simple: a little bit of DNA stores a whole lot of information. In fact, a group of Swiss researchers speculate that about a teaspoon of DNA could hold all the data humans have generated to date, from the first cave drawings to yesterday's Facebook status updates. It currently takes a lot of time and money, but gene editing may be the future of big data: Futurism recently reported that Microsoft is investigating the use of synthetic DNA for secure long-term data storage and has been able to encode and recover 100 percent of its initial test data.
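The "base 4" step is simple enough to show directly: each of DNA's four bases (A, C, G, T) can carry two bits. This is only a simplified sketch of the encoding idea; real DNA storage schemes, including Microsoft's, add error correction and avoid troublesome sequences such as long runs of the same base, none of which is modeled here.

```python
# Map each pair of bits to one of DNA's four bases, and back.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA strand, 4 bases per byte."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand produced by encode()."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = encode(b"hello")
assert decode(strand) == b"hello"  # lossless round trip
print(strand)  # 20 bases for 5 bytes
```

At 4 bases per byte, the density argument is really about physics: those bases occupy a few cubic nanometers rather than a transistor's worth of silicon.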

Neuromorphic computing. The goal of neuromorphic technology is to create a computer that's like the human brain — able to process and learn from data as quickly as the data is generated. So far, we've developed chips that train and execute neural networks for deep learning, and that's a step in the right direction. General Vision's neuromorphic chip, for example, consists of 1,024 neurons — each one a 256-byte memory based on SRAM combined with 3,000 logic gates — all interconnected and working in parallel.
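The unit these chips multiply by the thousand is the artificial neuron. The sketch below is the bare textbook abstraction — a weighted sum pushed through a threshold — and is not a model of General Vision's actual hardware design; it just shows what a single neuron computes.

```python
# Minimal software sketch of one artificial neuron: a weighted sum
# of inputs passed through a threshold. Neuromorphic chips implement
# many such units in parallel hardware.

def neuron(inputs, weights, bias):
    """Fire (return 1) when the weighted input sum plus bias is positive."""
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > 0 else 0

# With these hand-picked weights the neuron behaves like an AND gate:
# it fires only when both inputs are 1.
and_weights, and_bias = [1.0, 1.0], -1.5
for pair in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(pair, "->", neuron(pair, and_weights, and_bias))
```

Learning, in this picture, is just adjusting the weights; the neuromorphic bet is that doing this in parallel silicon, rather than in software on a conventional processor, is dramatically faster and more power-efficient.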

Passive Wi-Fi. A team of computer scientists and electrical engineers at the University of Washington has developed a way to generate Wi-Fi transmissions that use 10,000 times less power than the current battery-draining standard. While this isn't technically an increase in computing power, it is an exponential increase in connectivity, which will enable other types of advances. Dubbed one of the 10 breakthrough technologies of 2016 by MIT Technology Review, Passive Wi-Fi will not only save battery life, but enable a minimal-power Internet of Things, allow previously power-hungry devices to connect via Wi-Fi for the first time, and potentially create entirely new modes of communication.

So while we may be approaching the limits of what silicon chips can do, technology itself is still accelerating. It’s unlikely to stop being the driving force in modern life. If anything, its influence will only increase as new computing technologies push robotics, artificial intelligence, virtual reality, nanotechnology, and other world-shaking advances past today’s accepted limits.

In short, exponential growth in computing may not be able to go on forever, but its end is still much farther in the future than we might think.

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.

© 2024 World Economic Forum