Emerging Technologies

Why the future of data science looks spectacular

Glenn Wightwick
Deputy Vice-Chancellor, UTS

It wasn’t that long ago that we lived in an entirely analogue world. From telephones and televisions to books and binders, the technology of everyday life was analogue, and digital technology was largely relegated to the laboratory.

But during the 1960s, computing began to make its way into the back offices of larger organisations, performing functions like accounting, payroll and stock management. Yet the vast majority of systems at that time (such as the healthcare system, electricity grids or transport networks), and the technology we interacted with, were still analogue.

Roll forward a generation, and today our world is highly digital. Ones and zeroes pervade our lives. Computing has invaded almost every aspect of human endeavour, from health care and manufacturing, to telecommunications, sport, entertainment and the media.

Take smartphones, which have been around for less than a decade, and consider how many separate analogue things they have replaced: a street directory, cassette player, notebook, address book, newspaper, camera, video camera, postcards, compass, diary, dictaphone, pager, phone and even a spirit level!

Underpinning this, of course, has been the explosion of the internet. In addition to the use of the internet by humans, we are seeing an even more pervasive use for connecting all manner of devices, machines and systems together – the so-called Internet of Things (or the “Industrial Internet” or “Internet-of-Everything”).

Complex systems

We now live in an era where most systems have been instrumented and produce very large volumes of digital data. The analysis of this data can provide insights into these systems in ways that were never possible in an analogue world.

Data science is bringing together statistics, machine learning, analytics and visualisation to give this new kind of analysis a rigorous foundation, in much the same way that computer science emerged in the 1950s to underpin computing.

In the past, we have successfully developed complex mathematical models to explain and predict physical phenomena. For example, we can accurately predict the strength of a bridge, or the interaction of chemical molecules.

Then there’s the weather, which is notoriously difficult to forecast. Yet, based on numerical weather prediction models and large volumes of observational data along with powerful computers, we have improved forecast accuracy to the point where a five-day forecast today is as reliable as a two-day forecast was 20 years ago.

But there are many problems where the underlying models are not easy to define. There isn’t a set of mathematical equations that characterise the health care system or patterns of cybercrime.

What we do have, though, is increasing volumes of data collected from myriad sources. The challenge is that this data is often in many forms, from many sources, at different scales and contains errors and uncertainty.

So rather than trying to develop deterministic models, as we did for bridges or chemical interactions, we can develop data-driven models. These models integrate data from all the various sources and can take into account the errors and uncertainty in the data. We can test these models against specific hypotheses and refine them.
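To make the idea concrete, here is a minimal sketch of a data-driven model that folds measurement uncertainty into the fit: a weighted straight-line regression in which noisier observations contribute less. The data and uncertainties are invented for illustration; the article does not describe any specific model.

```python
def weighted_linear_fit(xs, ys, sigmas):
    """Fit y = a*x + b, weighting each point by 1/sigma^2,
    so uncertain observations pull the model less."""
    ws = [1.0 / s ** 2 for s in sigmas]
    S = sum(ws)
    Sx = sum(w * x for w, x in zip(ws, xs))
    Sy = sum(w * y for w, y in zip(ws, ys))
    Sxx = sum(w * x * x for w, x in zip(ws, xs))
    Sxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    d = S * Sxx - Sx ** 2
    a = (S * Sxy - Sx * Sy) / d
    b = (Sxx * Sy - Sx * Sxy) / d
    return a, b

# Three precise points lying on y = 2x + 1, plus one very
# uncertain outlier (hypothetical data).
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 20.0]
sigmas = [0.1, 0.1, 0.1, 10.0]  # the last point is barely trusted

a, b = weighted_linear_fit(xs, ys, sigmas)
print(round(a, 2), round(b, 2))  # close to the true slope 2 and intercept 1
```

Because the model knows how uncertain each observation is, the wild fourth point barely moves the fit; a naive unweighted fit would be dragged well away from the underlying trend.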

It is also critical that we can examine these models, and the data that underpins them, directly.

360 degree data

At my university, we have built a Data Arena to enable the exploration and visualisation of data. The facility leverages open-source software, high-performance computing and techniques from movie visual effects to map streams of data into a fully immersive 3D stereo video system that projects 24 million pixels onto a four-metre-high, ten-metre-diameter cylindrical screen.

Standing in the middle of this facility and interacting with data in real-time is a powerful experience. Already we have built pipelines to ingest data from high-resolution optical microscopes and helped our researchers gain insight into how bacteria travel across surfaces.

We read 22 million points of data collected by a CSIRO Zebedee handheld laser scanner which had scanned the Wombeyan Caves, and ten minutes later we were flying through the cave in 3D and exploring underground.

No matter what sort of data we have been exploring, we have inevitably discovered something interesting.

In a couple of cases, it has been immediately obvious we have errors in the data. In an astronomical dataset, we discovered we had a massive number of duplicate data points. In other situations, we have observed patterns that hadn’t been evident to domain experts who had been analysing the data.
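The duplicate-point problem above is exactly the kind of data-quality check that is easy to automate once spotted. Below is a minimal sketch of such a check on a small point catalogue: it rounds coordinates and reports any position that occurs more than once. The coordinates and schema are hypothetical; the article does not describe the astronomical dataset's format.

```python
def find_duplicates(points, precision=6):
    """Return the set of (x, y) positions that appear more than once,
    after rounding coordinates to the given decimal precision."""
    seen, dupes = set(), set()
    for x, y in points:
        key = (round(x, precision), round(y, precision))
        if key in seen:
            dupes.add(key)
        seen.add(key)
    return dupes

# Tiny hypothetical catalogue with one repeated position.
catalogue = [(10.684, 41.269), (83.822, -5.391), (10.684, 41.269)]
print(find_duplicates(catalogue))  # the repeated position
```

In practice a visual check, like the one in the Data Arena, often reveals the anomaly first; a script like this then quantifies and removes it across the full dataset.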

This phenomenon is the classic “unknown unknown” (made famous in 2002 by US Secretary of Defense Donald Rumsfeld) and highlights the power of the human visual system to spot patterns or anomalies.

Today’s world is drenched in data. It is opening up new possibilities and new avenues of research and understanding. But we need tools that can manage such staggering volumes of data if we’re to put it all to good use. Our eyes are one such tool, but even they need help from spaces such as the Data Arena.

 

This article is published in collaboration with The Conversation. Publication does not imply endorsement of views by the World Economic Forum.


Author: Glenn Wightwick is the Deputy Vice-Chancellor of Research at the University of Technology, Sydney. 

 Image: A robotic tape library used for mass storage of digital data is pictured at the Konrad-Zuse Centre for applied mathematics and computer science (ZIB), in Berlin August 13, 2013.  REUTERS/Thomas Peter.
