6 ways to future-proof universities
A technology entrepreneurship class at Stanford University, California. Image: REUTERS/Stephen Lam
The members of the Global University Leaders Forum community convened at the World Economic Forum Annual Meeting 2019 to discuss their role in our ever-changing world. Here are six topics that were top of the agenda as the members considered the future of the university and its role in society.
1. Introduce data science 101
Today data is omnipresent and often overwhelming. By way of example, Domo’s Data Never Sleeps 6.0 reported that in 2018 Google conducted an average of 3.8 million searches per minute.
Though not all graduates will enter data-related fields, universities are working to raise data literacy across their student bodies, adding data science courses and challenges for social science majors so that graduates can communicate effectively with their data-oriented peers and co-workers.
University College London requires successful applicants to its Management Science BSc to have a strong mathematical background to grapple with this new mass of data. Bocconi University requires its first-year MA students to learn the Python programming language as “[i]t is useful to know, at least in general, the logic of computer programming.”
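The “logic of computer programming” that Bocconi points to can be illustrated with a minimal sketch. The snippet below is a hypothetical first-week exercise, not drawn from any actual syllabus: it takes Domo’s reported per-minute search figure and scales it to an hourly and daily estimate in Python, the language Bocconi’s students learn.

```python
# Scale Domo's reported per-minute figure (Data Never Sleeps 6.0)
# up to hourly and daily estimates.
SEARCHES_PER_MINUTE = 3_800_000  # 2018 average, per the report


def searches_per(period_minutes: int) -> int:
    """Estimate total searches over a period given in minutes."""
    return SEARCHES_PER_MINUTE * period_minutes


per_hour = searches_per(60)
per_day = searches_per(60 * 24)

print(f"~{per_hour:,} searches per hour")  # ~228,000,000
print(f"~{per_day:,} searches per day")    # ~5,472,000,000
```

Even an exercise this small touches constants, functions, type hints and formatted output, which is the sort of general programming literacy the universities above are aiming for.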
2. Embed ethics
New technologies are being designed to deliver benefits for humanity, but the Fourth Industrial Revolution raises myriad questions about the values embedded in these new technologies. STEM students could benefit from engaging with liberal arts disciplines to help them grapple with these larger questions.
Many institutions are experimenting with ways to embed ethics into regularly programmed courses, as there is no singular interpretation of ethics. Harvard University is leveraging philosophy graduate students as teaching staff and assistants for several computer science courses, in addition to encouraging jointly developed courses across these disciplines and others. Princeton University is asking its engineering students to consider their role in preventing climate change.
The World Economic Forum and Carnegie Mellon University also launched an initiative to understand how ethics are taught in artificial intelligence (AI) courses.
3. Make data open and interoperable
Open science is a key issue for research today, and one of its most pressing facets is open data.
Many innovations today hinge on the aggregation of large data sets. Researchers from the Universities of St. Gallen and Liechtenstein, for example, have underscored how data analysis can improve affordability and personalization of products and services.
There is an urgent need to create the ability to federate or share data sets without relinquishing control of the underlying data. As such data sets are often held by universities and non-profits, there is room for universities to play a role in designing how data is shared widely, and in an interoperable manner.
4. Consult social scientists in tech research
The Financial Times named “techlash” as one of the key words encapsulating 2018. Issues surrounding the ethics and logic governing new technologies tend to surface in spectacular ways only once problems arise, rather than being examined proactively.
The social sciences have a role to play in navigating inventions and mitigating the “techlash”. Time-honed methodologies can unearth the governance-related questions, trade-offs and benefits presented by new technologies, and show how options are likely to be received by consumers and the public. A study conducted by researchers at the University of Campinas, Brazil showed how a lack of public education about developments in biotechnology and genetically modified food, 22 years after the first genetically modified food reached market, was still hindering acceptance of such techniques.
Today there is ample opportunity for social scientists to be consulted in the early stages of development and to take an active role in the debates around gene editing and autonomous vehicles, for example.
5. Embrace the usefulness of useless knowledge
In 1939, American educator Abraham Flexner pleaded in the pages of Harper’s Magazine for knowledge production to be decoupled from considerations of use:
“Institutions of learning should be devoted to the cultivation of curiosity and the less they are deflected by considerations of immediacy of application, the more likely they are to contribute not only to human welfare but to the equally important satisfaction of intellectual interest which may indeed be said to have become the ruling passion of intellectual life in modern times.”
Flexner was not unbiased in his plea – a decade prior he founded the Institute for Advanced Study for independent inquiry, with a similar ethos – but he certainly had bona fides to speak on the issue, having authored the report responsible for provoking standardization of medical education in the United States.
Some of the biggest mysteries of our time are being tackled through basic research. A notable example is the work of the European Organization for Nuclear Research (CERN), whose work has confirmed the Standard Model of particle physics.
However, in times of tightening belts, research with no immediate application is susceptible to the labels “ridiculous” or “dumb”. Yet such “useless knowledge” can become relevant overnight as circumstances shift. A comprehensive debrief on the 2013-2016 Ebola outbreak in west Africa attributed the propagation and failure to control the disease in part to cultural and behavioural factors that had previously been surfaced in published demographic, anthropological and sociological research.
Universities need their partners in innovation ecosystems – in industry and in government – to help champion the production of knowledge for knowledge’s sake.
6. Find the right role for reskilling
Reskilling is a looming challenge for society. The World Economic Forum’s 2018 Future of Jobs Report found that “[b]y 2022, no less than 54% of all employees will require significant re- and upskilling.”
Universities are aware they have a role to play. To this end, the National University of Singapore launched a lifelong learning programme last year. It allows alumni to join continuing education and training courses for up to 20 years after they are admitted, ensuring they have the right skills for our rapidly evolving global economy.
Companies are also leveraging university expertise to retrain their workforces. AT&T launched a $1 billion initiative called “Future Ready” to retrain the nearly half of its employees who no longer had the right skills. Carried out in partnership with traditional universities such as Georgia Tech, the University of Notre Dame and the University of Oklahoma, and online course providers Coursera and Udacity, Future Ready allows AT&T workers to pursue new qualifications, culminating in either MA or MSc degrees or badges that attest to attainment of specific competences.
License and Republishing
World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.
The views expressed in this article are those of the author alone and not the World Economic Forum.