Training scientists in the arts could make our world safer
Scientists should be taught arts and humanities in order to better understand ethics and moral values in their work. Image: REUTERS/Jason Lee
In 1959, the British physicist and novelist C.P. Snow delivered a famously controversial lecture at Cambridge University. He described a post-war schism between two groups — scientists and the literary world.
Snow identified this as a newly emergent divide, across which each party was more than happy to sneer at the other: Scientists proudly unable to quote a phrase of Shakespeare, and literary types untroubled by the second law of thermodynamics.
Those divisions within the university now seem more deeply entrenched than ever. And those working within the arts and the sciences face a third antagonist in society: Populism, with its attendant and increasing distrust of intellectuals.
This powder keg sits in a context of growing economic disparity and, incongruously, the increasing role of technological innovation in our daily lives.
I’m a computer scientist who studies digital culture. I try my best to bridge the divides, but constantly ask the question: How can universities train our scientists, technologists and engineers to engage with society, as Snow suggested, rather than perform as cogs in the engine of economic development?
I believe we need our educational system to engage students with issues of ethics and responsibility in science and technology. We should treat required arts and humanities courses not as some vague attempt to “broaden minds” but rather as a necessary discussion of morals, values, ethics and responsibility.
Identifying society’s grand challenges
I was recently part of a conversation at the Fields Institute at the University of Toronto which asked another question: “What does the A stand for in STEAM?”
STEAM inserts arts into the acronym for STEM (science, technology, engineering and math). I chose to frame the arts more widely to include the humanities, and asked the attendees: How do we identify the challenges we wish to work on?
In the 1950s, Allen Ginsberg bemoaned what he saw as the dominant culture oppressing artistic creativity. Written today, his poem Howl might cry something like this:
“I saw the best minds of my generation spend their lives optimizing microseconds out of their high-frequency trading algorithms, or devising routing algorithms for drone-delivered burritos.”
Are these the biggest problems for our society?
Graduates of science and engineering programs understandably chase positions in start-ups or high-salaried finance jobs. Their knowledge of algorithmic development, data analysis or simply structured scientific thinking may net them fantastic jobs at a variety of private-sector employers.
But the problems they engage with, while impacting a large number of citizens, may not improve the lot of those citizens.
Our graduates might spend entire careers developing advertising software that reaches millions, without ever engaging the larger questions of our lifetime on behalf of those millions.
Technology raises moral questions
There are major issues of debate in our scientific and technological community, with serious questions about bias, power and control.
Let’s take a few examples from the headlines of this year:
- We are increasingly seeing the influence of software algorithms in applications with life-changing impacts, such as criminal sentencing or employment. These software systems remain unmonitored “black boxes” that may be influenced by racial bias, special interests or simply bad science. We cannot tell, because the systems are shielded from scrutiny by intellectual property law or, in the case of deep learning, are evolved systems too complex for even their own developers to decode.
- In biotechnology we see conversations about the use of CRISPR for germ-line editing — a domain in which genetic edits affect not just a patient but carry through to future generations. Others are debating gene drives, a way to short-circuit nature’s checks and balances, allowing gene edits to spread to an entire population far more quickly than natural evolution could manage.
- We’re seeing the science community and major nations seek to address the challenge of climate change through geoengineering, making massive-scale edits to our planet’s most fundamental systems.
These are not technological issues. They contain technological issues, but they are not fundamentally technological; they are ethical. They require sophisticated experts to debate questions of ethics and society, and to decide what we should create and whether we should create it at all.
It’s as if we’ve encountered several simultaneous Manhattan Projects, driven by military DARPA funding, venture-capital investment and advances in cloud computing. We are seeing a whole host of life-changing technologies come to fruition after decades of basic research, and the rapid prototyping tools and production pipelines of the modern era have let us scale these new inventions faster than ever before.
And, as with the questions about ethics and the atomic bomb that led to the formation of the Federation of Atomic Scientists, we suddenly face important moral questions into which the creators themselves have unique insight.
We need to make sure STEM graduates working in these fields are able to engage with the toughest questions of our time: What, where and how should our new inventions be deployed?
Grounding experiments in empathy
I would like to see university curricula in STEM subjects expanded to discuss whether we should develop certain technologies at all, with ethical concerns running as a common thread throughout our studies. The risks to society of doing anything less seem too great.
I don’t argue that all policy-makers should be scientists, but rather that scientists should include the world of policy and social impact in their remit. They should be able to credibly think about and discuss those impacts with the rest of the world.
Snow described the scientific mind as “impatient to see if something can be done”, an attitude that echoes the “bias to action” so prevalent in start-up culture.
Action can be important, and even governments, not known for agile movement, are starting to embrace learning-through-doing. Finland, for example, has a department of experimentation which aims to bring design-thinking experimentation into policy work.
But even design thinking, the darling methodology-of-the-moment, grounds experiments in empathy. The developers of solutions should themselves be engaged with those affected by their work, co-creating through a direct engagement with users, with customers, with clients, with citizens.
Teaching ethics through the arts
And how else do our universities teach empathy, ethics and citizenship than through our arts and humanities fields?
There may be specific questions of citizenship and responsibility that we feel any and all STEM graduates should engage with (just as basic numeracy, statistics and scientific literacy may be required of any non-STEM-trained citizen of the digital age).
I make no special claim to know the precise content of these classes, or to prescribe the curriculum of our degree programs. We must develop them together. One tangible starting point is the crowd-sourced list of computer science ethics courses compiled by Casey Fiesler at the University of Colorado, Boulder.
Crises in medical research, such as the Tuskegee Syphilis Study, helped jump-start the fields of medical ethics and bioethics, as well as concepts such as informed consent. Medical professionals now engage with complex questions of inclusion, representation, voice and agency.
These aren’t elements of dosage or measurement, but rather touch upon more abstract ideas of rights, values, and meaning — core elements in our study of the humanities. It’s time for the rest of the STEM field to engage with the same issues.