From trade unions to NGOs: we all need a say in how tech is governed
Legal and social systems are struggling to keep up with technological advances. Image: George Kroeker/Unsplash
The Fourth Industrial Revolution — a global transformation characterized by the convergence of digital, physical, and biological technologies — is just beginning.
However, as the Fourth Industrial Revolution matures, it will have an unparalleled, disruptive impact on society, upending how people communicate, organizations create value and humanity understands itself. Building on the foundations of the digital revolution, emerging technologies in the Fourth Industrial Revolution scale up exponentially through digital interoperability, emerge physically in smart products and services, and embed themselves prolifically in society.
The Fourth Industrial Revolution has the potential, like the revolutions that went before it, to consolidate power asymmetries, increase inequalities, and advance technologies that fail to embody human-centred values.
Legal and social protection systems are in the uncomfortable position of trying to keep up as new technologies are introduced at an increasingly rapid pace. While emerging technologies like artificial intelligence and drones can add value to human rights work, basic service delivery and other functions of civil society, these same technologies are already being used against civic freedoms and human rights, often by governments and private-sector actors seeking to limit the impact of civil society organizations' activities and the voices of their constituents.
Advocacy groups, trade unions, child-rights organizations and other institutions increasingly face targeted digital threats and cyber attacks, threatening the security of vulnerable populations working with civil society organizations.
Government deployment of surveillance software, internet shutdowns and "blogging taxes" target specific populations within a country, often with the deliberate aim of disrupting civic expression and freedoms. Artificial intelligence, advanced robotics, drones and other technologies have the potential to be used for similarly repressive purposes to assert control and limit human rights.
The direct and indirect effects of technology also push civil society organizations (advocacy groups, humanitarian organizations, development organizations, trade unions and others) into difficult, often simultaneous positions in relation to emerging technologies:
- advocating for rights and freedoms while threats from new technologies are still nascent, poorly understood or not yet covered by legal protections;
- deploying new technologies to increase efficiency with limited resources and space to operate;
- preserving themselves amid technology-enabled government attacks on civic space; and
- engaging only in a limited way when the capacities needed for change (talent, technology, funding, etc.) prove too difficult to procure.
New technologies, new challenges
A key function of a thriving civil society sector in democracies is its ability to promote accountability, fairness, trust and transparency in society, particularly in response to gaps and failures in both the public and private sector.
While governments and companies in democratic societies have a responsibility to take account of and mitigate the potential harms of their activities, civil society and citizens aim to answer a range of questions about the decisions governments and companies make: can we be sure the decisions are accurate? How do we understand their impact? Can we trust that they were fair and unbiased?
While the state's legitimacy rests on its accountability to its citizens, the private sector is not bound to consumers in the same way. Increasingly, however, across most industries, market dynamics, corporate interests and public-image concerns have compelled companies to act in ways that satisfy customers and benefit society as a whole (through corporate social responsibility). Although companies have largely followed these trends, clear gaps remain where citizen interests and oversight conflict with shareholder interests.
The use of digital and emerging technologies by companies, governments and civil society groups has ushered in new challenges, and entrenched existing ones, associated with accountability, fairness, trust and transparency:
- Accountability: Who should be accountable for, and who should redress, the ill effects of digital and emerging technologies? And who is responsible for shaping their influence and impact? Examples include algorithmic accountability in stopping the spread of fake news; individual and group privacy violations resulting from data-sharing projects and maps; the high volatility of crypto-philanthropy; and bias and misrepresentation in virtual reality (VR) storytelling.
- Fairness: How can society ensure that individuals and groups are treated in the same way? Challenging areas include: the lack of inclusive or participative models; auditing disparate impact in algorithmic systems; and selection bias.
- Trust: How does the use of digital and emerging technologies promote, rather than decrease, trust within society? Challenging areas include: the lack of utility of digital data notice and consent mechanisms; limited dialogue and communication; and a power imbalance.
- Transparency: How does the use of digital and emerging technologies promote transparency? Challenging areas include: the lack of interpretable frameworks; harmful transparency from open data (causing group privacy violations); and information manipulation and opacity (e.g. in algorithmically-curated systems recommending alternative or extremist versions of available information).
Turning the tide, together
Concerns about data protection, digital misinformation and ethical use of technologies highlight the need for greater participation and partnership in governing how these powerful, emerging technologies shape societies.
Civil society organizations and the broader ecosystem of social innovation stakeholders (businesspeople, academics, philanthropists, social entrepreneurs and others) need to help change this trajectory.
Academia has long hosted ongoing conversations about responsibility and ethics in data, algorithms and technology use. In the last decade, several coalitions and groups (e.g. on responsible data, responsible/ethical AI, responsible drones, etc.) have emerged, involving practitioners, data scientists, computer scientists and some members of civil society. In addition, several sets of principles and recommendations have emerged from multi-stakeholder gatherings, emphasizing interpretability, oversight controls and identification of shared-responsibility issues related to digital innovation, AI and other emerging technologies.
Civil society organizations have mainly been working within their own organizations and networks to draw insights on innovation and technology adoption.
However, there is a need to move faster, together: sharing insights and resources for foresight, strategy development, and shared responsibility among civil society and other stakeholders in social innovation. The experiences and challenges shared across all types of nonprofit and civil society organizations warrant a broader platform for discussing strategies to avoid hype, protect against digital harms, and ensure a different, fairer kind of industrial revolution.
Examples of shared experiences and challenges include, but are not limited to:
- Alignment challenges: How does our use of data and technology align with the needs of our organization, our beneficiaries and our users? How do we identify and define this?
- Capacity challenges: Which of our current organizational capacities reinforce or catalyse responsible data and technology use? What's missing, and what internal guidance is needed? Are there trade-offs in our current approach (e.g. talent procurement and project scope)?
- Responsibility challenges: How do our projects and partnerships reflect our organization's responsibilities to protect against digital and technology harms? What is our organization's role in addressing shared-responsibility challenges in shaping how data and technology impact society?
- Ecosystem challenges: How are we engaging other stakeholders in the broader ecosystem of digital and emerging technologies?
The Forum has recently launched its project on Preparing Civil Society for the Fourth Industrial Revolution. In collaboration with the Forum Civil Society community, this three-year initiative aims to accelerate knowledge-sharing and pool resources to facilitate innovation within civil society organizations and to support the sector's meaningful inclusion in the governance of digital and emerging technologies.