If you’re a ‘digital native’ - someone who can’t remember a world before the internet - you might feel you have a good idea of the role technology will play in your life and perhaps in that of future generations.
But journalism professor Jeff Jarvis, author of a history of another transformative technology from more than five centuries ago - the printing press - says we can have no way yet of knowing where the internet, and AI, will take us.
The book is called The Gutenberg Parenthesis. Jeff spoke to us at the World Economic Forum's AI Governance Summit.
Podcast transcript
This transcript has been generated using speech recognition software and may contain errors. Please check its accuracy against the audio.
Jeff Jarvis, author, The Gutenberg Parenthesis: I think it's too soon to believe that we know what the Internet is. It's too soon to think we can control it, but it's too late already to begin studying it.
Robin Pomeroy, host, Radio Davos: If you’re a ‘digital native’ - someone who can’t remember a world before the internet - you might feel you have a good idea of the role technology will play in your life and perhaps in that of future generations.
But this author of a history of another transformative technology from more than five centuries ago - the printing press - says we can have no way yet of knowing where the internet, and AI, will take us.
Jeff Jarvis: I'm fascinated with the idea that if you held this forum about print and its disruption in 1480, what you would be looking at is books that still look as if they were handwritten, with no page numbers, no titles, no title pages, no paragraph indentations. You'd have no sense of what was going to be done with it.
Robin Pomeroy: Journalism professor Jeff Jarvis has just published a book called The Gutenberg Parenthesis, which looks at the lessons we can learn from the age of print - as we leave it for a much more complex technology that we have yet to fully grasp.
Jeff Jarvis: AI is a machine that can now finally understand us when we speak to it. It's now taking on a human nature. And so what we need is ethics and anthropology and sociology and psychology and history, brought into this discussion. And one of the things the Forum is so good at is convening groups and multi-stakeholders, as it's called.
Robin Pomeroy: Subscribe to Radio Davos wherever you get your podcasts, or visit wef.ch/podcasts.
I’m Robin Pomeroy at the World Economic Forum, and with this look at the printing press, the internet and AI…
Jeff Jarvis: The human frailties and failures that we bring to these technologies. That's what we're guarding against. It's not that the technology is dangerous. It's that we can be dangerous with it.
Robin Pomeroy: This is Radio Davos
Imagine a leap in technology that transforms the way we communicate - a transformation so fundamental that it has the power to change the structure of society, of global trade, even of how we see ourselves as humans.
The rise of the internet and the dawn of artificial intelligence might be one - or two - technological leaps that we consider unprecedented. But there are precedents in history of powerful technologies taking humanity in unpredictable directions.
One of those happened half a millennium ago when Johannes Gutenberg invented the printing press. In a new book called The Gutenberg Parenthesis, Jeff Jarvis, journalism professor and media watcher, details what happened next, and draws parallels between the transformation unleashed in the mid-1400s and that of the early 21st century.
I caught up with Jeff at the World Economic Forum’s recent AI Governance Summit in San Francisco, where leaders of business, policymakers, academics and civil society figures met to try to foresee our AI future and work out how humanity might prepare for it.
We’ll be looking more closely at the issues raised at the AI Governance Summit in subsequent episodes, but on this one I wanted to step back, and I asked Jeff Jarvis what we can learn from the history of the age of print - now that we are leaving it.
Robin Pomeroy: I'm delighted to be joined by Jeff Jarvis, who's joining us to talk about the future of humanity. Jeff, is that right?
Jeff Jarvis: And the past.
Robin Pomeroy: Well exactly the past. Your book is called The Gutenberg Parenthesis. What could that possibly mean?
Jeff Jarvis: I think that we have lessons to learn from our entry into the age of print as we leave it.
I'm not saying that history repeats itself or even that it's a carbon copy or sings in harmony. Nonetheless, we went through a tremendous transition in society when print arrived. Important to say that print was first used in Korea and China, but I look at the broad spread that occurred with Gutenberg and his Bible in about 1454.
Huge societal change occurred, and I think that we have lessons from that about technological change, about societal change, about adaptation, about the dangers that exist. And so I wanted to look back at that and really learn about the culture and history of print so that we could try to judge also who we were in it as we now make decisions about what to hold on to or not as we leave it.
Robin Pomeroy: One of the quotes in the book about where we are now: 'We do not yet know what the net is until we, our descendants actually, learn what can be done with it.'
I think for the net you could also say AI.
Jeff Jarvis: Absolutely, it's a continuum, I think.
Robin Pomeroy: And you look back at the history, 1450, am I right? They didn't really know what they'd done for maybe a couple of generations, would you say?
Jeff Jarvis: Absolutely. I think that Gutenberg himself was a creature not of his own age. He was a creature of the prior age. And all he was trying to do was to automate, improve the work of the scribes.
And the timeline here, if I can do it very quickly, fascinates me because you have: 1450-54, Gutenberg is printing the Bible on movable type. The next 50 years are known as the incunabula, or infant age of print, when it just mimics what the scribes did, pretty much. Around 1500 you start to see the characteristics of the book we know today: page numbers, titles, title pages, paragraph indentations, and so on.
The business, by the way, was in shambles then. Too much money had flowed in - new technologies, sound familiar? - and the market was sated with the product of the ancients. Who rescued it but Martin Luther, in 1517? And we went through a Thirty Years' War and other problems along the way.
So come another century on, a few years this side and that of 1600, you see for the first time a huge rush of innovation with print. That is to say, the invention of the essay by Montaigne, the modern novel by Cervantes, the newspaper, and a market for the printed plays of Shakespeare. That really fascinated me because I think it took that long before print became so commonplace.
Then what was interesting was that the technology wasn't seen as a technology anymore. It was simply a methodology of manufacture. Now what's interesting is what you could do with that, the people you could talk to, the publics you could create.
Keep going. Another century on, 1710, we see the first business model for print, really, with copyright, which was not done to protect creators, by the way. It was done to create a marketplace for creativity as a tradeable asset.
Next century on, 1800, the first time in all this time that we see any innovation with the technology itself: steam-powered presses, paper made from wood pulp, stereotyping, typesetting machines, all happened in the 19th century, leading to the creation of the mass. A business model that we see today, invented in 1893, with the creation of the dime - ten cent - magazine, where you lose money on every copy sold. You make it up on advertising. Thus the creation of the attention marketplace and the mass market that we have today.
1920s: radio, first major competitor. Print hated it and tried to keep it out of news. 1950s: television.
And here we are today. We are only about a third of a century on from the introduction of the commercial browser to the public, which is to say it's about the year 1480 in Gutenberg years.
Robin Pomeroy: Let's go way back in history then, and see what parallels we can draw.
There was a lot of opposition to print, probably first by the scribes, perhaps - they were the people who were reproducing books, usually Bibles, at that time. But lots of other opposition kind of grew in different areas over the years you're talking about, probably throughout that period. Can you give us some idea of what the first opposition was? Because people are terrified of AI, and people have been terrified of the internet and social media and probably still are. Who was terrified of the printing press?
Jeff Jarvis: Some weren't as terrified as they should have been. The Catholic Church kind of didn't realise what was going to happen come Luther, and then got terrified and then tried to control it. By then it was too late.
The scribes were of different schools. Some said, 'Thank God I can put down my pen'. Some were scolded that it was an act of devotion that should continue. And when you think about it, scribal culture continued until the typewriter. It didn't go away fully. It just changed the marketplace and changed the business considerably.
It is said that the first call for censorship of the press came in 1470, when Niccolo Perotti, a translator, was much offended by a shoddy translation of Pliny, and he wrote to the Pope and said, 'Your Holiness, something must be done.' Sounds very familiar - something must be done. 'You must appoint a censor to check all of this stuff before it's printed, someone who is erudite and smart.'
Well, as I thought about this, I thought that's not really a call for censorship. What he was seeking, what he was anticipating was the creation of the institutions of editing and publishing that were necessary to find mechanisms of quality, authority, artistry in print, and it would eventually happen.
So I think that it's important to look not just at the Whack-A-Mole efforts to get rid of bad stuff, the Index of Forbidden Books from the Vatican, book burnings and so on, but also to look at what we do as a society to look for the good stuff.
And I think we're in the phase right now in the Internet where we say, oh my God, there's bad stuff out there, something must be done, government must step in. Moral entrepreneurs are screaming about this. It's ruining society - as if we weren't ruined already.
And I made this mistake as much as anyone. After the elections of 2016, with Trump and Brexit, I raised a lot of money to do projects on disinformation out of my school. And, fine, they were good projects. But I think it was a mistake to concentrate only on the bad stuff. What we need now is institutions that will support the good stuff, find the quality, discover it, support it, recommend it. And I think we haven't created those new institutions yet.
Robin Pomeroy: I've also done work in media literacy and you give it a gentle kicking in the book. And you suggest people of my age, older men, according to research, are the ones more likely to spread untrue news. And actually, a younger generation is doing it less.
Jeff Jarvis: Exactly. Which is to say that the kids are all right. Grandpa is screwing up the world.
And I think that what we see happening, with any new effort to allow more people to speak than before - the voices have always been there. They couldn't be heard in mainstream mass media run by people who look like me - old white men. And those voices can now be heard; they can now demand their spot at the table where norms are set and power is divvied up. The people who owned the table before are going to object.
And so I want to be careful again not to draw parallels to history too much. But I've asked myself as I was writing the book: is our Martin Luther born yet? Is our reformation started?
As an American, I look at Black Lives Matter potentially as a definition of a reformation that was able to start because of this new technology that allows communities who were previously not heard to be heard. And if that's the case, then probably the January 6th attack on the Capitol of the U.S. was the counter-reformation. It is the old power structure, the old white power structure, resenting this intrusion upon their power that the Internet enables.
And so what we have is this negotiation of power and norms when there's disruption. That's what's so much of a parallel between print and now, I think.
Robin Pomeroy: The question you raise - and it is raised all the time in discussions about free speech and free speech absolutism - because everyone would say they like free speech; you know, the dictator in a country would probably say, well, free speech, but up to this point. But the Internet and social media have allowed, and AI will amplify this much more, certain voices, kind of rabble-rousing - and this would have been the case back in, you know, the 1400s, 1500s - to make their voice much louder than it probably would have been under the old media, the mainstream media, which is a phrase that's always trotted out. But then the question is: who's to say which voices should be heard? You know, we want experts to be heard, but who decides who's telling the truth? Who decides who's being manipulative and working for the bad? And I don't think we have any answers to that at the moment, do we?
Jeff Jarvis: No, we don't. And I'm writing another book about the Internet and media's moral panic over it. And as I was researching that, I wondered how people dealt with authority before print.
Important to say that when print came along, it was not trusted at all because the provenance was not clear. Anybody can print a pamphlet, just like anybody can make a tweet or a Facebook post or a blog. And so what was trusted more was the social relations you have with people.
I know that you, Robin, are the innkeeper and you tell the truth and you talk to people coming in all the time on the road. So you have what was known as good fama - Latin for 'it is said'. And so you would maintain your reputation, your fama - and obviously, it's the root of words like fame and infamy - and try to make sure that you were trustworthy. It was also ascribed to the information you gave, to the subjects of that information, and to those like me who spread it. And so it was a social structure, based not on the medium but on the human relations.
And I think, my theory here, is that at least for a time, we have to return to that, because we don't have - publishing and editing, which were designed for authority a half millennium ago, are not up to the task of dealing with the scale of speech today.
And until we perhaps invent new institutions to do this, we rely again on these social structures that I know someone on Twitter and they tend to tell the truth and they correct themselves when they're wrong. So I trust them more. Or when the pandemic started, I began a COVID Twitter list of 600 experts. Twitter came to me to help verify some of them. I trusted them and what they had to say because I knew their credentials and where they came from.
But even the idea of expertise, I think, changes these days where lived experience is also expertise. Who do we trust to tell us about police brutality, the police or the objects of it? How do we reconsider our notions of expertise now?
And I think we lived as a society through a very short period - about half a century - of mass media, and of this idea that somehow the media would tell us a truth.
In America, our most famous news anchor - newsreader - was Walter Cronkite. And he would end every broadcast saying, 'And that's the way it is'. But for many, many Americans, it wasn't the way it was. They were not represented there. They were not served there. Now they have a voice.
And I think what is going to be very hard for us to get used to is that we don't have a simple binary world with left and right and two big camps and everybody is in one of them. We have to return to the nuance that comes from actual conversation.
And what impressed me most about the history of print was how - obviously society was conversational before print - but even the early days of print were conversational. Luther and the Pope were in conversation with their books and their burnings of them.
What killed that, I think, was the mechanisation and industrialisation of print in the 19th century where we got to scale. Before that moment, the average circulation of a daily newspaper in the U.S. was only 4,000. It was a good Substack newsletter. Then it went to hundreds of thousands and millions. Now, I wonder whether we have the opportunity to return to a human scale. But that means it's a lot more complicated for each one of us to take responsibility to decide who we trust and why.
Robin Pomeroy: In the work I've done on disinformation, in our workshops with teenagers, the only conclusion that we ever came to in each of those conversations was: who's saying it, and can you verify who's saying it? And that's exactly what you're talking about.
So that's hugely important. But I'm going to challenge you on this idea that things aren't binary anymore, left and right. And in fact - and I know you talk about it in the book and you kind of dismiss this idea of the echo chamber effect of social media - the more we tell the algorithms our political persuasions, or the types of voices we want to listen to, the more we get of those. And therefore, if you're a liberal, you're listening to liberal stuff. If you're very conservative, or on some kind of extreme, or a conspiracy theorist, most of the stuff you're getting - the information and the views you're getting from the Internet - is reaffirming your bias.
That's a dangerous thing. I believe you don't quite believe it in the same way that maybe I do and probably a lot of other people do.
Jeff Jarvis: Oh, yes. I'm in a minority here. I think the implication these days is that the Internet made us hate. The Internet put us in filter bubbles. The Internet did this to us.
There are a few issues I have with that. One is the third-person effect: the idea that I'm immune from advertising and propaganda and pornography and everything, but everybody else falls for it. Right? And then we somehow believe in the hypodermic theory of media influence - that exposure to these ideas will convert people. I think that's fallacious, because I think what we do with the Internet is we bring our past and our history to it.
Then the next question is, does it put us in filter bubbles or echo chambers, which is accepted wisdom? There's a lot of research since that says no.
One of the books I quote is Are Filter Bubbles Real? by Axel Bruns, a German researcher now in Australia. And his answer, in his comprehensive literature review, is no - that Google does not just give me different answers from you, and people do not select their friendships based just on their political views, and so on and so on.
Also convincing to me is Michael Bang Petersen, a political psychologist in Denmark, who found in research he's done that the filter bubbles we live in are the filter bubbles we create in our real lives.
Bill Bishop wrote a famous book called The Big Sort. In America in the 70s, as we got white flight out of the cities, Americans moved into places to be around people who they perceived to be like them, and they got the same neighbourhoods and the same jobs and the same houses of worship and the same bowling clubs. And so they built a filter bubble of familiarity around them.
What the Internet does, Petersen argues - which is counterintuitive - is that it doesn't create filter bubbles. It pops them. It exposes people to those they think they dislike or mistrust or hate.
And by the way, the Internet does provide a generous stock of spitballs to then fire in this case. The Internet has a role. But I think it's potentially simplistic to think that we can be changed that much.
Now, the other thing about us in the fields that we work in is we think that facts will solve things. As journalists, we sell information, so information must be the solution. But a lot of the problem now, I believe, is that it's not about information; it's about belief systems and identity and projecting an identity. And journalism is not built for that. Media are not built for that. How do we reimagine them, I think, around getting people out of that belief system of hatred? And I don't know what the answer is.
Robin Pomeroy: It's a wonderful book because it's got all this history which people might not be familiar with. But then you also challenge some of that received wisdom.
Another one is - and this hurts me as a journalist - this notion of the story and the narrative, because, you know, that's what we do. We make sense of the great chaos out there and turn it into a - true, if you're a journalist - story. And why do you say the age of the story kind of will die with the printed word as well?
Jeff Jarvis: Here we are at an AI conference at the Forum. And I've come to believe that one of the greatest impacts of generative AI is the final and complete commodification of content.
This idea of content is a Gutenberg era notion - that content is that which fills something, that fills a book between the covers, the Alpha and the Omega. And to fill that book or that article or that magazine, we in journalism created a structure of having a story and an arc. We made the world fit neatly into that. So I would ask you whether you think that that is a better metaphor for life as it exists, or the web that scrolls continuously, does not begin, does not end, is not linear, goes every which way around. I think that's a more realistic representation.
It's chaotic, it's disturbing, but I think it's more real.
So the story is interesting, too, because I think that what I've learned trying to teach journalism students is to caution them of the power of the storyteller. We, as the storytellers, get to decide what it's about, who's in it, who's not in it, what they get to say. And I think that we now need to help people tell their own stories rather than extracting them.
So I'm not fully killing the story, but I do want us to challenge the idea that we control it.
Robin Pomeroy: You'd like to see the end of the neat, happy ending or the ending, because that just doesn't ever exist.
Jeff Jarvis: It really doesn't. There's a fascinating book I quote in there - How History Gets Things Wrong is the title - about the idea that we project our theory of mind onto history and onto politicians and onto true events. We think we can mind-read, we can explain what people do. And that's the essence of the story. And it's false. We don't know. It's a bit of a stretch, but it's a fascinating little stretch.
Robin Pomeroy: Artificial intelligence then. We're here at the Artificial Intelligence Governance Summit. No one knows, just like in Gutenberg's time, where this technology will take us, and when. But what more can you imagine? AI - let's look at the positive side here - what could it do? Because you've got all these voices. Everyone's telling - I want to say their story - but everyone's speaking. And this has been happening since the Internet was created. There's too much noise out there. It's really hard. How do you curate that? How do you find what's important or interesting to you? And I guess social media apps do do that. People will scroll through things that are delivered that the app, the algorithm, knows they like. But do you see AI as improving on that? It's just not good enough yet, is it, that kind of thing? Where do you see us going?
Jeff Jarvis: I think there are a few pieces to this. I think social media has gotten condemned for all kinds of things. But now that Elon Musk has utterly ruined Twitter, if you look at the competitors that are out there - Bluesky, for example, is going to offer people multiple choices of algorithms, which I think is very, very interesting. If you want the Alex Jones conspiracy Internet, you can have that. Or you can have the Disney 'nothing's wrong here' Internet. And you can choose based on that.
So I think that AI will enter into these things to help you decide what you want. One might argue again that's making filter bubbles and we'll only see in the application.
The other interesting part of AI, I think, is again this notion that it commodifies content. I think I'm special because I'm a writer. Here we are as journalists and storytellers, and we have a skill that scares a lot of people because they don't know how to write.
I'm fascinated with the idea that generative AI could help extend literacy, could help people who are intimidated by writing. And I blame Montaigne for this: with the creation of his brilliant essays, he raised the bar for inclusion in public discourse to being able to write. And people don't want to write emails or PowerPoints, let alone articles or books. Well, now they have a machine that can help them do this, and illustrate it too.
So I was enthused about this. And then I was talking with the executive students on one of the programs I started at the school, and they said, 'Hold on, Jarvis. Okay. Maybe people who can't write or don't think they can write can use this machine, but that then potentially loses their voice. And it makes them sound like the people who could afford the power to write in the past, and we lose their structure and their dialect'.
Umberto Eco says that a dialect is a language without an army and navy, and so language and proper wording became a question of power.
So I don't know where this lands, but I think about the idea that people can have the power now to create in ways that... I can't draw, I can't draw worth a damn, but if I have something in my head that I want to illustrate, now I have a mechanism to do that. And I think that's powerful. I think that extends literacy.
I met a professor from INSEAD, Professor Parker, in Davos probably ten years ago, and he argued that there isn't too much content in the world. He said there's too little in many languages. At the time - way back when - he was using this to do things like create books on agriculture for people in nations where they didn't have them in their language, or create radio shows about the weather for people who could not read.
To use this to extend accessibility on both sides - speaking and hearing - is really, really interesting. And I think there are good uses like this.
Robin Pomeroy: The AI Governance Summit and the AI Governance Alliance - this thing being created by the World Economic Forum to bring industry, politicians and academics like yourself together to discuss these big issues. And how many years since Gutenberg did you say we are?
Jeff Jarvis: More than half a millennium.
Robin Pomeroy: No, I mean, from the Internet...
Jeff Jarvis: Oh, about 1480.
Robin Pomeroy: We're at 1480. So it's as if back in 1480 they created a forum for the printers and the clerics and the scribes and the politicians and the royalty, all of whom - or certainly their children's children - were about to see all those power structures change, all those jobs change. What do you think should be the priorities for a forum like that, one that's bringing together all those various voices? What should they be looking at now?
Jeff Jarvis: I'm fascinated with the idea that if you held this forum about print and its disruption in 1480, what you would be looking at is books that still look as if they were handwritten, with no page numbers, no titles, no title pages, no paragraph indentations. You'd have no sense of what was going to be done with it.
The printing press, by the way, was not just used for books; it was also used for indulgences. And that was going to become worse. But it was still going on.
And probably the more important impact early on was the creation of bureaucracy, that you had the opportunity to create proclamations and forms and so on. But it was very, very early days.
There was a wonderful RAND paper written by James Dewar in '98 - the same year as Google started. He had also read Elizabeth Eisenstein's The Printing Press as an Agent of Change. And his conclusion from doing that was that the countries that tried to control the press were left behind in the more general development of culture and society, and that it's better, he argued, to get to the unintended consequences as quickly as possible so we can deal with them.
And I think what we've heard in this discussion over the last two days here in San Francisco is a fascinating debate about just this. Can we control this? Well, why do you want to? To what end? What's your goal? What are you trying to prevent? Define safety. Define transparency. It's all really complex and difficult.
There was a huge discussion and disagreement in the last two days about large language models and open source. There's one camp here that says we must have open source. Andrew Ng, yesterday at the Forum discussion, said he can't believe that in 2023 he's trying to convince governments to leave open source alone, because Europe is talking about outlawing open-source LLMs. Some people here are saying open source could be horrible because any safety nets that are built in can be taken off. On the other hand, open source is going to allow researchers and countries that otherwise could not afford this technology to use it, so that it doesn't just stay in the hands of a few rich companies, and it creates a competitive atmosphere.
So I think we have to let that debate occur. I think that it's foolish, probably - it's hubristic - to think that we can control all of this.
So what do we do? We have to study it. One of the things that amazed me most when I read Eisenstein's book was that she was oddly defensive - and I didn't understand why at first - about the idea that there should be a study of the effect of print on culture. And what I finally realised was that she was saying this because there hadn't been one. She was inspired, in a bad way - in a sand-in-a-pearl way - by McLuhan. She couldn't stand McLuhan...
Robin Pomeroy: Remind us who McLuhan is.
Jeff Jarvis: Marshall McLuhan - the famous media theorist who was the king of bons mots and quotable lines. And he threw them out, oftentimes didn't really think them through, but they're delightful. I did a page and a half of my book where I did nothing but quote McLuhanisms. And they're delightful, but one could argue with them.
So she wanted to argue with them. And she went looking for the research to argue with them. And she was shocked that there was virtually no research done on the impact of print.
Well, so we can't wait 500 years to have that discussion about the Internet. I think it's too soon to believe that we know what the Internet is. It's too soon to think we can control it, but it's too late already to begin studying it.
One thing that I'm hoping to do: I'm going to 'retire' - air quotes - soon from my present institution. I hope to go to another one where I really want to try to work on a new program in Internet studies. I think one of the lessons from the Gutenberg years is, as I said earlier, the technology fades, the technology gets boring. And as Clay Shirky says, it's when the technology gets boring that it gets important, because then we accept it and we don't see it as technology. And so I think that the technology will fade into the background, and what matters is what we do with it. And so I think it's important to look at it not as technology, but as the Internet as a human network.
AI is a machine that can now finally understand us when we speak to it. It's now taking on a human nature. And so what we need is ethics and anthropology and sociology and psychology and history and the humanities brought into this discussion. And one of the things the Forum is so good at is convening groups and multi-stakeholders, as it's called. I think it's also about multi-discipline - bringing together people from these many fields to judge both the Internet and AI as a human question.
What that means, too, is that we have to acknowledge the human frailties and failures that we bring to these technologies. That's what we're guarding against. It's not that the technology is dangerous. It's that we can be dangerous with it.
Robin Pomeroy: So Jeff, in 500 years' time, will people look back on your book and see that you were right, that we were on this cusp of a new era? Or do you think it will be sooner than that?
Jeff Jarvis: That's probably where my argument is most vulnerable.
The people I argue with, it's: 'No, Jeff, you're wrong. You say that this took 500 years to get where we are now. No, can't you see this is happening so fast?'
Well, I think what I have to say is more disturbing than that, on the one hand, which is that, no, I think it's actually happening slowly. Which is to say that the change has just barely begun and we don't know how far it will go. That's the scary part.
But the comforting part is it means we have time.
We figured out print eventually. We got a Thirty Years' War. We had some peasant wars. We had lots of problems. But we figured it out and used it ultimately to society's benefit.
I am confident that we can - not that we will - but we can do the same with the connected society, the society of ubiquitous data, the society of thinking machines.
I think we can take that future and mould it, but it's our responsibility to do so.
We can't stand back and say it's technology's fault. It's their job. Let them do it. We must take responsibility for this.
Robin Pomeroy: Jeff Jarvis, thanks very much.
Jeff Jarvis: Thank you.
Robin Pomeroy: Jeff Jarvis’s book is called The Gutenberg Parenthesis - The Age of Print and its Lessons for the Internet.
Jeff made the point there that for humanity to have any chance of understanding the impact of technology on our future, we need to bring diverse groups of people together. The World Economic Forum is doing that with its AI Governance Alliance - which unites industry leaders, governments, academic institutions, and civil society organizations to champion responsible global design and release of transparent and inclusive AI systems. Find out more at wef.ch/AIGA, for the AI Governance Alliance.
You can watch or listen to some of the great conversations from the AI Governance Summit, the audio for three sessions is available on our Agenda Dialogues podcast - just search for Agenda Dialogues on your podcast app, or check the links in the show notes for this episode.
And there’s plenty more on AI on previous episodes of Radio Davos, please subscribe wherever you get your podcasts. And if you like Radio Davos, please leave us a rating or a review. And join the conversation on the World Economic Forum Podcast club - look for that on Facebook.
This episode of Radio Davos was presented by me, Robin Pomeroy. Studio production was by Gareth Nolan.
We will be back next week, but for now thanks to you for listening and goodbye.