For half a century, Nile Rodgers has been making hit records that have touched people's hearts around the world. The creative force behind disco pioneers Chic, and some of the best-known songs of David Bowie, Madonna and Beyoncé, he tells us his definition of an artist: someone whose work "speaks to the souls of a million strangers".
But what if generative AI can make music that's just as good? Is AI a threat or a blessing to art and human expression?
We also hear from the head of the Hollywood actors' union on why moviemakers went on strike over the threat posed by AI. And from Refik Anadol, a leading light in AI-generated art.
Duncan Crabtree-Ireland, National Executive Director and Chief Negotiator of the actors’ union SAG-AFTRA
Refik Anadol, Media Artist and Director, Refik Anadol Studio
Nile Rodgers, musician and founder of the We Are Family Foundation
Nile Rodgers interview: https://www.weforum.org/videos/ai-nile-rodgers/
Check out all our podcasts on wef.ch/podcasts:
Podcast transcript
This transcript has been generated using speech recognition software and may contain errors. Please check its accuracy against the audio.
Duncan Crabtree-Ireland, National Executive Director, SAG-AFTRA: Someone is using your face and your voice to deliver a performance or a message that you had nothing to do with.
Robin Pomeroy, host, Radio Davos: Welcome to Radio Davos, the podcast from the World Economic Forum that looks at the biggest challenges and how we might solve them. This week AI and art.
If the machines can write, paint and perform as well as us - what do we have left?
We hear from the head of the Hollywood actors’ union who led his members out on strike last year as they fought against the unbridled use of AI in movies.
Duncan Crabtree-Ireland: It's not only about having a job. What these tools do is they take someone's image, their likeness, their voice, their performance, and they turn it into something that they never participated in the creation of. That's a very personal thing.
Robin Pomeroy: AI may be a threat to some artists, but others welcome it.
Refik Anadol, artist: I do believe human creativity will be completely enhanced through these new tools of imagination.
Robin Pomeroy: Refik Anadol creates vast, dynamic visual art, powered by AI, which he says is just the latest tool in the history of making art.
Refik Anadol: For many centuries, as humanity, we used pen, pencil.
I think it was 2008 when I truly started to imagine the idea of using data as a pigment. But of course, when AI started to become a partner, a collaborator, it enhances that imagination to a whole new level.
Robin Pomeroy: What about the human soul? We speak to one of the most successful hitmakers in the history of pop.
Nile Rodgers, musician: A hit record speaks to the souls of a million strangers. That's what an artist does. That's what my teacher taught me. And I went, oh my God. He just described an artist.
Robin Pomeroy: But the great Nile Rodgers is surprisingly open to AI.
Nile Rodgers: Any tool that allows an artist to create is an amazing thing.
Robin Pomeroy: I’m Robin Pomeroy at the World Economic Forum, looking at AI and art...
Nile Rodgers: 'We’re giving love in a family dose'. Boom! That's it. We got it.
Robin Pomeroy: This is Radio Davos
Nile Rodgers is a legend of pop music: from the 70s disco of Chic, into the 80s producing some of the best-known tracks by David Bowie, Duran Duran and Madonna, through to more recent times when he has written and produced for Daft Punk and Beyoncé.
He was in Davos in January to receive a Crystal Award, which the World Economic Forum bestows upon artists whose work goes beyond their art, into making the world a better place. Nile Rodgers created the We Are Family Foundation after the Sept. 11, 2001 attacks on the United States, to inspire and educate people about mutual respect, understanding and appreciation of cultural diversity.
Pop music is art, he says, because it speaks to the souls of a million strangers. But how soulful will it be if music is created by generative AI? We’ll hear from Nile Rodgers later in the show.
First, to the performing arts. Last year Hollywood pretty much shut down due to strikes from the creative people who had a variety of grievances, not least the threat posed by the rise in AI.
According to the person representing the striking actors, the stickiest issue in the negotiations with the studios to end the strike was indeed about the guardrails needed to protect actors as the industry expands its use of the technology.
Here he is, Duncan Crabtree-Ireland, speaking in Davos to my colleague Spencer Feingold.
Duncan Crabtree-Ireland: I'm Duncan Crabtree-Ireland, and I'm the national executive director and chief negotiator of SAG-AFTRA, which is also sometimes known as Screen Actors Guild-American Federation of Television & Radio Artists.
We represent about 160,000 members: actors, performers, journalists on television, radio and the internet, and recording artists as well. And so we like to say, basically anyone who works in front of a camera or behind a microphone, we represent them. We are based in the United States, so the vast majority of our members are there. But we do have members all around the world, in various countries, who may have worked on projects for US-based entertainment or media companies.
And our job for them is we're their labour union. So we negotiate collective bargaining agreements. And basically our mission is to look out for our members' professional lives and make sure that they're properly taken care of, that they have strong protective contracts, and that when they go to work they can focus on the creative work that they do instead of having to focus on the details of those agreements.
Spencer Feingold: Your organisation, your union was very much in the news recently with the huge Hollywood strike. It kind of affected the globe. Netflix series, put on hold. Hollywood shut down, billions of dollars at stake there. Do you mind just summing up the strike? What went on there? How did it start?
Duncan Crabtree-Ireland: Sure. Well, in this industry, there are three of what they call 'above the line' unions: the Directors Guild, the Writers Guild and SAG-AFTRA, who represent the creative talent who work in these film, television and streaming projects. And so our contracts all expire within about 60 days of each other every three years. And so every three years there's this cycle of negotiations.
And so what ended up happening this year was the Directors Guild was able to reach an agreement with the companies, but the Writers Guild was not. And they went on strike.
And we began our negotiations on June 7th of last year. And after bargaining for 35 days, which included a historic 12 day extension of the contract to allow more time to try and work through some, you know, tough and complex issues we were dealing with, ultimately, we realised that we weren't going to be able to reach a deal without a strike. And so we also, went on strike.
And so this joint strike was the first time in 60 years that both the Writers Guild and SAG-AFTRA had been on strike at the same time against the studios, and now the streamers as well.
And so that really reflected, I think, what was a very unique moment in time because, we were fighting not just for basic fair compensation like we always are, and other important terms of these contracts, but this year became a year in which we had to really take on the issue of artificial intelligence and how it was going to change the industry and what we needed to do to protect our members.
It became an important issue for the Writers Guild, and certainly, from the very beginning, was a crucial issue for us, and it ended up being one of the first issues that we took on. It was in our proposal set on day one of negotiations on June 7th, and it was also the very last issue that was resolved on day 118 of our strike. So it really represented one of the most complicated and difficult pieces to negotiate of this already very complicated deal.
Spencer Feingold: Were you surprised that AI became such a big sticking point in negotiations? Did you see that coming?
Duncan Crabtree-Ireland: I saw it coming. I think we all saw it coming. We knew it was going to be super important for our members. Every year we go to a trade show in the US called CES, which is otherwise known as the Consumer Electronics Show. This is a place that we have, over the last two decades, used as a way to sort of see a little bit around the curve and kind of know what's coming. So because of our involvement there, we really started focusing on AI and the potential of generative AI a number of years ago. And of course, it really took off in the last 18 to 24 months.
So when we started this negotiation in June of last year, we knew that AI was going to be a big issue. We had an extensive proposal on AI. What we didn't know was that the companies would be so insistent on refusing to agree to what we thought were really reasonable proposals.
We came into this not looking to ban AI. A lot of people think unions just want to stop AI. We didn't take that approach because we feel that approach is not likely to be successful. Past history tells us that. So we came in saying we're willing to partner with you on AI, but there have to be guardrails and protections built into the contract, and it ended up being very difficult to get the studios or streamers to really agree to the kinds of protections we needed. And that is something that ultimately only happened several months into the strike, when the CEOs of the studios and streamers got directly involved, and we really started hammering out what would become the ultimate deal on AI.
Spencer Feingold: Do you mind just diving into the threat that AI poses to people in your industry, to the workers that you represent, the actors, the writers, the producers, etc.?
Duncan Crabtree-Ireland: Sure. And let me, just to be clear: I don't represent writers or producers, although I'm familiar with their issues. And the Writers Guild, of course, in a way, has one of the most straightforward sets of concerns, because anyone who's used ChatGPT, just as one example, knows that if you are a professional writer, you might have some concerns about what people might decide to do with ChatGPT, especially if quality isn't necessarily the primary concern in the output that you're generating. So that was a concern for the writers.
For SAG-AFTRA and for our members, primarily actors in this case, the concern was that there's already AI being used to digitally replicate actors. It's something people may not realise, but they may be familiar with it: when Carrie Fisher was replicated for Star Wars, when Paul Walker was replicated to finish out the Fast and Furious movie that he was in, in the year that he passed.
These are things that were done with early iterations of AI tools, and this is something that was already happening. So it was very clear to us that, as these tools become more capable and less expensive to use, their use was going to grow.
And especially because what these tools do is they take someone's image, their likeness, their voice, their performance, and they turn it into something that they never participated in the creation of. That's a very personal thing. It's not only about having a job, it's also about the fact that someone is using your face and your voice to deliver a performance or a message that you had nothing to do with the creation of that.
And so we knew we had to have guardrails in place to prevent that. And ultimately we succeeded in achieving those.
So in this contract, there are provisions that say the studios and streamers can't do anything using AI or using digital replication without the informed consent of the performer. So you can't put words in someone's mouth without their approval. You can't use them for things that they haven't signed off on, and it can't be some generic line in a contract that says, I hereby give you permission to just do whatever you want with my image or likeness. It has to be specific, detailed information about the intended use.
And that was a big part of the fight we had, because that constrains what the studios and streamers can do. But it had to be constrained in that way. Otherwise as our members are considering being involved in this industry, they cannot give up their right to say what's going to be done with their image, their voice and their performance.
Spencer Feingold: I'm curious too. I mean, you said your union had seen it coming, had developed kind of a strategy. But I think for most people, the development of AI has been at almost lightning speed; in the last year it just developed so fast. Amongst your members, did you kind of have to educate them on what this means to them, what AI means to the industry?
Duncan Crabtree-Ireland: Absolutely. I mean, that's an ongoing process, I think, because, you know, when people think about AI, they have a certain image in their mind. For a lot of people, it's Terminators or Skynet or whatever. And even for professionals working in this industry, if they haven't had the opportunity to really educate themselves about what these tools do, how generative AI in particular might be used and won't be used, that becomes something that can spark a level of fear that can almost be paralysing.
And so a certain degree of fear is valuable. It helps us avoid danger. It helps us make the right choices in difficult situations. But an excessive amount of fear can actually be harmful to us and prevent us from taking the action we need to take to protect ourselves. That's where we have to navigate that, and we have to make sure that our members have had the chance to inform themselves so that they can make really solid decisions about how to address AI.
And in the case of this negotiation, our negotiating committee, our national board, unanimously endorsed what was ultimately negotiated in this process. And then we had a very robust discussion with our members. It lasted over three weeks, and at the end of that, about 80% of our members voted in favour of this agreement, giving it, I think, a resounding, sense of approval. And it's a strategic approach to how we deal with AI.
Spencer Feingold: Were you surprised by that unity, ultimately, that came together?
Duncan Crabtree-Ireland: No, I mean, we had extraordinary unity throughout the strike. I think the AI issue is the most complicated and difficult issue of all of the complicated, difficult issues in this deal. So ultimately, I felt confident that there would be a sizeable majority of our members who would be on board with it.
But I think it's important to recognise that there's also a sizeable minority who do have issues with it, about 20%, some of those people really want us to just try to block AI. And I understand the sentiment, but I also believe very firmly that strategically that's the wrong approach, that, trying to block the adoption of this technology will essentially mean that we walk away from our opportunity to channel its direction, and that blocking technology has never proved to be effective, and I don't think it will prove to be effective here.
Spencer Feingold: It seems a bit like AI is kind of an unstoppable technological wave that's come in. Do you think there'll be issues down, more issues down the road? Is this a very much unsettled issue? AI and entertainment?
Duncan Crabtree-Ireland: Absolutely. I think it's very clear that the capability of AI is going to evolve. The desire of companies and how they want to use AI is also going to evolve, and our contractual protections and provisions are also going to evolve.
But I'm very confident that the protections we have in place now will serve us well during the course of this contract. We will be renegotiating this deal in less than two and a half years from now. And, at that time, we'll have the benefit of seeing what's happened over that time period. Part of this agreement is that we are entitled to have meetings every six months with all of the companies to find out what they're doing in AI in general, and also specifically with generative AI. And so having had those meetings by that time, I think we'll be in a really strong position to make sure that we're channelling our efforts in the right direction.
But I think ultimately, again, you know, if we put all of our power and force into trying to block AI, in what would ultimately be an unsuccessful effort, then we'll have given up the chance to really nudge it in the right direction. And I do think putting guardrails up and pushing AI into the right kind of implementation is how we can ultimately see a human-centred use of AI in the entertainment industry instead of something that's dehumanising or devaluing of creative talent.
Spencer Feingold: Yeah, I mean, I guess, like you said, a perfect example would be using a famous movie star's likeness, you know, scanning that without any of their involvement. Are there any other kind of big red-flag uses of AI that you can think of or explain?
Duncan Crabtree-Ireland: Well, maybe I can just adjust that one a little bit, because in the past, yes, it would probably have been a big famous movie star whose likeness would likely be used. But I think as this technology has become less expensive and more readily available, actually our concern is not limited to high-profile SAG-AFTRA members. Really, any SAG-AFTRA member could have their image, likeness, voice or performance misused. And so we're focused on the breadth of our membership in this area. So I think that is a definite reality.
I think the other area of concern, well, there are several, but one of them is how to fairly compensate people when a digital replica of them is used to create something.
The formula that we've settled on largely in this contract is determining how much work time it would have taken to create that same amount of work, and then paying on that basis. There are other models that could work as well, so it remains to be seen what will ultimately become the dominant model in the industry. The one that's in our contract applies to replicas that are created as part of the creation of a project.
The other thing that I think is going to be interesting, less so during the term of this contract, probably more so in the maybe the next term is how generative AI is used and if it is going to be used to create fully synthetic performers or performances. Because it's one thing to scan a performer, maybe train an AI using that particular performer's past performances and then create a performance that reflects them. It's another thing to take a generative AI system and train it with thousands or tens of thousands of performers, and then have it create a so-called new performer who then, you know, doesn't have a corresponding human being. What does that mean? How does that affect jobs? How does that affect fair compensation? And is it really fair to even call that a performer when what it actually is is an AI driven synthesis based on a bunch of other performers' creative work?
As you know, there's a whole battle going on now over training AI systems and where rights ought to land in that regard. And I think, from our members' viewpoint and my personal viewpoint, any time creative output like that is used to train an AI system, those creative artists, writers, actors, whomever, deserve to be compensated for that.
Spencer Feingold: I read recently about a deal that your union struck with voiceover artists. Do you mind just explaining that and how AI was involved in those negotiations?
Duncan Crabtree-Ireland: Sure. You're probably referring to the deal we announced last week at CES, which is with a company called Replica Studios, and this is an AI agreement. They're an AI company. Specifically, what they do is they create digital voice replicas for use in producing video games. And so this is a company we've been negotiating with for a couple of years now. They came to us and said, we want to do this, but we want to do this in an ethical, you know, respectful way to performers.
So we negotiated what I think is, at present, a gold-standard agreement for this kind of work. And it should provide a great deal of comfort to our members who might be uncomfortable agreeing to have a replica made of their voice to be used in work that they traditionally might have done in person. And so this agreement contains a number of limitations. Every aspect of this agreement is equal to or better than the deal we made with the studios and streamers, and in some cases, because it's more specific to the video game world, there are some, you know, significantly improved terms, such as time limitations on the use of replicas, such as enhanced transparency, and safe storage standards for digital replica and biometric data that's been collected as part of that process.
So I think it's a great example of how these agreements will evolve over time. And I expect there to be further evolution even from this agreement as time goes on, as we continue to negotiate new agreements for AI in all the different industries where our members work and create.
Robin Pomeroy: Duncan Crabtree-Ireland, National Executive Director and Chief Negotiator of the Hollywood actors’ union SAG-AFTRA, speaking in Davos to my colleague Spencer Feingold.
In Davos, our next guest had installed huge digital screens with his artworks - swirling shapes and colours inspired by nature and generated by algorithms.
That artwork, called DATALAND: Rainforest, by Turkish-born Refik Anadol, is currently on display at London’s Serpentine gallery, until 7 April 2024.
Refik Anadol spoke, in Davos, to my colleague Kateryna Gordichuk.
Refik Anadol: I am Refik Anadol. I'm a media artist and director. I've been an artist working with AI for eight years and with data for more than two decades. So I'm trying to blend these mediums together to create impact.
I think it was 2008 when I truly started to imagine the idea of using data as a pigment. And over the years, I never, ever felt that I'm done with the idea of using information around us as like a pigment.
But of course, when AI started to become a partner, a collaborator, it enhances that imagination to a whole new level.
For example, I have been truly immersed in and inspired by nature. I think nature is the most intelligent thing we have, and we have to preserve it. We have to understand it. And for the last several years, I've been explicitly imagining nature in a new way.
It's not so different than, like, the artist Monet when he imagines atmosphere or water lilies. I have a chance to work with AI to imagine nature through this new perspective.
And there are so many ways of understanding nature when it comes to working with data and AI. Nature is a multi-sensory environment, and I think what helps us is this new set of tools and imaginations. This allows us to discover new possibilities. In DATALAND, we are trying to imagine: yes, we can go to nature, but can nature come to us? And how can we preserve it? How can we inspire humanity without hurting it, and still love and respect it? Nowadays, for me and my teams, it's a really deep, honest, intense exploration.
Kateryna Gordichuk: Refik, I read that you use in your projects 'nature's inherent intelligence'. What does it mean for you to use this intelligence? And do we use it often in our daily lives as people?
Refik Anadol: So when we think, at the moment the majority of AI research is focused on human intelligence, specifically reasoning. But I think these AI models unfortunately don't know nature explicitly. And it was just like an aha moment: perhaps there was a rush for something and we missed the most intelligent thing that we have: nature.
So in my practice over the last several years, I've been trying to truly demystify how we can represent the beauty, the intelligence of nature.
Kateryna Gordichuk: I think what you demonstrate with this project is that, despite fears about AI, it can be used for good and help bring back to communities. Why is this such an important message for you to deliver?
Refik Anadol: So my most important message to deliver about AI is to be sure that it is a mirror. And AI will exactly reflect who we are. So it's not about AI. It's about humanity.
But the same technology I do believe can be used to create possibilities, can be used to solve certain problems. And as a hopeful mind, I do believe that we can bring inspiration, joy and hope for humanity by using AI.
Kateryna Gordichuk: How do you see AI enhancing human creativity in the future, in the years to come?
Refik Anadol: I do believe human creativity will be completely enhanced through these new tools of imagination.
But again, I see AI as a collaborator. I do not believe AI is the only creator. I truly believe in human-machine collaboration. And in that reality, I see so many possibilities.
And again, as an artist, I may not be able to draw very well, but I know how to use a thinking brush. I know that this brush will never forget the images of Amazonian flowers or the patterns of the landscapes of rainforest.
This type of art making also starts with tool making. In this context, let's think about a traditional painting situation: most likely the brush, the pigment and the canvas are every day the same, but the ideas are changing, right?
In this context, every morning a new brush, a new pigment, a new canvas appears, so it brings us infinite possibilities. And when you think about machine intelligence as a collaborator, it has so many possibilities.
For many centuries, as humanity, we used pen, pencil, you know, the printing press. And then, of course, technology evolved. First of all, as an artist, I was so fortunate to witness the internet, web one, web two, web three, AI, quantum computation. It's an incredible time to be alive.
In past centuries, artists barely saw innovation in their lifetime to get inspired by. And we are now surrounded by all this: every day, new innovations and discoveries. I think it's incredible to be alive in these moments. But embracing this, understanding this, is also important. So I think it's the artist's role to translate what's going on and where we are going.
Robin Pomeroy: Refik Anadol, putting the case for using artificial intelligence to create art.
So we have looked at movies and TV, and at visual art. So what about music?
With the global hit Get Lucky, Nile Rodgers collaborated with robots - the band Daft Punk. But, of course, there were humans under the robot suits and while digital technology has been a big part of music for decades, the rise of generative AI could mean we are in for some big changes.
So what does this veteran of the pop charts - who has been making hits for more than half a century - think of using artificial intelligence to make music?
It was my huge pleasure to sit down with Nile Rodgers in Davos, and for any listeners thinking they don’t know him: they really do. With legendary bass player Bernard Edwards he founded disco band Chic in the 1970s, whose funky rhythms formed the basis for early hip hop. He produced and played on David Bowie’s Let’s Dance and Madonna’s Like a Virgin in the 1980s.
Just last year, he received a Lifetime Achievement Award at the Grammys and, on the same evening, won another for Best R&B Song for Beyoncé's "CUFF IT".
If anyone knows anything about the soul of music, its inherent humanity, it could be him. Here's some of my conversation with Nile Rodgers.
Robin Pomeroy: Let's talk about the power of music, then. You've been involved in so many great songs that everyone at this conference knows. Even if they don't know, they know those songs, right? What is the power of a great song, do you think?
Nile Rodgers: So, here's something that my jazz tutor taught me. One day I was going out to do a gig. That's how I grew up, I used to do cover songs for money. And sometimes we were lucky and we could do original songs because in the old days, people liked original bands playing new music. That was the discovery engine, go to a club.
And one day I was taking a lesson, and my jazz tutor, he's never had a hit record in his life, but he saw that I was really upset. He asked me why, because normally I'm so upbeat. And I said, well, look at these, like, lame songs I've got to play tonight. And he says, Nile, don't you realise that they're all hit records? Why would you call them lame? I said, look, the first song I've got to play tonight is Sugar, Sugar by The Archies.
Robin Pomeroy: That was number one the day I was born in England by the way.
Nile Rodgers: Oh my God.
Robin Pomeroy: Well, exactly. Yeah. It's not the coolest thing in the world, is it?
Nile Rodgers: So he said to me, he said, do you know that Sugar, Sugar has been number one for about six weeks now? And I said, what has that got to do with it? He says, Sugar, Sugar is a great composition. I said, how can you call 'Honey, do do do do do do. Oh, sugar, sugar' ... How could you call that a great composition?
Here's the greatest lesson in my life. He says, because it speaks to the souls of a million strangers. And I went, oh my God. He just described an artist.
I'm not an artist yet. I need to learn how to speak to the souls of people I will never, ever meet.
You know, you go to the Louvre, before you go to any great museum, or even a not-so-great museum: what those people are going to experience, the person who composed that (let's just use musical lingo for this example) will never meet those people, chances are. I mean, you won't meet Pablo Picasso. You certainly won't meet some of the old masters. I'll never meet Beethoven and I'll never meet Prokofiev.
And that's what my teacher taught me. A hit record speaks to the souls of a million strangers. That's what an artist does. You want to do music that everybody will like. And of course we're not talking absolutes. I hate absolutism. But what he does mean is that a great preponderance of the population will like your music.
Robin Pomeroy: This idea of the artist touching people from a distance in time or in geography is really important. And I wonder what you think of, at this meeting here in Davos, everyone's talking about artificial intelligence, generative artificial intelligence. And now you can make a record in seconds by just telling a computer to make one. I want a song that sounds like Chic. You know, do me a song like that. And maybe it sounds a bit like it.
What is your opinion of... Because I know you've made those records. But I love the records for the music. But I love the fact there's a connection with you, particularly as I'm having the privilege of talking to you now. But do you see artificial intelligence as a danger to that kind of human connection?
Nile Rodgers: So, a few years ago, I was down in Costa Rica with Deepak Chopra, and we're sitting there and we're talking about AI, and this is years ago. And he says, Nile, technology can be beautiful and diabolical, just like people. And I thought to myself, wow, he's absolutely right.
So I think AI can be beautiful, but AI can be diabolical, just like people, as Deepak said. So I think that any tool that allows an artist to create is an amazing thing.
But of course, I was watching an interview with Sean Penn where he was saying that, he was with a friend who doesn't speak Japanese, and with AI, he was able to speak Japanese, in his voice and and grammatically, it was all perfect. And then even the AI synced up his mouth, even though it was like, you know, it wasn't him. He was just speaking English. But when you watched the broadcast, the film that they made, the guy's mouth corresponds to the Japanese even as closely as where the tongue would be behind the teeth or something. If he was saying, you know, like tcha, something like that. Perfect.
So things like that sound terrifying to me. But then I think about a group that I was working with. We just had a huge hit, and I just read that they did it in ten different languages. And is that artistically cool? To me, that's a tool that helps the artist communicate more.
Like, in other words... So the thing about K-pop, sorry, I forgot to mention it was a K-pop record. K-pop records, what they typically do is they have obviously words that are in Korean, but a lot of words that are also in English. They don't have super sophisticated, polysyllabic words, but they have words that people recognise, you know: go, stop, pretty, beautiful, high, fly, dance, bounce, you know, things like that. You know, bounce, bounce, bounce... And then they'll speak Korean. But if you have a tool that can still have the same basic musical format...
Great example: opera. We translate operas into other languages when the producer finds it necessary. I happen to like most of the opera that I like in the original tongue. But it happens all the time. Think about Broadway shows.
So if you have a tool that can do it faster and more efficiently, is that any less artistic than spending the time to learn that stuff and just really reciting it by rote? You don't really know the new language. You just know how to sing in it.
Robin Pomeroy: Do you use any AI tools yourself?
Nile Rodgers: I've not used it yet, but I could easily imagine the time.
Right now, I work with Apple, and because I'm an artist in residence, they would obviously like me to come up with projects of my own that are interesting, so that they can help me bring those products to fruition. And when I say products, what I'm really talking about is what I create, which is music.
So I would assume at some point in time that I would probably use it, but I certainly wouldn't use it to imitate somebody else. I mean, even now, people can't believe that we're still a 100% live band. So all the bands that you probably like, or if you have kids, all the bands your kids like, are all on tape. They're all working with Pro Tools. They all have click tracks. Not Chic, we're completely live.
Robin Pomeroy: My ten year old daughter loves your music, by the way.
Nile Rodgers: But the reason why I love being live is because that's my art form. But I don't criticise the people who aren't doing it that way. We just did a show two days ago and we made a couple of mistakes, and we laughed and we go, mistakes are part of live, and we turned it into part of the show.
Robin Pomeroy: With the rise of AI, and already even with the streaming of music and everything, live performance has become so important. I think it's just going to get more and more so, don't you think? Because that really is a human being in front of you, particularly playing an instrument, singing with a real voice.
Nile Rodgers: I hope so. I don't really ever think that one form of expression excludes the other. One just maybe becomes more popular. And you know, in a weird way, I feel like there's nothing wrong with that.
I mean, you know, when somebody announced my camera could be 8K, I was like, really? Wow, how cool is that? You know, when I first started out, my camera was like the cheapest. I go back and look at those images from my VHS camera and go, Jesus! We thought this was cool? But at the time we thought it was really cool.
You know, so I, I like technical progress. I don't like technical progress that, that isolates people. That feels uncomfortable to me. So like social media where people can say things. Me, I don't mind, it doesn't bother me. When you're in the music business, you can take anything. I mean, I cannot tell you how many times record executives have screamed at the top of their lungs, telling me this is the worst piece of work. Six weeks later, number one. Excuse me, who was the person who said, I don't know, I loved it the first time I heard it.
And this is the truth. I was writing my autobiography. And I tried to go back and talk to all the people who had left the conference room when we played Le Freak. I just wanted one person to tell the truth. Every last one of them said that they loved it. And I'm telling you, when we played that song, we emptied the conference room. They all walked outside. It was like, who's going to tell him that this song sucks? It's the biggest selling single in Atlantic's history.
Robin Pomeroy: You've written so many great songs. You've produced so many great songs. Is there one song that you're really proud of? If you had to pick one achievement, or the song that you love, or something, what would it be?
Nile Rodgers: It's really difficult because so many songs were life changing.
You know. We Are Family was, in a very peculiar way, the template for the way music would be made now.
When Sister Sledge walked into that studio, they saw Bernard and myself, Bernard my ex-partner. We're standing there, we're just finishing up the last lyrics to that particular song. We had already finished the entire album, sang it, everything, all done. So we're just finishing up. 'We giving love in a family dose'. Boom! That's it. We got it. Hey. What's that? Oh, that's your album?
Robin Pomeroy: You're kidding me. You just finished writing it as they walked in the room.
Nile Rodgers: And they walked in, and we had never. We never met Sister Sledge.
Robin Pomeroy: Did you know you were writing it for them?
Nile Rodgers: Of course we did. We found out that they were sisters. Right? You're really sisters? We thought it was like the Cornelius Brothers & Sister Rose; we thought that's a cool thing to say. Soul Brother Number One, you know. Sister Sledge. You know, Sister Rosetta. We didn't know they were actually four sisters. So when we found that out, it was like, whoa, wait a minute. We can actually write a song like We Are Family. We just made it up.
Robin Pomeroy: Did they then sing it straight away as well?
Nile Rodgers: They liked the track, because they walked in and they heard it and they were like, wow, that's cool. What is that? And we said, that's your album. That's one of the songs on your album. And, what do you mean, one of the songs on our album? We came here to hear what you want to play and work together. And we're like, no, it's done. You've got to do a sing-along with it. And boy, we got off to such a bad start. Because they were offended by that.
Robin Pomeroy: Because they wanted to kind of workshop it with you and write it together.
Nile Rodgers: Absolutely. But the thing is that we were young. We didn't know any better. What we did know was that every song we did our way, every one was a hit.
So by the time we did We Are Family, we had Dance, Dance, Dance, Everybody Dance. I think Le Freak was about to come out. But we had already made the album for the girl who was our lead singer, Norma Jean Wright. She had already had a couple of gold singles. So, I mean, even though it never became, like, huge, it still speaks to the souls of a million strangers. Norma Jean had a record called I Just Can't Wait Til Saturday. It was a big record.
So every single we released was done the same way. It was all me and Luther Vandross, Bernard Edwards, our little crew, and we would just make the songs. Luther and his crew would sing them. Boom! Here you go. Hit record.
Robin Pomeroy: Now, I could talk to you all night, but I'm being waved at by someone to wind up. It's been such a pleasure. Thanks so much for joining us.
Nile Rodgers: Thank you.
Robin Pomeroy: Nile Rodgers - what will happen to showbiz anecdotes like that if generative AI is making the music?
You can watch Nile Rodgers accept his Crystal Award on the Forum’s website, and find out more about his charity at wearefamilyfoundation.org.
Thanks to him and to our other guests today, artist Refik Anadol, and union leader Duncan Crabtree-Ireland.
You can find plenty more about artificial intelligence on the World Economic Forum's website and on previous episodes of Radio Davos - available wherever you get podcasts and at wef.ch/podcasts.
This episode of Radio Davos was written and presented by me, Robin Pomeroy with reporting by Spencer Feingold and Kateryna Gordichuk. Studio production was by Taz Kelleher.
We will be back next week, but for now thanks to you for listening and goodbye.
Podcast Editor, World Economic Forum
Matt Price and Anna Schilling
November 20, 2024