Artificial intelligence is trying to write the next Game of Thrones book
Fans might have to wait until 2019 to see how George R.R. Martin's epic saga wraps up. Image: REUTERS/Robert Galbraith/File Photo
If you're waiting feverishly to find out what happens after this week's drama-filled season seven finale of Game of Thrones, you're not alone.
With the TV show now firmly ahead of the books, fans might have to wait until 2019 to see how George R.R. Martin's epic saga wraps up.
So to give us some much-needed new material to over-analyze in the meantime, an algorithm has started to write the sixth book for us, and its predictions back up some long-held fan theories.
After feeding a type of AI known as a recurrent neural network the roughly 5,000 pages of Martin's five previous books, software engineer Zack Thoutt has used the algorithm to predict what will happen next.
And it might be the best we get for a while, because, let's face it, most of us are so sick of waiting that we've pretty much given up on Martin delivering The Winds of Winter anytime soon... even Martin himself.
Warning: fan theories and season seven spoilers ahead.
According to the AI's predictions, some long-held fan theories do play out - in the five chapters generated by the algorithm so far, Jaime ends up killing Cersei, Jon rides a dragon, and Varys poisons Daenerys.
"Jaime killed Cersei and was cold and full of words, and Jon thought he was the wolf now, and white harbor..." begins chapter five.
You can read all the chapters in full on the GitHub page for the project. Each chapter starts with a character's name, just like Martin's actual books.
But in addition to backing up what many of us already suspect will happen, the AI also introduces some fairly unexpected plot turns that we're pretty sure won't be mirrored in the TV show or Martin's books. So we wouldn't get too excited just yet.
For example, in the algorithm's first chapter, written from Tyrion's perspective, Sansa turns out to be a Baratheon:
"I feared Master Sansa, Ser," Ser Jaime reminded her. "She Baratheon is one of the crossing. The second sons of your onion concubine."
There's also the introduction of a strange, pirate-like new character called Greenbeard.
"It's obviously not perfect," Thoutt told Sam Hill over at Motherboard. "It isn't building a long-term story and the grammar isn't perfect. But the network is able to learn the basics of the English language and structure of George R.R. Martin's style on its own."
Neural networks are a class of machine learning algorithms loosely inspired by the human brain: rather than simply memorizing and following instructions, they learn patterns from the examples they are trained on.
A recurrent neural network is a specific subclass built for sequential data. It processes a sequence one step at a time while carrying forward an internal state, which makes it well suited to long runs of text, such as the prose of five lengthy novels.
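To make the idea concrete, here is a minimal sketch of a character-level recurrent language model in PyTorch. It is not Thoutt's actual code, and the tiny placeholder corpus, the GRU layer, and the model sizes are illustrative assumptions; it only shows the general recipe: read a long text, learn to predict the next character, then sample new text one character at a time.

```python
# A minimal character-level recurrent language model, sketched with PyTorch.
# This is NOT Thoutt's project code; the corpus and sizes are placeholders.
import torch
import torch.nn as nn

text = "Jon Snow rode north beyond the Wall. " * 200  # placeholder corpus
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}
data = torch.tensor([stoi[c] for c in text], dtype=torch.long)

class CharRNN(nn.Module):
    def __init__(self, vocab_size, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h

model = CharRNN(len(chars))
opt = torch.optim.Adam(model.parameters(), lr=3e-3)
loss_fn = nn.CrossEntropyLoss()

seq_len = 64
for step in range(200):  # a real model would train far longer on far more text
    i = torch.randint(0, len(data) - seq_len - 1, (1,)).item()
    x = data[i:i + seq_len].unsqueeze(0)           # input characters
    y = data[i + 1:i + seq_len + 1].unsqueeze(0)   # next-character targets
    logits, _ = model(x)
    loss = loss_fn(logits.view(-1, len(chars)), y.view(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# Sample a new "chapter" one character at a time.
x = data[:1].unsqueeze(0)
h = None
generated = []
for _ in range(200):
    logits, h = model(x, h)
    probs = torch.softmax(logits[0, -1], dim=-1)
    nxt = torch.multinomial(probs, 1)
    generated.append(itos[nxt.item()])
    x = nxt.view(1, 1)
print("".join(generated))
```

Even a toy version like this picks up surface style (spelling, punctuation, common phrases) long before it can keep track of anything resembling a plot.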
In theory, Thoutt's algorithm should be able to create a true sequel to Martin's existing work, based on things that have already happened in the novels.
But in practice, the writing is clumsy and, most of the time, nonsensical. And it also references characters that have already died:
It made Ned better stop until the fire was falling, standing beneath the arch of a shattered still distant field where the shadow tower paid the camp behind.
Still, some of the lines sound fairly prophetic:
"Arya saw Jon holding spears. "Your grace," he said to an urgent maid, afraid. "The crow's eye would join you.
"A perfect model would take everything that has happened in the books into account and not write about characters being alive when they died two books ago," Thoutt told Motherboard.
"The reality, though, is that the model isn't good enough to do that. If the model were that good authors might be in trouble ... but it makes a lot of mistakes because the technology to train a perfect text generator that can remember complex plots over millions of words doesn't exist yet."
One of the main limitations here is that the books simply don't contain enough data for an algorithm of this kind to learn from.
Although anyone who's read them will testify that they're pretty damn long, they actually represent quite a small data set for a neural network to learn from.
But at the same time, they contain a large number of unique names, nouns, and adjectives that are rarely reused, which makes it very hard for the neural network to learn reliable patterns.
Thoutt told Hill that a better source would be a book 100 times longer, but with the vocabulary of a children's book. Of course, that would rather defeat the purpose of the series we all know and love.
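To get a rough sense of that problem, a quick check like the one below (not part of Thoutt's project) counts how much of a corpus's vocabulary turns up only once; words the model sees a single time give it almost nothing to generalize from. The filename is a hypothetical placeholder for any plain-text dump of the novels.

```python
# Back-of-the-envelope vocabulary check: how many words appear exactly once?
from collections import Counter
import re

with open("a_song_of_ice_and_fire.txt", encoding="utf-8") as f:
    words = re.findall(r"[a-z']+", f.read().lower())

counts = Counter(words)
singletons = sum(1 for count in counts.values() if count == 1)
print(f"{len(words):,} tokens, {len(counts):,} distinct words")
print(f"{singletons:,} words ({singletons / len(counts):.0%} of the vocabulary) appear exactly once")
```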
Still, these five clumsy chapters are the best we've got while we sit around waiting for Martin to finally finish The Winds of Winter, or for HBO to release the final season (our money's on the latter happening first).
So for now we're going to pore over them for any clues as to what's going to happen next - who knows, maybe the algorithm has some ideas about how to take down an ice dragon...