At Hidden Door we’re inventing new ways for kids and families to create stories with AI. This is an excellent time to be making creativity tools with machine learning!
Over the last year, the capabilities of natural language processing systems have taken a huge leap forward. Most recently, OpenAI, an organization working on general artificial intelligence, released a system called GPT-3 (Generative Pre-trained Transformer 3 - yes, this is the third generation, each bigger than the last). GPT-3 has caused a ton of excitement in the tech community, and a lot of hype.
What's impressive about it?
GPT-3 is trained on a large amount of text, which you can think of as All Of the Internet That’s Mostly in English And a Bunch of Books. The system then uses that training data to solve language problems: it takes some text (a “prompt”) and predicts the most likely text to follow. It can write articles, fiction, and even code. It can also translate between languages or classify the sentiment of a text.
All of these tasks were possible a year ago, but you would have had to design and build a separate system for each one. The most impressive thing about GPT-3 is that it can do all of them with a single pre-trained model, given just a few examples.
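To make “just a few examples” concrete, here’s a sketch of what a few-shot prompt for sentiment classification looks like. The example lines and formatting below are illustrative, not taken from OpenAI’s documentation; the idea is that you hand the finished prompt to the model and let it complete the final “Sentiment:” line.

```python
# Few-shot prompting: the same pre-trained model handles different tasks
# depending only on the examples you show it in the prompt.
# These labeled examples are made up for illustration.
examples = [
    ("I loved this book!", "positive"),
    ("The ending was a letdown.", "negative"),
]
query = "What a delightful surprise."

prompt = "Classify the sentiment of each line.\n\n"
for text, label in examples:
    prompt += f"Text: {text}\nSentiment: {label}\n\n"
# Leave the last answer blank for the model to fill in.
prompt += f"Text: {query}\nSentiment:"

print(prompt)
```

Swap in translation pairs or article-plus-summary pairs instead, and the very same model does a different job; that flexibility is what people mean by “few-shot learning.”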
Further, the text predictions are shockingly good. It can take a name or an idea and effectively riff on it. The new things it generates are not that dissimilar from what a person might have suggested. This is what most people are highlighting, and it is genuinely impressive to see a machine able to creatively come up with a new character’s name or disposition.
You may have seen examples online of articles partially written by GPT-3 or its predecessor GPT-2 and been unable to tell they were written by a machine. In fact, there are now browser plugins to help alert you when what you’re reading may have been written by an automated system.
Yet as we consider tools like GPT-3 for storytelling, there’s an important catch: GPT-3 isn’t actually creative or intelligent! Worse, it’s frequently toxic.
Let’s take a closer look. Like all machine learning models, GPT-3 learns by example and generates output entirely based on the probabilities of what it’s seen before. It has no inherent understanding of language, truth, or common sense. It doesn’t think or know anything about the world. This means it often lies, or hallucinates. Sometimes this gibberish is amusing or even thought-provoking. It can also be frustrating: when writing long passages, it may quickly become incoherent, drifting, forgetting important details, or getting caught in a loop where it repeats the same thing over and over again. In the best case, you get a story that’s comfortably mediocre.
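That “probabilities of what it’s seen before” idea, and the repetition loops it can produce, can be shown in miniature with a toy bigram model. To be clear, this is not how GPT-3 is actually implemented; the tiny corpus and the code are purely illustrative.

```python
from collections import Counter, defaultdict

# A miniature version of "predict the next most likely word":
# count which word follows which in some training text, then
# always pick the most frequent successor. The corpus is made up.
corpus = "the cat sat on the mat and the cat sat on the rug".split()

successors = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    successors[current][nxt] += 1

def generate(start, length):
    words = [start]
    for _ in range(length - 1):
        options = successors[words[-1]]
        if not options:
            break
        # Greedy decoding: always take the most likely next word.
        words.append(options.most_common(1)[0][0])
    return " ".join(words)

print(generate("the", 10))
# -> the cat sat on the cat sat on the cat
```

Because “the” is most often followed by “cat,” greedy decoding settles straight into “the cat sat on the cat sat on…” - a tiny version of the repetition loop described above. GPT-3’s transformer is vastly more sophisticated and far less prone to this, but the generate-by-probability principle is the same, and so is the absence of any actual understanding.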
It can also be deeply offensive, eagerly generating content that’s biased, racist, misogynistic, ageist, and more. It’s a product of its source material and the world that created it. By analogy, imagine teaching a space alien to write by showing it Reddit forums or Twitter feeds. You probably wouldn’t trust it to write for a public audience, let alone for kids, without some serious supervision.
Human creativity is in part about drawing meaningful yet unexpected connections. It’s comforting to hear stories from people like us, and equally wonderful to hear stories from other cultures and backgrounds. We aspire to create systems that encourage creativity, with diversity and representation, safely. Machines like GPT-3 simply don’t do that. Not on their own.
If you’re interested in learning more about what we’re building, or have thoughts about what makes a great storytelling experience, we’d love to hear from you!