The Conversation

Your next favorite story won’t be written by AI – but it could be someday


Haoran Chu, U. Florida

Stories define people – they shape our relationships, cultures and societies. Unlike other skills replaced by technology, storytelling has remained uniquely human, setting people apart from machines. But now, even storytelling is being challenged. Artificial intelligence, powered by vast datasets, can generate stories that sometimes rival, or even surpass, those written by humans.

Creative professionals have been among the first to feel the threat of AI. Last year, Hollywood screenwriters protested, demanding – and winning – protections against AI replacing their jobs. As university professors, we’ve seen student work that seems suspiciously AI-generated, which can be frustrating.

Beyond the threat to livelihoods, AI’s ability to craft compelling, humanlike stories also poses a societal risk: the spread of misinformation. Fake news, which once required significant effort, can now be produced with ease.

How do stories influence people? Their power often lies in transportation – the feeling of being transported to and fully immersed in an imagined world. You’ve likely experienced this while losing yourself in the wizarding world of Harry Potter or 19th-century English society in “Pride and Prejudice.”

Given the power of stories, can AI tell a good one? This question matters not only to those in creative industries but to everyone. A good story can change lives, as evidenced by mythical and nationalist narratives that have influenced wars and peace.

Studying whether AI can tell compelling stories also helps researchers like us understand what makes narratives effective. Unlike human writers, AI provides a controlled way to experiment with storytelling techniques.

In our experiments, we explored whether AI could tell compelling stories. We used descriptions from published studies to prompt ChatGPT to generate three narratives, then asked over 2,000 participants to read and rate their engagement with these stories. We labeled half as AI-written and half as human-written.

Our results were mixed. In three experiments, participants found human-written stories to be generally more “transporting” than AI-generated ones, regardless of how the source was labeled. However, they were not more likely to raise questions about AI-generated stories. In multiple cases, they even challenged them less than human-written ones. The one clear finding was that labeling a story as AI-written made it less appealing to participants and led to more skepticism, no matter the actual author.

Why is this the case? Linguistic analysis of the stories showed that AI-generated stories tended to have longer paragraphs and sentences, while human writers showed more stylistic diversity. AI writes coherently, with strong links between sentences and ideas, but human writers vary more, creating a richer experience. This also points to the possibility that prompting AI models to write in more diverse tones and styles may improve their storytelling.

These findings provide an early look at AI’s potential for storytelling. But creativity and real-life experiences are where AI falls short. Creativity means coming up with new ideas, while AI is designed to predict the most likely outcome. And although AI can sound human, it lacks the real-life experiences that often make stories truly compelling.

For now, screenwriters and novelists aren’t at risk of losing their jobs. AI can tell stories, but its stories aren’t quite on par with those of the best human storytellers. Still, as AI continues to evolve, we may see more compelling stories generated by machines, which could pose serious challenges, especially when they’re used to spread misinformation.

[Abridged]
