Scroll through social media, and it’s almost impossible to avoid AI-generated images and videos. At first glance, they can look realistic. But stare a little longer and they often start to feel a little… off.
Maybe it’s the lighting that doesn’t quite make sense, the skin that’s too smooth, or the infamous extra fingers. They’re not always obvious fakes, and as the tech improves, it’s getting harder to tell for sure – just look at Sora 2. But it’s often hard to shake the feeling that something isn’t right.
That uneasy feeling got me thinking. Is this just another version of the uncanny valley, the concept that’s been used for decades to explain why humanoid robots can creep us out? Could it also apply to the flood of AI images and videos filling our feeds?
I’ve written before about the rise of AI slop and the strange ways we react to machine-made content. This time, I asked researchers who study the uncanny valley whether its principles could also apply to the digital realm.
Welcome to the uncanny valley
“The uncanny valley effect describes how we react emotionally when things start to seem increasingly human,” says Dr Steph Lay, a horror writer, psychologist, and expert on the uncanny.
“At first, we respond positively, but that only holds up to a point. If something gets too close to human but still isn’t perfect, we start to react with disquiet or unease, even disgust. Think of dolls, clowns, or statues. There’s something creepy about how they’re nearly human but not quite right,” Lay says.
Researchers think this sensitivity has evolutionary roots. Spotting small irregularities in faces and bodies may once have helped us avoid danger, detect illness, or decide who to trust.
Blame the robots
The concept of the uncanny valley is most often applied to humanoid robots, especially the kind you’ve probably seen unveiled at tech conferences. But not all robots are unsettling; our expectations play a big role.
“I did some research into how people would feel about having a robot living as a companion in their homes, and most people baulked at the idea of sharing their home with a near-human helper,” Lay tells me. “They would feel much more comfortable with something that looked definitely artificial.”
Dr Christoph Bartneck, a professor of human-robot interaction in the Department of Computer Science and Software Engineering at the University of Canterbury, says it comes down to how closely we scrutinise human likeness. “The more human-like a robot or AI-generated character becomes, the higher our expectations. We don’t expect a floor-cleaning robot to exhibit biological movement. It is okay if it moves like a machine.
“But once the robot becomes human-like, we apply human standards,” Bartneck explains. “We’re sensitive to small changes in facial expressions, gestures, and posture. Even the slightest irregularity in someone’s gait can throw us off.”
It’s these “small changes”, the barely-there glitches in a movement or smile or walk, that tip us into the uncanny valley.
When the uncanny hits your feed
So what happens when the almost-human figure in front of us isn’t a robot on a stage but an image or a video online?
“With AI-generated content, there’s often something that the algorithm gets wrong,” Lay says. “Even when the overall image looks polished and perfect, that flaw tips us into that disquieting valley. We might not be able to explain why straight away, but something definitely feels off.”
Interestingly, Lay doesn’t think our instinct to spot those flaws will fade, even as AI improves. “I think this sensitivity will always be there. We’re very closely attuned to what’s real and what’s not, particularly when it comes to faces.”
We may adapt, but not in the way AI companies might hope. “With the current advances in image and video generation technology, we’re in an unprecedented period for exposure to things that aren’t real. Our perceptual systems are primed to learn and adapt, so I think we’ll just get more discerning over time.”
The real stakes of fake faces
The uncanny valley feeling can also shape how we respond to the content around us.
I was interested to know whether people care if the things they see are real or not. “In my experience, people absolutely care,” Lay says. “It all comes down to the reason the image was generated.” She points to her own work on Into The Fog, a YouTube channel telling paranormal stories. Because the creators are transparent about which images and videos are AI-made and which are archival, audiences accept them as part of the storytelling and atmosphere-building.
The problem is when that same content shows up in our feeds with no label or context. “If we’re talking about AI content we encounter on social media, that doesn’t come to us in a neutral way; it’s pushed by an algorithm that has been meticulously tuned to show us things that we’ll react to,” Lay says.
And once it lands, the effect compounds. “The echo chamber effect is real, but it’s subtle and complex. Very few people deliberately seek out AI content. It comes as a push into our worlds, in spaces we think of as social, so there’s already an expectation we’ll engage with it.”
Spotting the unreal in real time
There may be no way out of the uncanny valley. Our brains are simply too finely tuned to irregularities. So we’ll probably always feel a bit… off about the almost-real AI images and videos filling our feeds.
That’s not a bad thing. It might keep us sharper about what’s real and what’s not. And if you do struggle to tell, Lay says the old advice still holds. “If something looks too perfect, it probably isn’t real.”
She also suggests stepping away for some perspective if you notice that uncanny, unsettled feeling setting in. “If something you see disturbs you, then get away from the screen for a while,” Lay recommends. After all, the more time you spend staring at the almost-real, the harder it gets to see the truth.