“Wait a minute, wait a minute. You ain’t heard nothing yet.” So went the first line of audible dialogue in a feature film, 1927’s The Jazz Singer. It was one of the first times that mass media had conveyed the sight and sound of a scene together, and the audience was enthralled.
There have been improvements since: black and white has become colour, frame rates and resolutions have increased, and sound quality has improved. But the media we consume still caters overwhelmingly, if not exclusively, to our eyes and ears.
With the average person’s screen time now nearly seven hours a day, and much of that time spent indoors, our overreliance on sight and sound has only intensified. But given that humans are animals with five (or arguably many more) senses, are we neglecting our other faculties, and what is it doing to us?
Many psychologists categorise our main senses as being either rational or emotional, and there is evidence to back it up. “Smell [and taste are] directly connected to the emotional processing areas of the brain,” says Charles Spence, a professor of experimental psychology at Oxford University, “whereas the rational senses like hearing and vision get processed in the cortex.” In fact, Spence says, more than half of the neocortex – itself more than half the volume of the brain – is given over to processing what we see.
There is no denying that we are highly visual creatures and that is partly why our media are primarily audiovisual. “I think it’s mostly driven by the fact that a lot of the information that we consider important today can be conveyed via visual or auditory means,” says Meike Scheller, an assistant professor in the department of psychology at Durham University. “But what we consider important doesn’t necessarily mean these are the things that we need.”
If you ask people which sense they could not live without, most will say sight, but evidence suggests what we would really miss is our sense of smell. “There’s a much higher rate of suicide and suicidal ideations among people with anosmia, because it’s a sense that’s so strongly linked to our emotions,” says Scheller.
So is neglecting some senses in favour of others affecting our emotional lives? In as much as our emotional health is tied to our social health, the answer is almost certainly yes. “Smell is a really important cue for social communication and this is something that’s not implemented in any technology we’re using today,” says Scheller.
For example, it has been shown that we tend to sniff our palms unconsciously after shaking hands with someone. “That gives you hints about all sorts of things, from their health, to their age, even their personality,” says Spence. “A fair amount of that’s lost if we’re interacting digitally only.”
Touch is similarly important to our emotional lives, and in ways that the finger-focused haptics of our digital devices cannot satisfy. C-tactile afferents, a kind of nerve receptor abundant on the hairy skin of our arms (but not the pads of our fingers), have been shown to create positive emotions when stimulated. “These receptors like slow, warm, tactile stroking,” says Spence.
The cold, sleek touchscreen of a smartphone simply cannot replace the soft, warm, imperceptibly smelly skin of another human. For adults, this may mean less satisfying social lives, but for a generation of children who are increasingly being socialised through technology, the effects could be severe.
Scheller says that children learn to interpret their senses with reference to each other. We might learn to associate some subtle odour with the sound of a person shouting or the sight of them smiling, and use these signals to navigate social situations in future. “Those children growing up with less input basically have less training in being able to categorise how certain things smell, or what a certain touch might mean,” says Scheller. “If all of a sudden we take something away that has evolved over millions of years, that will not only be the removal of one sense, but it will affect how all the other senses work.”
Marianna Obrist, professor of multisensory interfaces at University College London, says: “The way we experience everyday life is through all our senses. Everything is multisensory.”
For instance, it is easy to think of the experience of eating as being primarily about taste, but our food’s shape and colour, smell and sizzle, temperature, texture and weight appeal to our vision, olfaction, audition and touch. “All those senses have already started playing before you’re even eating,” Obrist says. And then there is mouthfeel, the physical sensations of spiciness or sourness, and of course the flavour itself.
Removing just one of those senses can have an impact on the whole experience. For example, when people eat ice-cream in the dark they are less likely to enjoy it, or even be certain what it tastes like. “Whenever we have multisensory stimulation, we get a much better and richer representation of the environment around us,” says Scheller.
So what are we doing to make our technology more multisensory? Obrist previously headed SenseX, an EU-funded project aimed at helping designers conceive new ways of integrating touch, smell and taste into their products. The team’s efforts included spraying odours under a subject’s nose to heighten key moments of Christopher Nolan’s film Interstellar, blasting them with ultrasound waves to simulate touch and using high-intensity acoustics to levitate food on to the tongue without needing wires or tubes.
It is hard to imagine that anytime soon you will watch Robert Duvall’s Lt Col Kilgore deliver Apocalypse Now’s most famous line while your laptop spritzes eau de napalm-in-the-morning up your nose, but smell and taste interfaces may be on the horizon. Researchers are already using AI to try to find primary odours from which any smell can be concocted, and Obrist is the chief scientific officer of OWidgets, a company that produces digitally controlled scent delivery systems with applications in research, healthcare and immersive reality experiences.
There are also companies bringing tactility to virtual reality, such as Dexta Robotics in China, which makes a glove it calls the Dexmo.
“Dexmo can provide tactile feedback and force feedback at the same time,” says Dexta chief executive Aler Gu, “meaning when you scroll your fingers through a virtual brick, you can feel the texture of the surface. When you grab and move the brick from one point to the other, you can feel the physical shape.”
Media that harnesses all the senses would surely enrich our daily interactions with technology, but it is not hard to imagine more insidious uses emerging. In 1957, an American market researcher named James Vicary claimed to have spliced single frames reading “Eat popcorn” and “Drink Coca-Cola” into a film. He reported a 57.5% and 18.1% rise in popcorn and Coca-Cola sales respectively, and the concept of subliminal advertising was born.
Vicary was later exposed as a fraud and the efficacy of subliminal advertising has been a matter of debate ever since, but would technology that could digitally deliver smells and tastes be a gift to unscrupulous advertisers? “Our bodies have a very strong emotional response to [these senses]. They can be extremely powerful,” says Scheller. “It has great potential to influence our decisions because we are very emotional decision-makers.”
Studies have shown that exposure to certain tastes and odours can influence our judgment of other people’s appearance and personality, and even alter our behaviour. Tasting bitter foods, for example, can make us hostile, and a 2005 patent application suggests the smell of pink grapefruit will make a man perceive a woman to be younger than her actual age.
Obrist’s team has found that sour tastes can make us more willing to partake in risky behaviour. “You might be doing some e-banking or online shopping, and you’re drinking your sour lemon drink, and that might indirectly influence your decisions,” she says. It is not hard to imagine how an e-commerce or gambling app might exploit devices that can deliver tastes and smells.
To an extent, this kind of thing is already happening. Companies are known to pump pleasant scents into their shops, and American chain Cinnabon deliberately places ovens near store entrances, sometimes baking trays of just sugar and cinnamon, to entice passing shoppers.
And what if we take it even further? Of the nearly 63 million people who voted for Donald Trump in 2016, the vast majority had only experienced him through two of their senses. What if media outlets used our devices to deliver a subtle aroma of soured milk while airing a speech by one political candidate and freshly baked biscuits for another?
After all, a study from 1940 showed that people were significantly more or less likely to identify with political slogans such as “Down with war and fascism!”, “Workers of the world unite!” and “America for Americans!” depending on whether they were subjected to a putrid smell or given a free lunch.
If the news allowed us and our leaders to taste air pollution in Delhi, feel wildfires in California, or smell the smoke and sewage in Gaza, would the appeal to our more emotional senses move us to act, or to bury our heads deeper in the sand? It is hard to imagine a willing audience for such a sensory assault, but our senses evolved to help us navigate and respond to the world we live in, and from that point of view, only using two of them cannot be ideal. “The more information we have,” says Scheller, “the more able we are to actually act within our environment.”
For the time being, instead of holding out for digital technologies that can stimulate our neglected senses, Scheller suggests we may do well to go outside and see our friends in person, feel the breeze on our skin, smell the roses. After all, as far as our devices go, we ain’t smelled nothing yet.