If you’ve ever felt that a Zoom meeting or video call wasn’t scratching your itch for an in-person chat, science is on your side. Staring at another person’s face via a live computer screen prompts lower levels of certain brain activities and social arousal, compared with seeing them in reality, according to a study published last month in the journal Imaging Neuroscience.
In a world where screens now frequently supplant real-life sit-downs, the study hints that there could be social consequences to leaning heavily on video conferencing platforms for education, telemedicine and professional exchanges, as well as in our personal lives. The new research also implies that improvements to virtual communication technology could make a difference.
“This is a very nice study,” says Antonia Hamilton, a social neuroscientist at University College London, who was not involved in the research. The researchers’ multimodal methods—multiple assessments of brain activity and social engagement that they used to detect differences between virtual and real-life interactions—were “particularly impressive,” she adds.
The study authors measured eye movements, pupil size, electrical activity in the brain (using electroencephalography, or EEG) and brain blood flow (via functional near-infrared spectroscopy, or fNIRS) among 28 participants. Grouped into 14 pairs and fitted with electrodes and hatlike fNIRS devices, the participants spent a total of three minutes alternating between staring silently at each other for a few seconds and taking brief rest breaks. In half of the trials, pairs faced each other in person through a transparent pane of glass; in the other half, they did so through a live video monitor. The researchers controlled for image size and camera angle to ensure that the face shown on the monitor closely matched the person’s real-life appearance. Each participant completed both types of trial.
In nearly every type of data collected, the study authors found significant differences between participants’ brain and eye activity when comparing the virtual and real-life trials. People looked at their partner’s eyes for longer in person than virtually. During the screen-based task, people’s eyes moved from side to side more, possibly indicating higher levels of distraction. Pupil diameter, a proxy for social arousal and emotional engagement, was larger among participants during the real-life staring task than when their eyes were directed at a computer monitor. Some electrical activity associated with facial recognition and sensitivity to moving faces was stronger among participants during the in-person task, according to the EEG data. And during the in-person trials, the fNIRS measurements (which are similar to those collected by functional magnetic resonance imaging, or fMRI) showed higher levels of activity in brain regions related to visual attention, facial processing and visual stimulation.
“We now have a wealth of information” demonstrating that video and real-life interactions are meaningfully different for human brains, says Joy Hirsch, senior author of the new study and a neuroscientist at the Yale University School of Medicine. “The context of live social interactions matters perhaps more than we thought.”
The findings are further evidence of what other recent research has begun to demonstrate: that virtual interactions may be less socially effective than those conducted in person. One study published in April found that people talk to each other less adeptly via Zoom than in real life—they take fewer turns in conversations. (Zoom did not respond to a request for comment.) A different study from 2022 used EEG to find that paired participants’ brain activity is less likely to sync up across a screen than when they are sitting in the same room.
“It’s reassuring to see that there’s an effect” across all these new measurements, says Guillaume Dumas, a computational psychiatry researcher and cognitive neuroscientist at the University of Montreal. Dumas was one of the authors on the 2022 EEG study but wasn’t involved in the new research. The novel results echo much of what Dumas’s previous work showed but also add to a specific understanding of how video calls change face perception—“which is an important aspect of our social life,” he says.
Yet face perception isn’t everything, and Dumas notes that he would’ve liked to see tests of more active interaction rather than just silent, still staring. In the new study, he explains, “we are dealing with something that’s very static, compared to what we usually mean by social interaction.”
Jennifer Wagner, a developmental cognitive neuroscientist at the College of Staten Island, City University of New York (CUNY) and the CUNY Graduate Center, who also wasn’t involved in the new study, agrees. “While the results are compelling and contribute to our understanding of face processing, future work will be needed to determine if these differences between ‘in real life’ and ‘on-screen’ remain in conditions when faces are socially interactive,” Wagner says.
Other limitations include the relatively small sample size of 28 participants, Hirsch notes. Wagner adds that not all of the EEG data were in complete agreement. And it’s difficult to account for every difference between looking at a screen and looking through clear glass: variables such as screen brightness or image resolution may have made it harder for participants to focus on the monitors, Dumas suggests. Yet those things are true in actual video calls as well—which implies that perhaps small, scientifically informed adjustments could boost our experience of connecting online.
The video meeting “is with us forever and ever,” Hirsch says—adding that her research obviously isn’t a reason to avoid such calls altogether (nor necessarily to ban remote work, which has its own benefits). Instead she hopes it will help people better understand the deficiencies of video calls and serve as an impetus to improve virtual communication. “One of the take-homes is that we can identify limitations of this technology and use it accordingly,” she says.
Perhaps cameras integrated directly into screens could enable easier eye contact and more social synchronicity, Hirsch says. Reducing video latency and audio glitches might improve engagement, according to Dumas. Augmented reality headsets or more three-dimensional projections of people (as in Google’s Project Starline) could be additional high-tech ways of addressing the problem, he says.
And perhaps the real answer lies in acknowledging that sometimes there’s no replacement for face-to-face interaction. Life, after all, exists beyond our screens.