Virtual reality is here: People can strap on headsets and sensors and interact with CGI avatars of themselves and others in video games, military training simulations, engineering design spaces and even virtual medical labs.
But these avatars have a problem. They tend to be crude, expressionless, graphical sock puppets, lacking that human ability to express subtle facial emotion.
Guillermo Bernal, a Ph.D. student at the Massachusetts Institute of Technology, might have just figured out a solution. In 2017, Bernal launched Emotional Beasts, an effort to adapt off-the-shelf virtual reality hardware and a game engine to create emotive VR avatars.
For his work, MIT awarded Bernal the 2019 Harold and Arlene Schnitzer Prize in the Visual Arts.
“If you go to any state-of-the-art virtual reality platform, you’ll see avatars with faces that are static masks,” Bernal, 32, said in a statement announcing the prize. “I’d like to give them facial expressions, to show whether they are happy or surprised or even angry.”
But that’s easier said than done. For an avatar to express simmering rage or embarrassed arousal or cautious curiosity, the computer running the avatar must be able to sense the same emotions in the live human subject.
Then there are the potential secondary effects. If you teach a computer to read emotions for the purposes of, say, setting up a virtual classroom for refugees or facilitating remote mental-health diagnoses, what’s to stop some retailer from using the same tech to figure out which images and ads excite you?
What, in other words, will stop someone from using emotive VR to sell us stuff? For one, law. As Bernal refines his technology, he’s also grappling with the ethical and legal implications of its possible success.
Bernal first needs to equip a VR headset with sensors that could accurately read a wide range of human emotions. He started with a Vive VR headset, which retails for around $500, and modified the overall equipment rig with additional sensors.
The Vive already has a camera for tracking the wearer’s eye movements and a microphone for registering voice commands. To that, Bernal added what he described as “bio-signal sensors.”
Dry electrodes measure skin conductance on the forehead, an indicator of how much the wearer is sweating, which in turn indicates how agitated they are. Bernal also added a heart-rate sensor on the user’s temple and tweaked the microphone to register different tones of voice.
Combined, the sensors allow the Emotional Beasts setup to “get insights into the respondents’ physical state, anxiety and stress levels (arousal), and ... determine how changes in their physiological state relate to their actions and decisions,” Bernal wrote in a research paper.
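Bernal’s paper doesn’t spell out how the signals are fused into a single arousal estimate. A minimal Python sketch of one simple approach — normalizing each signal against the wearer’s resting baseline and averaging, with made-up units and scaling constants — might look like this:

```python
from dataclasses import dataclass

@dataclass
class BioSignals:
    """One sample from the headset's add-on sensors (units are illustrative)."""
    skin_conductance: float  # microsiemens, from the forehead electrodes
    heart_rate_bpm: float    # beats per minute, from the temple sensor
    voice_pitch_hz: float    # fundamental pitch, from the retuned microphone

def arousal_score(sample: BioSignals, baseline: BioSignals) -> float:
    """Rough arousal estimate in [0, 1]: how far each signal has risen
    above the wearer's resting baseline, averaged across signals."""
    def rel(value: float, rest: float, span: float) -> float:
        # Clamp each normalized deviation to the [0, 1] range.
        return min(max((value - rest) / span, 0.0), 1.0)

    features = [
        rel(sample.skin_conductance, baseline.skin_conductance, 5.0),
        rel(sample.heart_rate_bpm, baseline.heart_rate_bpm, 40.0),
        rel(sample.voice_pitch_hz, baseline.voice_pitch_hz, 80.0),
    ]
    return sum(features) / len(features)

rest = BioSignals(skin_conductance=2.0, heart_rate_bpm=65.0, voice_pitch_hz=120.0)
agitated = BioSignals(skin_conductance=5.5, heart_rate_bpm=95.0, voice_pitch_hz=170.0)
print(round(arousal_score(agitated, rest), 2))  # → 0.69
```

The per-wearer baseline matters: the same absolute heart rate can mean calm for one person and agitation for another, which is one reason Bernal needs data from diverse testers.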
The headset plugs into a cluster of computer microcontrollers that run custom algorithms written in Python and feed it all into a VR space that Bernal created using Epic Games’ Unreal graphics engine.
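The plumbing between the microcontroller cluster and the engine isn’t described in detail. One plausible arrangement — sketched here with hypothetical UDP ports and a hypothetical JSON packet format, not Bernal’s actual protocol — is a small relay that decodes each sensor packet and forwards it to the engine’s avatar-animation layer:

```python
import json
import socket

# Hypothetical addresses: the microcontroller cluster publishes readings
# on one UDP port; the game-engine side listens on another.
SENSOR_ADDR = ("127.0.0.1", 9001)
ENGINE_ADDR = ("127.0.0.1", 9002)

def forward_one_packet(sensor_sock: socket.socket,
                       out_sock: socket.socket) -> dict:
    """Read one raw sensor packet, decode the JSON payload, and relay
    it to the VR engine for the avatar-animation layer to consume."""
    raw, _ = sensor_sock.recvfrom(1024)
    reading = json.loads(raw.decode("utf-8"))
    out_sock.sendto(json.dumps(reading).encode("utf-8"), ENGINE_ADDR)
    return reading
```

In practice the relay would run in a loop at the sensors’ sample rate; UDP fits here because a dropped biosignal sample is cheaper than added latency in the avatar’s face.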
By 2018 he had a working prototype. But modifying the hardware and writing the software is just part of the solution. Bernal also needs data. Specifically, a library of how diverse groups of people all over the world express different emotions.
The way a young, upper-middle-class white guy in America shows anger on his face isn’t necessarily the way a middle-aged, working-class Chinese woman might do so.
“Emotions are more complex and socially determined than the simple positive-negative, strong-weak arousal model suggests,” a team led by New York University’s Meredith Whittaker warned in a 2018 research paper. “Even distinguishing fear, anxiety and disgust on physiological grounds turns out to be extremely problematic.”
To accurately project a user’s emotions onto a VR avatar, the system must speak the user’s emotional language. “Data is king in everything we’re doing,” Bernal told The Daily Beast. Right now, he added, “the data is not there.”
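The “positive-negative, strong-weak” model the Whittaker paper criticizes is the classic valence-arousal grid. A toy illustration of how such a grid could drive a coarse avatar expression — and of just how coarse it is, which is exactly the team’s point — might look like this (the labels and thresholds are invented for illustration):

```python
def quadrant_emotion(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point, with valence in [-1, 1] and
    arousal in [0, 1], to one of four coarse avatar expression labels.
    Real emotion, as Whittaker's team notes, is far messier than a grid:
    fear, anxiety and disgust would all collapse into one corner here."""
    if arousal >= 0.5:
        return "excited" if valence >= 0 else "angry"
    return "content" if valence >= 0 else "sad"

print(quadrant_emotion(0.8, 0.9))   # high-arousal, positive → excited
print(quadrant_emotion(-0.7, 0.8))  # high-arousal, negative → angry
```

A data-driven system like the one Bernal envisions would replace this fixed grid with mappings learned from his planned global library of expressions.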
So the next step, this summer, is to create a tough, easy-to-use version of the Emotional Beasts system that Bernal can share with testers all over the world. They would play with the new VR setup and feed the resulting data into an online repository so that Bernal can begin building his global library of emotional expressions.
There could be ethical and legal obstacles. Some jurisdictions already give consumers veto power over commercial use of their physiological data, and that could limit the scope of Bernal’s data set, to say nothing of complicating any wider roll-out of empathetic VR.
“The California Consumer Privacy Act for instance covers biometric information and does not allow an exception based on the idea one’s face or other biometric information is publicly available,” Mark MacCarthy, a Georgetown University professor and privacy expert, told The Daily Beast. “The Illinois Biometric Information Privacy Act prohibits companies from gathering, using, or sharing biometric information without informed opt-in consent. So, figuring out your emotions from the way you look or walk or your heartbeat needs your permission.”
Bernal said he’s aware of the hurdles he’ll have to clear. “We need to have those dialogues,” he told The Daily Beast.
If Bernal can untie the legal and ethical knots, a system such as Emotional Beasts could prove beneficial, MacCarthy said. Imagine a diagnostic machine that could glance at a patient’s face and detect signs of mental illness. “If trained A.I. systems could recognize the onset of depression early on, that might be great.”
Of course, Bernal’s system or something similar is far from robust enough for widespread use as anything but a game. “It needs lots of testing before deployment,” MacCarthy said.
If it all works out, it’s possible to imagine a point in the future where our VR selves are nearly as expressive as our flesh-and-blood bodies are, Bernal wrote. That possibility, he argued, justifies his work on Emotional Beasts.
“As this medium moves forward, this and other tools are what will help the field of virtual reality expand from a medium of surface-level experience to one of deep, emotionally compelling human-to-human connection,” he said.