When it comes to determining who won a Presidential debate, methods include questionnaires, phone polling, and dial testing, where people turn a knob to register their feelings on certain questions.
But what if logic were removed from the process, and what someone felt could be measured directly?
Biometric technology that does just that was demonstrated recently in a University of South Florida classroom.
Thirty-five participants – a mix of faculty and students, some of whom came out for extra credit – sat down Feb. 19 to watch six Democrats do verbal battle in Las Vegas.
But before the debate started, the viewers filled out a survey that asked about their political leanings, what issues were most important to them, and their top three candidates in the debate. That was the measure of what they said.
How they felt was going to be measured by a pair of tiny sensor pads taped to two fingers. Wires ran from those pads to a biometric device – a small box the size of a cigarette lighter – attached to their wrists.
All 35 devices were wirelessly hooked to a computer at the front of the room, which took in data in real time.
The sensors were measuring “galvanic skin response” – or, simply, how well the skin conducted electricity between the pads.
“When you're in an interview or something, your hands kind of get clammy, the moisture level comes up. Well, that reduces the resistance or increases the connectivity of the skin,” said Rob Hammond, director of the USF Muma College of Business Center for Marketing and Sales Innovation.
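The relationship Hammond describes – more moisture, less resistance, more conductivity – can be sketched in a few lines. This is an illustrative calculation only; the resistance values and units below are invented for the example and are not taken from the Shimmer devices used in the demo.

```python
# Galvanic skin response is commonly reported as skin conductance
# (in microsiemens), which is the inverse of skin resistance.
# Sweating lowers resistance, so conductance -- the "arousal"
# signal -- rises. All numbers here are made up for illustration.

def conductance_uS(resistance_ohms: float) -> float:
    """Convert skin resistance (ohms) to conductance (microsiemens)."""
    return 1_000_000 / resistance_ohms

# A calm, dry hand shows higher resistance than a clammy one,
# and therefore lower conductance:
calm = conductance_uS(500_000)    # -> 2.0 uS
clammy = conductance_uS(100_000)  # -> 10.0 uS
print(calm, clammy)
```

The device reports the rising conductance; interpreting that rise as emotional intensity is the analysis layer on top.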
The session was not being done for polling or even experimental purposes – as last-minute volunteers, the group was not randomized the way a normal experiment would require. Instead, it was a demonstration of the technology's ability to measure people's unconscious responses.
“What we're doing is we're taking the brain out of the opportunity to be able to say how you feel about something. We've all been to dinner where someone didn't tell you entirely what they thought about the meal, right?” said Hammond. “Well think about that filter going away. And that's what we're doing with this is being able to cut through that and get to actually what was the raw emotional response in terms of how people reacted to that message.”
“We're able to measure people in real-life situations,” said Geoff Gill of Shimmer America. “So for the Super Bowl, we had a party and people were eating, drinking, just acting as they normally would. And what's more, we can monitor them throughout the entire period – there’s no other technology that can do both those things.”
And people emotionally responded to two of the Super Bowl ads that also tested highest in polling after the game – Google’s “Loretta” commercial and Doritos’ spot pitting Lil Nas X and Sam Elliott against each other.
During the debate watch party, participants were welcome to chat, text and otherwise act like they normally would.
“I would say maybe I was conscious of the device at first, but since it was a two-hour event, and it was much more of a calm atmosphere with so many people involved that by the end of it, it was just as normal as if you were sitting at home watching the debate,” said marketing senior Sarah Gimbel, a volunteer who also works at the Center.
Gimbel and the other participants responded as they normally would to the action on the screen: when a candidate said something funny, they may have laughed; when one challenged a rival, they may have “oohed” or groaned; and sometimes, they just didn’t react at all.
But Hammond said the devices were measuring the unspoken reactions: how their bodies were reacting, and the intensity of the responses.
The computer at the front of the classroom collected those responses, regardless of whether they were positive or negative. The intensity was shown in color-coded form: blue for no response, yellow for medium, red for the most intense reaction. It was also broken down by individual device and collective response.
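The blue/yellow/red readout described above amounts to bucketing each reading's intensity against two cutoffs. A minimal sketch follows; the thresholds are hypothetical, since the article does not report the actual cutoffs the software uses.

```python
# Hypothetical sketch of the color-coded intensity display:
# bucket each normalized reading into blue / yellow / red.
# The 0.3 and 0.7 thresholds are invented for illustration.

def code_intensity(intensity: float,
                   medium: float = 0.3,
                   high: float = 0.7) -> str:
    """Map a normalized intensity (0..1) to a display color."""
    if intensity < medium:
        return "blue"    # little or no response
    if intensity < high:
        return "yellow"  # medium response
    return "red"         # most intense reaction

readings = [0.05, 0.4, 0.9]
print([code_intensity(r) for r in readings])  # ['blue', 'yellow', 'red']
```

The same bucketing can be applied per device or averaged across all 35 to give the collective view the demo showed.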
Take for instance one of the most engaging moments, according to the data: when Massachusetts Senator Elizabeth Warren challenged former New York City Mayor Michael Bloomberg to release women who used to work under him from non-disclosure agreements.
The day after the debate, Hammond looked at a pie chart of that moment that was about half red and yellow.
“You can see here … 45 percent (of the audience’s engagement) was either medium or high. That's a really significant number.”
After the debate, participants filled out a second form. This one asked them who they thought performed the best.
By taking those conscious replies and comparing them to the unconscious responses, Hammond said, they can see which candidate might see long-lasting effects – positive or negative – from their performance, and which might just get a temporary bump that disappears by the next news cycle.
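The comparison Hammond describes can be sketched as a simple two-by-two: survey ranking (conscious) crossed with measured engagement (unconscious). The candidates, cutoffs, and labels below are invented for illustration; only the idea of pairing the two signals comes from the demo.

```python
# Toy sketch: combine a conscious survey ranking with unconscious
# engagement to guess whether a debate performance will stick.
# Thresholds and verdict labels are hypothetical.

def forecast(survey_rank: int, engagement: float) -> str:
    """survey_rank: 1 = judged best; engagement: normalized 0..1."""
    performed_well = survey_rank <= 2   # near the top of the survey
    connected = engagement >= 0.5       # strong emotional response
    if performed_well and connected:
        return "likely lasting gain"
    if performed_well and not connected:
        return "temporary bump"
    if not performed_well and connected:
        return "likely lasting damage"
    return "little effect"

print(forecast(1, 0.2))  # temporary bump (the Warren pattern)
print(forecast(5, 0.8))  # likely lasting damage (the Bloomberg pattern)
```

A strong performance without an emotional connection fades; a weak one that resonates lingers – which is the pattern Hammond points to next.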
Hammond again used Bloomberg and Warren as an example.
“He ended up actually kind of towards the bottom in terms of the rankings on how people thought that he had performed, but he also had a high unconscious response. Senator Warren had a strong performance but a weak engagement, Mayor Bloomberg had the opposite,” he said.
In other words, while people thought Warren performed well, she didn’t make an emotional connection, so Hammond predicted it wouldn’t mean improved polling numbers for her. This ended up being true, as Warren came in fourth in the Nevada caucuses with just under 10 percent of the vote.
Bloomberg not only had a bad night; his performance also resonated emotionally with the viewers, meaning he likely hurt himself in the polls. That also ended up being the case – while he wasn’t among the candidates in the caucuses, his numbers in one nationwide survey dropped 3 percent.
Gimbel said the experience may have affected who ultimately gets her vote.
“I would say it definitely changed my outlook and perspective on some of the candidates depending on certain comments or reactions in the debate and being able to see our reactions as well,” she said.
And Hammond added that going forward, these devices might help predict not just who people are voting for, but why.
“If you go back to the 2016 presidential election, the polling was not very good. You know, we kept being surprised by how the results were coming in,” he said. “I think this may give us some additional insights as to why people are voting the way that they are.”