Columnists, social media masters, and podcasters are crawling over each other with finger-pointing and back-slapping – inboxes are dinging at a blistering pace. Depending on how you voted, it’s either a good day or a bad day, and everyone has an opinion.

While we’re all deep in the postmortem, it’s worth exploring how and why everyone got it wrong leading up to the election. Why didn’t politicians know how people felt about calling out pronouns, how young men viewed their futures, how heavily Hispanic-dominated border towns have felt the weight of increased immigration, or how concerned many Americans are about the cost of groceries? Can we finally say that polls are flawed and rethink how we seek to understand the voting public?

As a 30+ year market researcher, consumer insights and trends entrepreneur, and consultant, I’ve watched polls get it wrong year after year. There is a time-tested, respected better way. With no inside knowledge of how either campaign conducts research (and it must be extensive, right?), I respectfully suggest the following for consideration.

Polls Don’t Tell The Whole Story

The major flaw with polls is that they don’t have the structure to ask why, particularly in the rapid-fire environment leading up to Election Day. Therein lies the heart of the matter. Many pollsters were projecting percentages with a margin of error that ranged from roughly 2 to 4%. Pollsters are relaying statistical calculations, but what ideas or sentiment does that 4% represent? Recently, I wrote about how polls only give us one data point. I suggested that, at the very least, the surveys should include open-ends, or verbatims. This is an opportunity for respondents to add thoughts in their own words. Analyzing these open-ends can reveal the values, emotions, and issues voters care about most. Follow-up surveys then have insights and information to design more targeted questioning.

Questions about truthfulness are also an issue. According to Tangle, exit polls “have a mixed track record of accuracy,” and there are questions about the truthfulness of the answers. This piece from Stephen Pastis highlighting The Shy Voter and the Bradley Effect reminds us of the very real challenges of polling.

New Strategy: Start With The Big Picture

I’m sure that most political researchers are well-informed and consider how global and national issues are affecting citizens. However, I believe that big-picture cultural analysis can tell a deeper, more nuanced story. It’s not just recognizing the facts of major developments, but understanding how they affect people in the short and long term, and the values and emotions motivating citizens. The complications of the COVID-19 pandemic, the effects of lead in pipes, or the cost of college didn’t just develop in the last election cycle. Tracking the evolution of changes in consumer sentiment, sales numbers, the types of jobs won and lost, new business startups, etc., associated with those developments by demographic and geography will yield necessary insights. Axios has been following up on the issues at the heart of the election—the working-class shift, inflation, elitism, etc. David Brooks’ column in The New York Times highlighted the drift of boys’ education, the opioid epidemic, and obesity.

Campaigns should be tracking these issues at regular intervals, not after the election, to monitor the changes in the electorate. Straightforward analysis or scenario planning can provide insight for continued research and fodder for new strategies.

Narrow The Focus By Talking To Real People

Next, use those learnings to design and conduct deep ethnographic studies to understand what matters to citizens and why. These sessions, while more costly and time-consuming than polling, are far more effective in identifying meaningful issues for a campaign platform. A focus group here and there is helpful, but these sessions must be conducted on an ongoing basis, over time, to accurately track the changing attitudes and concerns of the voting public.

On NPR, Scott Simon interviewed Sunmin Kim, a sociology professor at Dartmouth, about why polls failed to predict the election outcome and whether political parties should continue to use polls. Kim replied, “Well, they should because there are practically no better alternatives. But I think we should take polling with more caution, as scholars have long been advocating. I, for one, after witnessing the outcome of this election, is leaning towards a more in-depth reporting or the ethnographic studies of particular communities because when you do the national representative or statewide polling, we often neglect specific dynamics that occur in the underground in the communities, and we are left wondering with the numbers by subgroups.”

Better research would have given candidates a more holistic understanding of why someone can vote to expand abortion access and also vote for Trump. Cultural analysis and primary research will reveal the complexity of voters’ lives rather than their opinion on one issue.

Now, armed with longitudinal studies and a clear picture of what’s changing, what’s important to people, and why, campaigns can be ready to conduct more accurate polls, supplemented by ethnographies/focus groups that illuminate the values driving voter behavior. Until you understand why voters feel the way they do or want to vote for a candidate, polls will continue to get it wrong.