
All About Exit Polls (with a Plea for Caution and Reporting)
By Lambert Strether of Corrente.
2020 was not a good year for pollsters. Guardian: “Polling industry the night’s big loser as 2016 debacle repeats itself”; Jacobin: “After the 2020 Election, Polling Is Dead”; Al Jazeera: “The landslide that wasn’t: What the elections say about America”[1]; and the Bangor Daily News: “Susan Collins defied the polls. Here’s what they may have gotten wrong.” In fact, at least on election night itself, a punter who followed the betting sites would have, correctly, gone to bed and slept easy, unlike those who stayed up to watch the networks.
This post, however, is not about horse-race polling, but about exit polling: Polling that does not try to pick a winner before election day — can’t we just wait? — but that seeks to explain what happened[2]. I will first give a short list of “hot takes,” which are even now no doubt congealing into conventional wisdom. Then, so we know where the hot takes come from, I’ll look at the sources of exit polling data and the methods[3] used. I’ll conclude with some thoughts on how to improve the role pollsters play in our elections, and other ways we might come to understand election results. To the hot takes!
Hot Takes
Here is an, er, random sampling of hot takes from the Twitter. There are many more like them! Running through the various identities:
On Latinos:
The 2016 exits didn't track this, but increased class polarization among Latino voters would be one of the more promising demographic signals of the 2020 election pic.twitter.com/lukKwJrCoQ
— Matt Karp 🌹🦏🇺🇸 (@karpmj) November 8, 2020
(Note the lack of a source).
On LGBTQ+:
61 percent of voters polled who identify as LGBTQ+ say they voted for Biden, according to an exit poll done by Edison Research for the National Election Pool. This is a lower percentage of LGBTQ+ voters than supported Barack Obama or Hillary Clinton. https://t.co/SABZVU3fU0
— McG 🏳️🌈 ✊🏽 🇺🇲 🐝 #TeamPete is in quarantine (@Foxmental_X) November 6, 2020
On Whites:
The AP vote cast exit poll says white voters were 74% of the electorate. Huge difference from Edison’s 65%. Any idea why? (Per my earlier Tweets, I don’t trust Edison since they say white college educated women went for Trump net +1, a clearly erroneous finding).
— Bruce Mehlman (@bpmehlman) November 6, 2020
On demographic shifts generally:
This year's Edison Exit poll on demographic shift.
There are no surprises w the white vote (except Trump lost ground w white men?) but there are a lot of alarming trends here. pic.twitter.com/ScyAF1y1L4
— M D (@MelADavis) November 4, 2020
Summarizing the surprising demographic results in prose, USA Today, “Election 2020 exit polls: Political pundits utterly failed to predict Donald Trump’s voters”:
Coverage of the Donald Trump vs. Joe Biden campaign largely focused on four areas: women, racial minorities, senior citizens and suburban voters. Conventional wisdom held that Trump would be slaughtered by all of them, thereby handing Biden a landslide.
Turns out, that was almost entirely incorrect. Trump improved over his 2016 performance among Black women (+4), Black men (+5), Latino women (+3), and Latino men (+4). It is almost certainly true that minority voters turned out in far larger numbers in 2020 than in 2016, racking up critical votes for Biden in battleground states like Pennsylvania and Michigan, but Democratic strategists must be wondering what it means that Trump earned a larger share of the non-white vote than any GOP presidential candidate since 1960.
(This happens to be written by a Republican, but I’m sure you’ve seen the same points made elsewhere. All the links go to CNN, by the way, and not directly to CNN’s source, which, amazingly or not, goes unlisted on CNN’s page.)
So those are the hot takes. Now I’ll go on to show why anything numerical in these results should be approached with caution. First, we need to understand the sources.
Exit Polling Sources
As it turns out, in 2020 we have not one but two[4] exit poll sources (though one doesn’t call itself an exit poll, for practical purposes they are the same). From the Washington Post, “How is TV news going to cover the weirdest, most fraught election in history? All of your questions answered”:
As they have for nearly two decades, the big three broadcast networks and CNN will join as the National Election Pool [NEP] to share data collected by a firm called Edison Research, which conducts exit polls — via both in-person and phone surveys of people who have already voted — to anticipate the trends within this year’s electorate. While “a handful of people at each network” will be permitted to review the exit-poll data during the day, they will not be allowed to report it until 5 p.m. Tuesday, said Edison executive Joe Lenski. Edison also will collect the actual vote tallies from across the country as they are released by local jurisdictions.
But Fox News and the AP left the pool after 2016 and have struck out together, hiring a research operation affiliated with the University of Chicago to help them prepare their projections. Arnon Mishkin, head of Fox’s decision desk, said his organization was disappointed with 2016’s exit polling, which skewed the results by capturing a disproportionate number of younger and college-educated voters — many of whom lean Democratic — and didn’t fully probe the voting sentiment of mail-in and early voters. What this means is that for the first time since 1988, you’ll see not one but two different polls of the electorate as you flip the channels.
So, at a minimum, when you see a hot take with a chart that doesn’t cite either source, you should approach it with caution (or even reject it entirely, as you would a Covid chart that gave no sources).
So, we have a duopoly with Edison and VoteCast. Here is Pew Research on Edison’s logistics:
The exit poll is a major operation. Edison expects to survey about 16,000 early and absentee voters by phone, [Joe Lenski, Edison’s co-founder and executive vice president,] said, and another 85,000 or so voters in person. “Between exit-poll interviewers, vote-count reporters, supervisors driving around checking on sites, and the two very large phone rooms we’ll be operating on Election Day to take in those results, we have close to 3,000 people working for us on Election Day,” he said.
The exit poll is more a set of interlocking surveys than a single, uniform poll. Aside from the phone and in-person components, Edison will field state-specific questionnaires at 350 of its 1,000 or so polling locations, in addition to the national questionnaire all respondents receive. The idea, Lenski said, is to be able to ask about issues that might be particularly relevant in key states.
And now Associated Press on VoteCast:
AP VoteCast combines interviews with a random sample of registered voters drawn from state voter files with self-identified registered voters selected using nonprobability approaches. In general elections, it also includes interviews with self-identified registered voters conducted using NORC’s probability-based AmeriSpeak® panel, which is designed to be representative of the U.S. population.
Interviews are conducted in English and Spanish. Respondents may receive a small monetary incentive for completing the survey. Participants selected as part of the random sample can be contacted by phone and mail and can take the survey by phone or online. Participants selected as part of the nonprobability sample complete the survey online.
In the 2020 general election, the survey is expected to complete about 140,000 interviews with registered voters between Oct. 26 and Nov. 3, concluding as polls close on Election Day.
So, although VoteCast is treated as if it were an exit poll, it is not conducted at the exits of polling stations, unlike (mostly) Edison. And now to the methodological issues for each.
Exit Polling Methods
The response rate problem is common to all pollsters, and has the potential to vitiate the entire industry. From The Age, “Why the polls were wrong – and will never be right again”:
In the age of the mobile phone, very few people answer calls from unlisted numbers, and even fewer want to talk to a pollster – who, for all they know, may be a fraudster in disguise. The Pew Research Centre reports that its response rates have plummeted from 36 per cent two decades ago to just 6 per cent now. And Pew is a not-for-profit outfit that doggedly attempts to contact every sampled phone number at least seven times. Commercial polling firms don’t have that luxury.
No major commercial polling company is brave enough to reveal its response rate. Rumours are that they’re down to about 3 per cent.[5] That’s a very thin foundation on which to predict a presidential election. The tiniest inconsistency between the characteristics of that 3 per cent and those of the electorate as a whole could invalidate the entire industry.
The pollsters do their heroic best to model the likely behaviour of the masses from the self-reports of a few phone-answerers, but all such models are approximations. They inevitably introduce error. Model error may be even bigger than the sampling error that goes into calculating the “error margins” that are often reported alongside polling data. Or it may not be. No one knows but the pollsters, and they’re not saying.
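To make the quote’s point about model error concrete, here is a minimal back-of-the-envelope simulation. It is mine, not any pollster’s actual procedure, and every number in it (the 50/50 electorate, the response rates, the number of dials) is an assumption chosen purely for illustration:

# A minimal sketch (not any pollster's actual method): how a small gap in
# willingness to respond, at roughly 3% response rates, can swamp the
# nominal "margin of error." All numbers below are illustrative assumptions.
import random
import math

random.seed(42)

CONTACTS = 100_000      # phone numbers dialed
RESP_RATE_A = 0.033     # assumed: candidate A supporters answer slightly more often
RESP_RATE_B = 0.027     # assumed: candidate B supporters answer slightly less often

respondents_a = 0
respondents_b = 0
for _ in range(CONTACTS):
    supports_a = random.random() < 0.5            # the true electorate is split 50/50
    rate = RESP_RATE_A if supports_a else RESP_RATE_B
    if random.random() < rate:                    # does this voter take the survey?
        if supports_a:
            respondents_a += 1
        else:
            respondents_b += 1

n = respondents_a + respondents_b
share_a = respondents_a / n
moe = 1.96 * math.sqrt(share_a * (1 - share_a) / n)   # nominal sampling error only

print(f"respondents: {n}")
print(f"estimated share for A: {share_a:.1%} (true share: 50.0%)")
print(f"nominal 95% margin of error: ±{moe:.1%}")

With these invented response rates the respondents come out roughly 55/45 even though the electorate is exactly 50/50, a bias several times larger than the reported “margin of error,” and one that more dialing cannot fix; only a correct model of who answers the phone can, which is precisely the model error the quote describes.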
A further complication, unique to election 2020: an unprecedented number of votes were cast by mail (“67% of Republicans said they planned to vote in person on Election Day, according to a Marquette University Law School poll, compared with just 27% of Democrats”). Edison and VoteCast handled this change in voting patterns differently.
First, from Edison, “NEP & Edison Research to Once Again Conduct Exit Poll of Record”:
The NEP’s exit poll is the only survey that will be released on election night that represents the views and opinions of actual voters interviewed as they cast their ballots all across the country.
As it has since 2004, the NEP exit poll will also include extensive telephone surveys of those planning to vote by mail to ensure that all voters are represented in Election Night coverage across the pool’s member networks and subscribers. This year, those polls will reach more than 25,000 voters casting ballots before Election Day.
For the first time in 2018, NEP’s exit poll included in-person interviews with those voting at early voting locations. The technique proved highly accurate in Nevada and Tennessee, the two states in which it was used that year, and was successfully expanded in this year’s presidential primaries in North Carolina and Texas. For the presidential election this fall, early voters will be interviewed in person in eight critical states.
“In 2018, Edison and the NEP pioneered the technique of conducting interviews at in-person early voting sites, and today, we’re using that valuable experience to expand those efforts for 2020,” said Lenski. “It’s simply a matter of taking our time-tested models and applying them to the ways people vote today.”
So Edison is an exit poll, except when it isn’t. Now, VoteCast, “AP VoteCast isn’t an exit poll. It’s better”:
In 2016, more than 40% of the electorate voted early, absentee or by mail. Unlike the legacy exit poll, AP VoteCast meets registered voters where they are, reaching them via mail, by phone (landline and cell phone) and online, using a random sample of registered voters to carefully calibrate a massive poll conducted using opt-in online panels.
AP VoteCast extends beyond the traditional battleground states. It captures the opinions of registered voters who cast a ballot early, on Election Day or not at all. By gathering data from a sample size more than six times the size of the legacy exit poll, our survey provides greater insight into various subsets of the population, including Mormons, Muslims, Jews, Latinos, veterans and other groups of voters.
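“Carefully calibrate” is doing a lot of work in that sentence. AP doesn’t publish its code, but one standard way to calibrate an opt-in panel against a probability sample is raking (iterative proportional fitting): adjust each panelist’s weight until the panel’s demographic margins match those of the reference sample. Here is a toy sketch of the mechanics, with made-up categories, counts, and target margins; it illustrates the general technique, not AP/NORC’s actual procedure:

# Toy raking example: reweight a hypothetical opt-in panel so that its
# age and education margins match targets taken from a reference sample.
# Categories, counts, and targets are all made-up assumptions.
from collections import defaultdict

# Hypothetical opt-in panel: (age_group, education) for each respondent
panel = [("18-44", "college")] * 500 + [("18-44", "no_college")] * 200 \
      + [("45+", "college")] * 200 + [("45+", "no_college")] * 100

# Target margins, as if estimated from a probability sample of registered voters
targets = {
    "age": {"18-44": 0.45, "45+": 0.55},
    "edu": {"college": 0.35, "no_college": 0.65},
}

weights = [1.0] * len(panel)

for _ in range(50):                      # iterate until the weighted margins converge
    for dim, idx in (("age", 0), ("edu", 1)):
        totals = defaultdict(float)
        for row, w in zip(panel, weights):
            totals[row[idx]] += w
        grand = sum(totals.values())
        # scale each category so its weighted share matches the target share
        factors = {cat: targets[dim][cat] * grand / totals[cat] for cat in totals}
        weights = [w * factors[row[idx]] for row, w in zip(panel, weights)]

# Check: the weighted age margin now matches the target
age_share = sum(w for row, w in zip(panel, weights) if row[0] == "18-44") / sum(weights)
print(f"weighted share aged 18-44: {age_share:.1%} (target 45.0%)")

The catch, and it applies to any weighted survey, Edison’s included, is that weighting can only correct for characteristics you actually measure; if panelists differ from the electorate on something unmeasured, no amount of raking will fix it.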
Now, at this point, nobody can say which is more trustworthy: Edison or VoteCast; it’s far too soon for any academic research to have taken place (“Exit Poll of Record” [POW!] vs “legacy exit poll” [OOF!]). All we can say is that they gave different results. From the Poynter Institute, “The AP and Fox News say Biden has carried Arizona. Why do other networks say it’s too close to call?”:
Fox News and The Associated Press deemed Arizona a win for Joe Biden on election night, making their calls three hours apart. President Donald Trump and his campaign howled in protest against Fox.
Now, a day and a half later, CNN and other broadcast networks insist that while Arizona may be leaning Biden, the race is still too close or too early to determine the winner.
Why that disparity?
Formulas for vote counts and projections are wildly complex mathematically and expensive to create, but there is a simple explanation.
AP and Fox pulled out of a consortium of networks after the pooled effort had produced shaky results in 2016. The rest of the networks stayed in, thinking the system could be tweaked while the AP had concluded it was broken.
The issue was whether the accelerating move to early voting and mail-in voting, advancing cycle after cycle, made traditional election day exit polls invalid. AP said yes and embarked on inventing a new methodology. The Fox News decision desk, an AP client, agreed and collaborated.
Ditching exit polls, the new formula relies on votes counted so far plus an informed estimate of how many votes remain to be counted and where. The likely split can be inferred by party affiliations, the mix in a given county of those who already voted and other factors.
Sally Buzbee, executive editor of AP, explained her thinking in an email interview with me last week:
“We made the difficult decision to pull out of the network exit poll consortium. Working with NORC at the University of Chicago, we developed a new methodology and tool called AP VoteCast, which also captures early voters and which has proven highly accurate and robust.
“We did not develop AP VoteCast for the pandemic: We developed it because we saw the long-term trends. But it has proved a huge blessing given the pandemic.”
In a pre-election webinar in which Buzbee participated, Sam Feist, Washington bureau chief of CNN, explained why his network went another direction, sticking with the consortium and its vendor, Edison Research. Simplifying just a bit, Feist said that he and others who stayed believed that a supplemental version of exit polls could be constructed for the early voting and mail-in segments.
Hoo boy. “Informed estimate”? That sounds like a Nate Silver-esque secret sauce (or, less politely, a fudge factor). Nevertheless, the VoteCast client made the early call, which stood up. But that’s the horse-race. Will Edison or VoteCast be more trustworthy with exit polling’s other function, explanation? We just don’t know yet.
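For concreteness, here is a toy version of the kind of arithmetic Poynter describes. The counties and every number attached to them are fictional assumptions of mine; this is emphatically not AP’s or Fox’s actual model, which they do not publish:

# Toy projection: counted votes so far, plus an estimate of outstanding
# ballots in each (fictional) county, split according to an assumed lean
# for those ballots. Every number here is a made-up assumption.
counties = [
    # name,    counted_dem, counted_rep, est_outstanding, assumed_dem_share_of_outstanding
    ("County A",   850_000,     820_000,         300_000, 0.55),
    ("County B",   280_000,     180_000,          60_000, 0.58),
    ("County C",   250_000,     400_000,          90_000, 0.40),
]

proj_dem = proj_rep = 0.0
for name, dem, rep, outstanding, dem_share in counties:
    proj_dem += dem + outstanding * dem_share
    proj_rep += rep + outstanding * (1 - dem_share)

margin = proj_dem - proj_rep
print(f"projected margin: {'D' if margin > 0 else 'R'} +{abs(margin):,.0f}")

With these invented numbers the projected margin comes out razor-thin, and nudging the assumed split of outstanding ballots by a couple of points flips it; that assumed split is exactly where the “informed” part of the estimate earns, or loses, its keep.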
Conclusion
As readers know, I didn’t pay much attention to the polls during the general (sorry, guys, I know you needed the clicks). My subjective sense is that doomscrolling through poll results takes time that would be better spent doing almost anything else, including bowling, snooker, drinking, smoking, etc. One solution to any problems polls create is “election silence”; Israel, for example, bans polls for 15 days before an election, though most other bans are expressed in hours. “Social media blackouts” (7 days) have also been proposed. (For myself, I’d consider a 30-day ban for both, as a minimum. Then again, why not abolish both “industries” entirely?) What’s wrong with having the results of a horse-race come as a surprise?[6]
Finally, I’d like to express a desire that we try to refocus just a little bit from symbol manipulation through numbers and charts to reporting by humans about humans. For example:
honestly would be very curious to see some shoe leather reporting on this one, completely baffling to me
— ryan cooper (@ryanlcooper) November 7, 2020
From the Harvard Gazette, “The problems (and promise) of polling”:
Political scientist Theda Skocpol isn’t ready to give up entirely on polling just yet, but she does think the current process, which often relies on dinnertime robocalls, “artificially constructed” focus groups, and oversimplified voter categories, needs a serious overhaul. “The whole way we think about what’s going to happen politically is not based on talking to people or observing people in their contexts,” said Skocpol, Harvard’s Victor S. Thomas Professor of Government and Sociology. “It’s based on these methods of data collection, and also thinking about the data, which aren’t working anymore.”
For Skocpol, what works is something she has done for the past several years while researching her most recent book, “Upending American Politics: Polarizing Parties, Ideological Elites, and Citizen Activists from the Tea Party to the Anti-Trump Resistance,” with co-editor Caroline Tervo. Together Skocpol and Tervo made repeated trips to eight counties in four swing states — North Carolina, Pennsylvania, Ohio, and Wisconsin — developing close relationships with people on the ground. Skocpol sees similar grassroots efforts as the key to effective polling in the future.
This 2016 story by the New York Times, “Many in Milwaukee Neighborhood Didn’t Vote — and Don’t Regret It,” really helped that election come into focus for me. More like that, please. Much more.
NOTES
[1] I have asserted that in 2020 pollsters became political actors at the tactical level (seeking to bring about the result they desire by making it seem inevitable). This is true in the newsroom, as Taibbi points out:
In the last few weeks I’ve heard from multiple well-known journalists going through struggles in their newsrooms, with pressure to avoid certain themes in campaign coverage often central to their worries. There are many reporters out there — most of them quite personally hostile to Donald Trump — who are grating under what they perceive as relentless pressure to publish material favorable to the Democratic Party cause.
I grant I have no evidence that similar pressures are at work in the office cultures of pollsters. However, it would seem strange if pollsters were immune to pressures otherwise pervasive throughout the political class. This doesn’t imply a Bond villain, but rather the usual self-censorship and slanting due to careerism.
[2] Yes, exit polls are used by the networks and other media on election day to help assess candidates’ likelihood of winning before actual votes accumulate. This use case may even be how exit pollsters make most of their money; I don’t know. However, the social purpose for which pollsters exist is, as it were, to hold a mirror up to the electorate, so we can get some idea of who voted for which candidate, and why. And, no doubt, to handicap the next race. Interestingly, exit polls were first designed to explain. The (more lucrative?) horse-race function came later. From Samuel J. Best and Brian S. Krueger, “The Exit Poll Phenomenon” (SAGE):
Exit polling developed in the 1960s out of a desire by journalists to explain voting results to their audiences. Over time, it transformed from a modest effort at CBS News to estimate the outcome of the 1967 Kentucky gubernatorial election into a multimillion-dollar operation sponsored by a consortium of television networks and designed to project race winners and explain the preferences of numerous voting groups. Along the way, it overcame technical malfunctions, internal squabbles, and erroneous calls to become the centerpiece of media coverage of the elections.
[3] I won’t discuss statistical methods. For a bracing look at that topic, see this comment by alert reader Terry Flynn, who writes: “TL;DR – if you don’t correct heteroscedasticity in polling data first then “more data” doesn’t produce better results, often the opposite.” For heteroscedasticity, see here. Flynn urges that “the polling organisations REFUSE to poll using correct methods as known since 1985 because of their funding masters, laziness and fear of striking out from the pack,” which certainly triggers my priors and implies, pleasingly, that the entire polling industry could be abolished (if not revolutionized). Does anybody want to take a second shot at explaining the entire thread — maybe using one or another of the “hot takes” I supplied?
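For readers who want the flavor of the point without wading through the thread, here is a minimal numerical sketch of my own construction (not Flynn’s models): when different chunks of data have very different error variances, naively pooling “more data” can make an estimate worse, while inverse-variance weighting does not. All standard deviations and sample sizes below are illustrative assumptions.

# Minimal sketch: pooling a large noisy subsample with a small precise one,
# without accounting for their different variances, degrades the estimate.
# All standard deviations and sample sizes are illustrative assumptions.
import random
import statistics

random.seed(0)
TRUE_VALUE = 0.50   # the quantity being estimated, e.g. a candidate's true share

def simulate():
    precise = [random.gauss(TRUE_VALUE, 0.01) for _ in range(1_000)]   # low-noise responses
    noisy   = [random.gauss(TRUE_VALUE, 0.10) for _ in range(5_000)]   # high-noise responses

    est_precise_only = statistics.mean(precise)
    est_pooled = statistics.mean(precise + noisy)          # "more data", unweighted

    # inverse-variance weighting of the two subgroup means
    m1, m2 = statistics.mean(precise), statistics.mean(noisy)
    w1 = len(precise) / 0.01 ** 2
    w2 = len(noisy) / 0.10 ** 2
    est_weighted = (w1 * m1 + w2 * m2) / (w1 + w2)

    return (abs(est_precise_only - TRUE_VALUE),
            abs(est_pooled - TRUE_VALUE),
            abs(est_weighted - TRUE_VALUE))

errors = [simulate() for _ in range(500)]
labels = ("precise subsample only", "naive pooling", "inverse-variance weighting")
for label, column in zip(labels, zip(*errors)):
    print(f"{label:28s} mean abs error: {statistics.mean(column):.5f}")

On a typical run the naively pooled estimate lands several times farther from the truth, on average, than either the precise subsample alone or the variance-weighted combination; that is the simplest possible version of “more data doesn’t produce better results” unless the variance structure is handled first, and the thread goes considerably further than this.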
[4] The U.S. Census, as part of its monthly Current Population Survey (CPS), surveys voters for age, sex, “race and Hispanic origin,” education, poverty status, and income — but not who they voted for!
[5] Pew Research: “The good news is that Pew Research Center studies conducted in 1997, 2003, 2012 and 2016 found little relationship between response rates and survey accuracy, and other researchers have found similar results. The bad news is that it’s impossible to predict whether this remains true if response rates go down to 4%, 2% or 1%, and there is no sign that this trend is going to turn around as peoples’ technology habits continue to evolve.”
[6] Another function that exit polls are used for, at least internationally, is to serve as a check on the election results. As TDMS Research points out:
The possibility that our vote counts are corrupt cannot be dismissed off-hand or ignored. Computer vote counts are never verified by full hand counts and the vote counting software is proprietary—hidden from view and inaccessible to the public.
However, all our other electoral functions are so deteriorated that I’m not sure checks by exit polls are all that helpful. Voter rolls are going digital, which means they are by definition hackable. Voter rolls have also been corrupted by Republican operatives like Kobach. Democrats seem to be interposing more and more barriers between the voter’s expression of intent and the actual ballot, whether through inherently unauditable ballot marking devices — Stacey Abrams, though justifiably emphatic in praise of Georgia’s voter registration efforts, is, oddly, completely silent on Georgia’s horrid electronic voter rolls and ballot marking devices — or through vote by mail, which makes voters who don’t follow directions well vulnerable to disenfranchisement, along with those who, through life circumstances, find it hard to “make a plan to vote.” Voters are now spoken of as having “banked” their early votes, leading one to wonder what rent is being extracted by the banker (the opportunity cost, I would imagine, of already having voted for Biden when he slipped a major cog, or for Trump when there turned out to be video of him at Little Saint James).