Reality Check: An insider’s guide to how political polls work

A poll worker at Montbello High School in Denver. Photo courtesy the Rocky Mountain News archives via the Denver Public Library
 
The first political polls appeared during the 1824 presidential election as straw polls, informal headcounts on candidates and issues. Polling became a regular part of American media consumption in the 1930s, when George Gallup’s weekly “America Speaks” newspaper column began reporting the public’s views on the issues of the day through nationwide opinion polls.
 
Today, nearly every news organization — along with political candidates — runs its own polls.

The New York Times/Siena College Poll and the ABC News/Washington Post poll rank at the top, based on their methodology and their track record of accurately predicting elections, according to FiveThirtyEight.
 
In Colorado, 2022 polls accurately predicted a tight race between Republican incumbent Lauren Boebert and Democrat Adam Frisch in Colorado’s 3rd congressional district, a contest that preceded Boebert’s decision to change districts for the 2024 race.

To understand how polls work, RMPBS spoke with two Colorado-based pollsters. Floyd Ciruli is a political analyst whose clients have included local media outlets, chambers of commerce and school districts. Andrew Schwartz is vice president of Beacon Research, a national research firm that conducts polls and political analysis for clients including Fox News, New York Democratic state assembly members and Massachusetts Gov. Maura Healey.
 
RMPBS spoke with them separately and has combined their answers here. The following conversation has been edited for length and clarity.
Andrew Schwartz is the vice president at Beacon Research. Photo courtesy Andrew Schwartz
Floyd Ciruli is the president at Ciruli and Associates. Photo courtesy Floyd Ciruli
Rocky Mountain PBS: How have the way polls are conducted changed over time?
 
Floyd Ciruli: When I first started in the 1970s, we were still talking to people in person with clipboards in hand. We were starting to phase that out in favor of phone calls. From the 1970s to the early 2000s we conducted polls over landline phones, and in the early 2000s we started to switch to cell phones. Nowadays, 95% of our survey calls target people’s cell phones. But people don’t answer their phones as much as they used to, and phone polling is expensive and time-consuming.
 
So today, about half of the polls you read about are conducted through online panels. Some are opt-in panels where we reward people for joining (e.g., with a small monetary incentive). Other panels — the best ones — actually survey people and recruit them to join. The nice thing about a panel is that there are maybe a thousand people on it in each state who will regularly take your polls, often online. And if a pollster wants to poll only seniors, or only people with a junior college degree or less, they can send the poll only to those people on the panel list.
 
Andrew Schwartz: There’s been a ton of innovation in the industry and new ways of reaching people, especially in the last eight years. You can try to get people to take your survey by putting a pop-up in front of them as they browse the internet. You can email or text them.
 
FC: Text messages are very inexpensive, and it takes just a second to send one. If a candidate sends 10,000 people a text message with a survey, even if most people don’t answer, they’ll still get some responses — say, 700 people.
 
RMPBS: What attributes must a poll have to be legitimate?
 
AS: A typical good poll surveys 600 to 1,000 people. The margin of error on a sample of 600 people is 4 points; on 1,000 people, it’s 3 points.

You could probably go as low as sampling 400 people. But with a total of 400 people, if you start talking about how a subgroup votes, say women, you’re actually talking about roughly 200 people, and the margin of error on that is much higher, around nine points. As a general rule, most organizations won’t release or highlight subgroups smaller than 100. That’s where your statistical reliability goes out the window and you get a lot of false reads.
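Those figures follow from the standard 95 percent confidence formula for a simple random sample. A minimal sketch in Python (illustrative only; real polls apply design effects and weighting that push practical margins, especially for subgroups, somewhat higher):

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error, in percentage points, for a simple random
    sample of size n, using the worst case p = 0.5 by default."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

for n in (1000, 600, 400, 200, 100):
    print(f"n = {n:4d}: +/- {margin_of_error(n):.1f} points")
# n = 1000: +/- 3.1    n = 600: +/- 4.0    n = 400: +/- 4.9
# n =  200: +/- 6.9    n = 100: +/- 9.8
```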
 
We're always looking at age, ethnicity, gender and, certainly after 2016, education. There are a lot of other demographic variables that different pollsters can, and do, look at, like homeownership, religious attendance, whether you voted in a recent election, whether you live in an urban or rural area. That’s just a handful of things pollsters look at as a way to try to be sure that they’re getting a representative sample.
 
It has to be a representative sample underpinning the poll. Say, for example, the Sierra Club put a question on the front page of its website asking everyone who visited: “Do you support or oppose drilling in the Arctic?” That’s not a survey of Americans. That’s a survey of people who visited the Sierra Club webpage, which is going to be a certain kind of person. So if they reported those results and said something like, “90 percent of Americans think we shouldn’t drill in the Arctic,” that wouldn’t be a legitimate poll.
 
RMPBS: What else should voters look out for to determine if a poll is not legitimate?
 
AS: One thing to look out for is polls that ask for a donation after you answer their questions.
 
Polls that campaigns release should also be taken with a grain of salt. If a campaign releases a poll that says that they're down two points, I'd be willing to bet that they're actually down eight points. But they're going to say it's two points to keep people excited and donating and generating attention.
 
RMPBS: Are you saying campaigns may make up numbers as a form of marketing?
 
AS: It’s more likely that they’re releasing a subset of the questions and picking and choosing what to release.
 
FC: Reputable media outlets don’t publish candidates’ polls to any great extent because they assume they could be biased. The candidate is only going to give out what they want. Some smaller outlets may release them if they are hungry for news. And people like myself pretty much ignore them.
 
RMPBS: You both have worked on polls for candidates. Are these the types of polls you work on?
 
AS: Most polls that we work on don’t get released. They’re not designed for that. They’re designed for internal strategic purposes.
 
FC: Candidates conduct polls to tell them what their position should be and to determine what the biggest issues are in the race. They’re not going to change their values. They’re going to remain a moderate or a conservative or a liberal, but they might see, for example, what voters would think if they came out seeking a 20-week limit on abortion.
 
Candidates poll constantly, and at this point — August, September, October — they’ll be conducting shorter polls to figure out where they’re at and what they need to do to get more votes.
A cartoon depicting the 1948 presidential election between Democratic President Harry S. Truman and Republican Thomas Dewey. Despite several polls predicting a landslide victory for Dewey, Truman won the election in one of the biggest and most well-known political upsets in U.S. history. Photo courtesy Clifford Kennedy Berryman, National Archives
RMPBS: What went wrong in the Trump-Clinton election in 2016? Everyone predicted Hillary Clinton would win, but she didn’t.
 
AS: The polls had too many college-educated voters and the fault lines in the election were around education. By and large, college-educated voters were voting for Hillary, and voters without a college degree were voting for Trump.

That educational polarization was new — it’s not something we had seen to that degree prior to 2016. Before 2016, it didn’t matter much if your sample slightly underrepresented non-college-educated voters, because they were voting along similar lines as college-educated voters.
 
RMPBS: What have pollsters learned since 2016?
 
AS: Everyone doing this work is now aware of the importance of speaking with voters of different education levels. There has been a real broadening of ways to reach people. And a lot of that is the recognition that if you have only one way of reaching people — like calling them on the phone, which tends to reach college-educated voters — and those people are a little bit different from the rest of voters, your poll is going to be off.
 
We also started weighting responses in 2016 and we’re doing more of that now. Weighting is basically slightly dialing up or slightly dialing down certain respondents based on their demographics in order to get a more representative sample. Let’s say I have one non-college-educated respondent in my sample and I need ten. I really shouldn’t multiply that person by ten and say, “Great, I’m all set.” I should go out and find six or seven other people and then multiply each of them by a small factor — like 1.5.
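In practice this is post-stratification: each group’s responses are scaled by the ratio of its population share to its sample share. A minimal sketch with made-up numbers:

```python
# Hypothetical example: the sample over-represents college graduates.
population_share = {"college": 0.40, "non_college": 0.60}  # known targets (e.g., census)
sample_share     = {"college": 0.55, "non_college": 0.45}  # what the poll actually got

# Each respondent's weight = population share / sample share for their group.
weights = {g: population_share[g] / sample_share[g] for g in population_share}
print(weights)  # college ~0.73 (dialed down), non_college ~1.33 (dialed up)

# Weighted topline for, say, candidate support (hypothetical results by group):
support = {"college": 0.58, "non_college": 0.44}
weighted = sum(sample_share[g] * weights[g] * support[g] for g in support)
print(f"weighted: {weighted:.1%}")  # 49.6%, vs. 51.7% unweighted
```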
 
RMPBS: Do conservative or liberal media outlets skew their polls in any way?
 
AS: They really shouldn't be and generally they're not. If a Fox News poll was five points more favorable to Republicans every year than the actual election results, they would lose credibility. The same in reverse is true for an outlet like MSNBC. The professionals behind each of those polls are working to make them as accurate as they can be, because that's what they’re getting judged on. At the end of the day, the goal is accuracy.
 
FC: All these polls at some point have to run up against what happens on election day — our version of the Super Bowl.  If you want to stay in the polling business, you better be in the ballpark or else have a good explanation as to what happened.
 
RMPBS: What happens if the pollster performs poorly?
 
FC: You won’t get hired. Your newspaper will lose credibility. Newspapers frequently change pollsters when things don’t go well.
 
RMPBS: How much can a poll influence a voter? Does the poll itself affect how people will vote?
 
FC: There really is no definitive answer. There is no doubt that polling affects news coverage. There is a tendency for news outlets to ask the frontrunner why they’re ahead and to ask more positive questions — and to ask more negative questions of the opponent, e.g., “Why is Mr. Trump behind? He seems flummoxed and confused.”
 
One reason the news was so bad for Biden after his terrible debate performance is [that] it was being reinforced by polls showing the public thought he did terribly. The polls led the news to question him further about his age and his capabilities. That’s an effect strictly related to the media.
 
The effect you’re asking about is the bandwagon effect. If a poll comes out and says, say, “Harris is ahead,” will it lead more people to get on the bandwagon? The evidence is mixed. The one thing about the bandwagon effect is that that wagon can stop at any point.
 
There could also be a dampening effect. If a candidate is far behind, people may feel there is no use in voting. However, the evidence shows the dampening effect is not very strong, because the trailing candidate will campaign harder, say “We’re close. Get out there and vote!” and send more emails and text messages encouraging people to vote. So really there is no definitive answer on either the bandwagon effect or the dampening effect.

RMPBS: Is the purpose of polling simply to inform — or is it to skew the electorate one way or the other?
 
AS: Polls conducted by the media are really designed to hold up a mirror to the public and show what people are thinking and what issues they find important. There’s not a lot of persuasion in there. It’s not the role of the media to be persuading people.
 
Polls sponsored by a candidate or by advocates for a ballot initiative — their purpose is typically to develop strategy. The poll is like testing a petri dish in a laboratory.
 
If we’re doing a poll for a candidate, we might test positives and negatives on our own candidate so that we know strategically what their biggest vulnerabilities are. Let’s say the candidate failed out of school and has a DUI on their record. It’s helpful to know which is the bigger vulnerability. If the poll shows it’s the DUI, the candidate can strategize and decide not to go out campaigning on personal responsibility and making the right choice, because at some point that’s going to blow back on them.
 
RMPBS: Have there been any notable moments in Colorado history where polling came front and center?
 
FC: In general, there’s not a lot of national polling going on in Colorado anymore. When we were a battleground state — back in 1998 and 2000, when Bush won — things were different. But Obama killed it in 2008, and since then a lot of young people, mostly liberals, have moved in; now we’re no longer a swing state.
 
There’s a very good pollster in Colorado, Chris Keating, president of Keating Research, who has polled for Adam Frisch, the current Democratic candidate for Colorado’s 3rd congressional district, who ran against incumbent Lauren Boebert in 2022 and came incredibly close.
 
Keating was one of the only pollsters who said that Frisch was close, and he actually had at least one poll in the 2022 midterms that showed Frisch winning. A lot of Republican commentators said that was incredibly unlikely — that it was a firmly Republican district by several percentage points. And it turned out that Keating was absolutely right. [Frisch lost to Boebert by only 546 votes.]
 
That was an example of a pollster catching what was really going on. For the November 2024 election, Boebert has swapped districts and is now running in the 4th congressional district because she knew she would lose in the 3rd.
 
RMPBS: Who conducts polling calls over the phone? Are they in the U.S. or abroad?
 
AS: Pollsters design the surveys, write the questions, figure out who to call, analyze the data and figure out what the data means. But we don’t do the actual calling. Most of that is done by call centers or phone banks, the best of which are in the U.S. A lot are in the Midwest.
 
You want your interviewers to have a fairly neutral accent. If somebody called with a thick Southern accent asking questions about, let’s say, gun control and reproductive rights, you might feel less comfortable talking to that person if you could infer what their views were.
 
RMPBS: Is the selection process for a caller as intensive as picking a jury and assessing what their political views are? Is it like auditioning an actor?
 
AS: No, it’s not that so much. The interviews are often recorded and monitored so you can go back and figure out if there were any problems. You can go back and see, for example, if a particular interviewer got more completed interviews with women than men or with Democrats than Republicans. And if that were the case, that could flag that something is not quite right, that the interviewer is not as neutral as they should be.
 
RMPBS: Do people conducting the polls tend to reveal who they are working for?
 
AS: They will almost always say, “I work for a research company.” That’s because if you got a call and the interviewer said they worked for the Lauren Boebert campaign and wanted to poll you on something, many Democrats might just hang up and say, “Never call me again.”
 
Even callers working for polls sponsored by media outlets generally don’t mention who's sponsoring the poll. Certainly a long time ago it was fine to say, “Hey, I’m from the New York Times or CNN and I’m taking a poll.” That’s not true anymore.  
 
RMPBS: Where do the callers get the contact information of the people on their list?
 
AS: If you’re conducting a phone survey, one method is to just dial numbers randomly. I kid you not — it’s a statistically valid method. If I were doing a national poll and knew I needed ten responses out of Rhode Island, I could dial 401 for Rhode Island and then randomly enter the next seven digits. That’s a totally reasonable way to get a random sample of the population.
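This is random-digit dialing. A minimal sketch of the idea, using a hypothetical helper (a real sample would also screen out invalid exchanges and known business lines, which is skipped here):

```python
import random

def random_rdd_number(area_code: str = "401") -> str:
    """Generate a random 10-digit U.S. phone number within one area code,
    as in the Rhode Island example above (illustrative only)."""
    exchange = random.randint(200, 999)  # exchanges can't start with 0 or 1
    line = random.randint(0, 9999)
    return f"({area_code}) {exchange}-{line:04d}"

# Draw a small batch of numbers to dial:
print([random_rdd_number() for _ in range(5)])
```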
 
Most call centers work off voter files. When people register to vote, they enter their name, address and date of birth, which goes into a database. There are commercial vendors that may append other information to that record, such as credit bureau information.
 
RMPBS:  Is there anything we didn’t touch on that you’d like to add?
 
FC: Polling is hugely important. Joe Biden dropped out of the race because of polling. The Democratic Party, whose primary job is to win elections, came to the conclusion they couldn’t win. It showed the real power of polling.  
 
Just as the reputation of the media has gone down in terms of trust, so has ours.  We are working in the same environment trying to get people to answer our surveys. Restoring trust in media and in polling is very important from my point of view.
 
AS: Nothing to add other than encouraging people to pick up the phone when the pollster calls!