What are pollster ratings? | Polling 101 from FiveThirtyEight

FiveThirtyEight database journalist Dhrumil Mehta explains what makes a good pollster.
7:17 | 03/25/21

Video Transcript
Transcript for What are pollster ratings? | Polling 101 from FiveThirtyEight
Welcome to another episode of Polling 101. I'm Dhrumil Mehta, and along with my giant jar of jelly beans, we're going to talk about what makes a good pollster.

Pollsters take the country's temperature on all sorts of issues, and politicians use those polls to help guide which laws they pass, so it's worth understanding what makes a great pollster. Conducting a good poll isn't easy, and a good pollster needs a methodology that allows it to reach a representative sample of the electorate. At the end of the day, what's most important is whether a pollster accurately measured public opinion. But in many cases it can be hard to know whether they've done so. Say a pollster releases a poll on which flavor of jelly bean Americans like best. To find out how accurate that poll is, you'd have to ask every American to confirm their favorite flavor, and nobody's got time for that. But there's one thing that pollsters ask about all the time that we can compare to real-world results: elections. After everyone votes, we find out who won and by how much. Each election cycle, there's a huge range in how well different pollsters predict the outcome of a race. Just take a look at the 2020 national polls.

So how can you better understand which pollsters are doing a good job of polling Americans? FiveThirtyEight reviews all the major pollsters each election cycle and gives each one a grade. An A+ means the pollster is one you can really trust: look at how close its polls tend to be to the actual outcome of an election. A D, on the other hand, means you should probably think twice before trusting that pollster on elections, or anything else.

To calculate a pollster's grade, we take into account every presidential, Senate, House, gubernatorial, and generic congressional ballot poll that it publicly released in the three weeks leading up to an election, over more than two decades, and compare those polls to the actual results of the elections they were polling. That's a lot of polls, isn't it?
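The grade calculation described above has a few ingredients: the share of races a pollster called correctly, how far each poll landed from the actual result, and extra weight for more recent polls. A highly simplified sketch of those ingredients is below. Every number in it (the margins, the years, the half-life) is invented for illustration; FiveThirtyEight's real ratings use thousands of polls and a far richer model.

```python
# Toy sketch of scoring a pollster's accuracy. All numbers are invented;
# the real ratings use thousands of polls and a more sophisticated model.

# (election year, poll margin, actual margin), in points, Dem minus Rep.
polls = [
    (2012,  4.0, 2.5),
    (2016, -1.0, 2.5),
    (2018,  3.0, 1.0),
    (2020,  6.5, 4.0),
]

def called_correctly(poll_margin, actual_margin):
    """A poll 'calls' the race correctly if it picks the right winner."""
    return (poll_margin > 0) == (actual_margin > 0)

hit_rate = sum(called_correctly(p, a) for _, p, a in polls) / len(polls)
avg_error = sum(abs(p - a) for _, p, a in polls) / len(polls)

# Give more recent polls a higher weight (arbitrary four-year half-life).
weights = [0.5 ** ((2021 - year) / 4) for year, _, _ in polls]
weighted_error = sum(w * abs(p - a)
                     for w, (_, p, a) in zip(weights, polls)) / sum(weights)

print(f"races called correctly: {hit_rate:.0%}")        # 75%
print(f"average absolute error: {avg_error:.2f} pts")
print(f"recency-weighted error: {weighted_error:.2f} pts")
```

This is only the skeleton of the idea; the published methodology also accounts for things like how hard each race was to poll, which this sketch ignores.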
Take Monmouth University, which got an A. We have dozens of polls from Monmouth taken in the last three weeks before elections, going back about two decades, and 78% of those polls called the race correctly. That's pretty good, a whole lot better than random chance. But the share of races called correctly isn't the only metric used to evaluate pollsters. We also take into account how far off polls are from the actual result, and we give more recent polls a higher weight in our average relative to really old polls.

Now, so far we've focused just on the A pollsters, the creme de la creme. But just because a pollster got a B or a C doesn't mean you shouldn't pay attention to its polls; after all, a C is a passing grade. When interpreting polls from pollsters that aren't among the most accurate, though, some other factors become important as well.

For example, while some pollsters are simply less accurate than others overall, some pollsters are consistently inaccurate in a particular direction. This is called a house effect. Take Rasmussen Reports: in 2020 national polling, it was one of the only pollsters to show Trump winning the popular vote in the last three weeks before Election Day. But that wasn't a huge surprise. Rasmussen's national polls in the last several election cycles have been more right-leaning than the average of all the polls in the weeks leading up to the election. And while Rasmussen leans consistently more Republican than the actual result, other pollsters have leaned consistently more Democratic. This is all captured in a metric we call mean-reverted bias. Knowing a pollster's mean-reverted bias allows us to still use its results to better understand the state of a race, just with a bit more nuance.

Another reason to look at a pollster's rating is that it may do something called herding. And no, we're not talking about sheep. Herding is when a pollster selectively releases results, or modifies its methodology, because its poll isn't in line with the others. Take, for example, these polls from Iowa's 2014 Senate race.
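Before looking at the example, here's the rough statistical intuition behind spotting herding: polls of a given sample size should show a natural spread, so a cluster of late polls that is much tighter than sampling error alone predicts is a warning sign. The sketch below uses invented poll numbers and an assumed sample size.

```python
import math

# Toy herding check: compare the observed spread of late polls with the
# spread that random sampling alone would produce. All numbers are invented.

late_polls = [48.0, 48.5, 48.2, 48.4, 48.1]  # candidate's share, in percent
n = 600                                      # assumed respondents per poll

# Standard deviation a single poll's estimate should have from sampling
# alone (binomial approximation around p = 0.48), in percentage points.
p = 0.48
expected_sd = 100 * math.sqrt(p * (1 - p) / n)

mean = sum(late_polls) / len(late_polls)
observed_sd = math.sqrt(sum((x - mean) ** 2 for x in late_polls) / len(late_polls))

print(f"spread expected from sampling error: {expected_sd:.2f} pts")
print(f"spread actually observed:            {observed_sd:.2f} pts")
# An observed spread far below the expected one suggests possible herding.
```

In this made-up cluster the observed spread is roughly a tenth of what sampling error predicts, which is the kind of suspicious tightness the transcript describes.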
Usually there's a natural spread in the outcomes of polls that randomly sample the population. But notice how there are fewer outliers in the latter part of the cycle. That might be evidence that the pollsters are herding. Pollsters may do this to avoid raising questions about the accuracy of their results, or they may look at how far their poll diverges from the average and think they're doing something wrong. But if too many pollsters do this, they may converge on the wrong answer. That's why good pollsters don't herd: even if their poll is an outlier, they'll still publish it. And there was one poll that stood out from the pack, one by Selzer & Co., an A+ pollster that is particularly good at polling Iowa. It turned out Selzer was correct: the apparent outlier was the closest to the actual result.

Now, publishing outlier results doesn't always pan out, and even the best-run polls can occasionally get the answer wrong. That's just how statistics works. But while good pollsters are occasionally wrong, others are wrong more consistently. Pollsters that get a C-minus or below are still making a good-faith effort to measure public opinion, but you should probably be cautious about relying on their results.

So far we've talked about the traps that decent and mediocre pollsters fall into. But there are certain types of polls that you should put absolutely no trust in. One is the unscientific poll. Unscientific polls just ask a subset of Americans a question without taking any measures to get a representative sample. It would be like if I were in the green jelly bean fan club and asked my friends to tell me their favorite kind of jelly bean. Just because most of them chose green doesn't mean that's America's favorite flavor. These kinds of unscientific polls don't get a rating from us at all, because they're not even trying to measure the public opinion of Americans as a whole.
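The jelly bean fan club is what statisticians call a convenience sample, and a quick simulation with made-up numbers shows why it misleads: the club's answers tell you about the club, not the country.

```python
import random

random.seed(0)  # make the toy simulation repeatable

# Invented "true" preferences: 60% of the population prefers red,
# while the green jelly bean fan club skews 90% green.
population = ["red"] * 60 + ["green"] * 40
fan_club = ["red"] * 10 + ["green"] * 90

def green_share(group, n=200):
    """Share preferring green among n randomly chosen respondents."""
    sample = random.choices(group, k=n)
    return sample.count("green") / n

print(f"random sample of the population: {green_share(population):.0%} green")
print(f"convenience sample of the club:  {green_share(fan_club):.0%} green")
# The club sample wildly overstates how popular green is nationwide.
```

No amount of extra respondents fixes this: polling more club members just makes the wrong answer more precise.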
Still, there's an even more malicious, even more insidious kind of poll to keep an eye out for: the fake poll. Over our years of dealing with polling data, we've found that some pollsters just outright falsify their data, or engage in some other form of professional malpractice. Others aren't even pollsters at all, just bad actors pretending to be pollsters. If we're able to discern that a pollster has lied to the public, that pollster gets a failing grade: an F. These firms are banned by FiveThirtyEight, and their polls are never used in our forecasts or anywhere else. But we do list them in our pollster ratings, just so you know which names to avoid.

Ultimately, it's never good to put too much stock in one individual poll, even if it's from a great pollster. That's why, if you want to know what Americans are thinking about a topic, it's better to look at an average of all the polls on that topic. And if you find a claim backed by only one or a handful of polls, say about Americans' views on fracking or marijuana legalization, be more skeptical, since you won't know whether those polls come from great, mediocre, or downright bad pollsters.

If you liked this video, don't forget to subscribe to FiveThirtyEight on YouTube.

This transcript has been automatically generated and may not be 100% accurate.
