ABC News' Guide to Polls & Public Opinion

Public opinion polls can be simultaneously compelling and off-putting - compelling because they represent a sort of national look in the mirror; off-putting because they're not well understood by many news consumers. Why conduct polls? How do they work? And how come you never call me? Some basic answers follow.

Why Conduct Polls?

Public opinion is an integral part of the news we cover: It informs and influences political debate, the state of the economy, social trends and more. Simply put, we report what people think, along with what they do, because it is important.

At the ABCNEWS Polling Unit, we are news reporters first; we think of public opinion as our beat - like covering the Supreme Court, the White House or the Pentagon. In many ways the process is the same: We pick a topic, formulate questions, go to our best sources, ask what we need to know and report what we've learned.

The difference is in the selection of our sources. Polling relies on the principles of inferential statistics, which state that we can draw inferences about a set (in this case, the American public) by examining a randomly assembled subset. Random selection is key: The fundamental requirement for any poll to be representative is that it be based on a valid, random sample of respondents.

A National Blood Test

Pollsters have a joke: If you don't believe in random sampling, next time you go to the doctor for a blood test, have him take it all.

You get the point. It only takes a tiny drop of blood, randomly drawn from the body, to test for cholesterol. And it doesn't matter how big the donor is - a mouse, a man, Godzilla. Go even bigger: Imagine someone the size of Mars. No matter. A single drop is still enough to complete the test.

Or imagine a big bowl full of red and yellow jelly beans. You don't have to count them all to know the correct proportion. Just close your eyes, stir thoroughly and pull out a random sample - say, a hundred of them. The ratio of red to yellow jelly beans in the sample will closely match the color distribution of all those in the bowl. And it doesn't matter if the bowl fits on your kitchen table, or if it's the size of Yankee Stadium.

Of course, sampling's not perfect. (What is?) A statistical formula produces a margin of error, telling us how closely we can expect the sample to reflect the full population. Generally speaking, the larger the random sample, the smaller the sampling error.

Remember, as in the examples above, the error margin does not depend on the size of the population under study (except in the case of very small populations) - only on the size of the random sample itself. That's why a sample of 500 or 1,000 adults, randomly selected, is perfectly adequate to represent the nation's roughly 200 million adults.
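For readers who want to see the arithmetic behind the jelly beans, here is a minimal sketch in Python. The bowl sizes, the 60/40 color split and the sample of 1,000 are invented for illustration, and the last line uses the standard margin-of-error formula for a 95 percent confidence level; none of this describes ABCNEWS' own procedures.

```python
import math
import random

def sample_share(bowl_size, true_share_red=0.6, sample_size=1000):
    """Scoop a random sample of jelly beans from a bowl and return
    the share of red beans in that sample."""
    reds = int(bowl_size * true_share_red)
    bowl = ['R'] * reds + ['Y'] * (bowl_size - reds)   # 'R' = red, 'Y' = yellow
    sample = random.sample(bowl, sample_size)          # random draw, no replacement
    return sample.count('R') / sample_size

def margin_of_error(sample_size, share=0.5):
    """Standard 95 percent margin of error for a simple random sample."""
    return 1.96 * math.sqrt(share * (1 - share) / sample_size)

# A kitchen-table bowl versus a stadium-sized bowl: the sample size,
# not the bowl size, is what drives the accuracy.
for bowl_size in (5_000, 2_000_000):
    print(bowl_size, round(sample_share(bowl_size), 3))

# A random sample of 1,000 carries a margin of error of about 3 points,
# whatever the size of the population it was drawn from.
print(round(margin_of_error(1_000) * 100, 1), "points")
```

Run it a few times and the sampled share hovers within a few points of 60 percent for both bowls - the small one and the enormous one alike.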

You Never Call Me

Our opinion polls start with a computer program that generates a random sample of all possible residential telephone numbers in the country. The survey is designed so that every residential phone has the same probability of getting the call.

So why haven't you been called for an ABCNEWS poll? Because it's a big country, with lots of phones. You've got better odds of getting hit by lightning. The key point is that your odds of getting called are precisely the same as everyone else's.
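To make the idea concrete, here is a loose sketch of random-digit dialing in Python. It is not ABCNEWS' actual sampling program: the area codes and exchanges are placeholders, and the 100 million figure for residential phones is a round illustrative number.

```python
import random

# Placeholder area-code/exchange combinations; a real random-digit-dial
# sample would draw these from a database of working residential exchanges.
WORKING_PREFIXES = [("212", "555"), ("310", "555"), ("407", "555")]

def random_phone_number(rng=random):
    """Build one candidate number: a random prefix plus a random four-digit
    suffix, so listed and unlisted numbers alike have the same chance of
    coming up."""
    area, exchange = rng.choice(WORKING_PREFIXES)
    suffix = rng.randrange(10_000)            # 0000 through 9999
    return f"({area}) {exchange}-{suffix:04d}"

# Draw 1,000 candidate numbers for interviewers to dial.
sample = [random_phone_number() for _ in range(1_000)]
print(sample[:3])

# With roughly 100 million residential phones and a 1,000-interview poll,
# any one phone's chance of being picked is about 1 in 100,000 - small,
# but identical to everyone else's.
print(1_000 / 100_000_000)
```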

The principle of random sampling makes polls truly democratic. We don't speak with any predetermined individual or group. It can be absolutely anyone - you and me, your Aunt Sadie in Jersey City, your parish priest, your local bookie, the lady across the lunch counter, the guy pounding fenders - anyone. That's why we think that a good poll, honestly done, represents the true voice of average Americans in a unique and irreplaceable way.

Nobody I Know Says That

People tend to live in relatively homogeneous areas and associate in relatively homogeneous groups. We tend to share our opinions with like-minded friends and acquaintances. So it shouldn't be surprising if sometimes a poll's result is different from what you and most of the people you talk to think. Your "sample" isn't a national one, and more important, it isn't random.

Remember, too, that a national poll is representative of the entire adult population of the country. If "only" 40 percent of Americans support a position, that's still a huge number - around 80 million people.

The Point of Decision

Even when you understand how sampling works, it can be difficult to accept that your opinion can be expressed in a poll in which you weren't personally included. How can they say it represents me, you ask, when they didn't talk to me? Sounds un-American.

The answer is that it's important to differentiate between the opinions we hold and the way we've arrived at them. We come to our opinions by completely personal and idiosyncratic routes - where and how we were raised, our faiths, our family and friends, our education and knowledge, and more. Polls do not capture this information.

However, all these personal paths converge at one place - the point of decision. In an election, for example, we can vote for Candidate A, vote for Candidate B, or not vote. There are no other options. And it's here, at the point of decision, that opinion can be accurately sampled, tabulated and reported in an opinion poll.

Why Similar Polls Find Different Results

Sampling error is the least likely source of differences in polls - question wording, question order, the poll's timing and data interpretation are more often at play.

Polls agree far more often than they disagree: Differently worded questions on the same subject usually get about the same results, so long as they've been asked in a neutral, balanced and fair way, and important developments haven't intervened.

Nonetheless, in a poll as in any interview, you only get answers to the questions you've asked. Use loaded words, leave out an important element, or include biased information, and you can get different answers. Remember, too, that no single poll represents the final word on any subject. When honest polls on the same subject do differ, that's not contradiction; it's additional information, from which we can learn more.

The danger in polling - as in any news reporting - is in asking leading questions or producing slanted analysis. That's why we at ABCNEWS conduct our own polls, rather than relying on possibly biased surveys that may be sponsored by groups with an interest in the outcome.

We also publicly release our full questionnaires and findings. You're welcome to read them, check out exactly what we've asked and compare the results to our conclusions.

Polls Around Us

It may be helpful, as well, to recognize the extent to which we all use polls. Think for a moment about the unemployment rate. Where do they get that?

The simple answer: It's a poll. The U.S. Census Bureau calls up a random sample of adults and asks who's working. It's a big sample, so the error margin is small - two-tenths of a percentage point. But it's still no more or less than a poll.
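The same back-of-the-envelope formula shows why a bigger sample buys a smaller error margin. The numbers below - a sample on the order of 60,000 households and an unemployment rate near 5 percent - are illustrative assumptions, not the survey's published specifications.

```python
import math

def margin_of_error(sample_size, rate):
    """Standard 95 percent margin of error for an estimated rate."""
    return 1.96 * math.sqrt(rate * (1 - rate) / sample_size)

# A typical news poll: about 1,000 adults, an answer near 50 percent.
print(round(margin_of_error(1_000, 0.50) * 100, 1), "points")    # about 3.1

# A government-sized sample estimating a rate near 5 percent:
# the margin shrinks to roughly two-tenths of a point.
print(round(margin_of_error(60_000, 0.05) * 100, 2), "points")   # about 0.17
```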

It's the same with many, maybe most, of the government statistics you hear every day. Inflation, housing starts, factory orders, you name it - all are the results of random-sample survey research.

Say Cheese

Naturally, polls have their limits. They are quantitative rather than qualitative - good at finding out what people think, but only suggestive of why. That's where thoughtful, sensible data analysis comes into play.

Polls are also, as is often said, "a snapshot in time." But that phrase too often is used to portray public opinion as flighty and mercurial, bouncing along aimlessly with no thought or reason behind it. That's clearly not the case. When we track public opinion over time, we find that in fact it's usually very stable - supported by evidence, experience and common sense and changing only when relevant information warrants it.

Public opinion confounds conventional wisdom at least as often as it confirms it. That's why it's so unwise to seek it through the filter of pundits, spinmeisters, intuition or a self-selected sample. If we truly want to know what people think, there's one tested and proven way to find out: Assemble a random sample of Americans, call them up - and ask.