Old School Polling Methods Hid Trump's Popularity With Voters
This level of annoyance means that people who are called may not answer, and if they do, they may give whatever answers get the caller off the phone rather than truthful ones. Meanwhile, the advent of caller ID means that people tend to screen their calls and ignore numbers they don't recognize. On top of that, voters in the prized 18-to-35 age group are far more likely to use wireless phones than landlines, so phone polling frequently misses them.

And, of course, people lie to pollsters. One pollster for CBS recently revealed that his company had run an experiment to see whether it received different answers from the same people in polls conducted online versus on the phone. It did. The findings revealed an apparent reluctance among college-educated respondents to admit they were planning to vote for Trump when speaking to a human, though they would admit it when responding to a machine.

While it's tempting to treat social media sentiment analysis as a substitute for polling, that's no panacea, either. The sample is automatically skewed to include only people who are on social media, which means it's not a random sample. And, of course, people lie on social media, too, because they're embarrassed to tell their friends whom they're supporting.

While presidential election polls are conducted only every four years, plenty of other surveys are used to measure all sorts of things: product preference surveys, for example, or customer satisfaction surveys. These don't suffer from exactly the same problems as political surveys, but they are subject to similar limitations and sources of bad data. If you're planning to field such a survey, make sure you know the limitations before you start, and find ways to interpret the results, which are always inaccurate in one way or another, with those limitations in mind. The same is true of social media studies.
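The mode-effect experiment described above, asking the same people the same question by phone and online, can be checked statistically with McNemar's exact test, which looks only at respondents whose answers changed between the two modes. The sketch below uses invented counts purely for illustration; the CBS pollster's actual data and methods were not published here.

```python
# A minimal sketch of McNemar's exact test for paired survey responses.
# The paired data below are invented for illustration, not real poll results.
from math import comb

# Each tuple: (phone_answer, online_answer) for one respondent;
# True = "says they support the candidate" in that mode.
paired = ([(False, True)] * 12 + [(True, False)] * 3 +
          [(True, True)] * 30 + [(False, False)] * 55)

# McNemar's test ignores concordant pairs and compares the two kinds
# of discordant pairs: people who answered differently across modes.
b = sum(1 for phone, online in paired if not phone and online)  # hid support on the phone
c = sum(1 for phone, online in paired if phone and not online)  # the reverse
n = b + c

# Two-sided exact p-value under the null that discordance is symmetric (p = 0.5).
p_value = min(1.0, 2 * sum(comb(n, k) for k in range(min(b, c) + 1)) * 0.5 ** n)

print(f"discordant pairs: {b} vs {c}, p = {p_value:.3f}")
# -> discordant pairs: 12 vs 3, p = 0.035
```

With these made-up numbers the asymmetry (12 respondents admitting support only to the machine, versus 3 the other way) would be unlikely under chance, which is the shape of result the experiment reportedly found.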
And if you're presented with the results of a survey, you need to know its limitations. Was it a phone survey? How were the respondents selected? What is the level of confidence that the sample is representative? You should also review the poll or survey questions for signs that the questions themselves could induce bias. And it might be a good idea to use more than one survey method. None of these steps will guarantee an accurate poll, but they will give you a better understanding of the results and their limitations. They might also encourage you to pair a poll or survey with social media analysis for a more complete picture.
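One concrete number to ask about is the margin of error, which quantifies the uncertainty from sample size alone. The standard formula for a proportion from a simple random sample is sketched below; the function name and defaults are my own choices for illustration.

```python
# A minimal sketch of the classic margin-of-error formula for a proportion
# estimated from a simple random sample of size n.
from math import sqrt

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a sample proportion.

    p = 0.5 is the worst case (widest interval); z = 1.96 corresponds
    to 95% confidence under a normal approximation.
    """
    return z * sqrt(p * (1 - p) / n)

# A 1,000-person poll carries roughly a +/-3-point margin of error:
print(f"{margin_of_error(1000):.3f}")  # -> 0.031
```

Note that this formula only holds for a genuinely random sample. Non-response, call screening, and social-media skew, the problems described above, all violate that assumption, which is why a poll's true error can be much larger than its stated margin.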
And, of course, there are the social media bots that appear to generate enthusiasm that may not reflect the actual level of enthusiasm among real voters. While social media analysis companies are learning to filter out traffic generated by the bots, the bots' developers are working just as hard to make that filtering less effective.