Clayton Allen ’23 was one of millions whose eyes were glued to their television screens on Nov. 3, 2020.
It was election night, and as he watched results roll in, Allen felt an overwhelming sense of pride in his contribution to the democratic process—as a pollster.
Allen is a student in Associate Professor of Political Science Shamira Gelbman’s Election Polling and Public Opinion course. In 2012 and 2016, students taking the class conducted exit polls at voting centers throughout Montgomery County. However, due to COVID-19, the students’ polls went virtual and were sent to approximately 6,000 Wabash alumni.
“Not only do I feel like our class did a good job in how we conducted our polls, but I feel we were also able to provide a public service by involving the Wabash community in this election,” Allen said. “As I watched the coverage of election night, I felt we had done Wabash proud by participating in a cornerstone of democracy.”
After several weeks of training on polling methodology and best practices, the students worked in two teams to develop probability sampling strategies and write questionnaires, which were administered online Oct. 26–28.
A combined total of more than 1,100 alumni participated in the two polls, each of which had a margin of error of less than 5 percentage points at a 95% confidence level.
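That margin-of-error figure can be sanity-checked with the standard formula for a simple random sample. As a rough illustration, assume the 1,100-plus respondents split evenly, giving each poll about 550 responses; the exact per-poll counts are not stated in the article.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Worst-case margin of error for a simple random sample of size n.

    p=0.5 maximizes p*(1-p), giving the most conservative estimate;
    z=1.96 is the critical value for a 95% confidence level.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical per-poll sample size of ~550 (even split of 1,100+ respondents):
moe = margin_of_error(550)
print(f"{moe * 100:.1f} percentage points")  # about 4.2
```

A sample of roughly 550 yields a margin of error near 4.2 percentage points, consistent with the article's "less than 5 percentage points at a 95% confidence level."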
“Response rate is a challenge for any poll, and we were far from 100%,” Gelbman said. “But by and large, the feedback from alumni has been really positive. While the students had to learn from the criticism they received, we also had a lot of messages from alumni saying they really liked being engaged with the College in this way.”
Even before the polls were sent out, several alumni held half-hour Zoom meetings with students to work through their polls and figure out how some questions might be worded better.
While students were excited and anxious to see how accurate their polls were for the presidential race as well as Senate races, Gelbman’s class is more about the concept of polling than about predictions. They learn how pollsters do their jobs, how to design good questions, and how to evaluate the results.
“My goal is always that they come away from this course as really savvy consumers of polling information,” Gelbman said. “There are polls on all sorts of topics—both political and non-political—and I really hope that, at the very least, they come out of this class being able to read them really critically. And then I hope they share that understanding with those around them.”
Gelbman admitted there has been problematic polling in the past, especially in 2016. Coverage of those issues bred distrust of the process, which likely depressed response rates for polls in this election cycle.
“There are a lot of people who assume polling is inherently biased,” Allen added. “I think the general public should know that, first and foremost, public opinion polling is a science, and this science in particular has no agenda. Just because the polls don’t necessarily line up with the actual election results doesn’t mean there’s some inherent bias in the system.”
Gelbman said it is still too early to say how well—or how poorly—the polls this year did because votes are still being counted in many states, and a full count is needed to know the exact margins by which candidates won or lost.
However, when looking at races like the Senate race in Maine, she said it seems clear that some polls “plain got it wrong.”
“I think we’ll see a pretty intensive effort from the polling industry to understand why they got certain things wrong this year and figure out where to go from here,” Gelbman said. “Do you keep doing what you’re doing? Do you make some tweaks? Or does election polling need to be done fundamentally differently—or maybe not at all?”