You have got to be kidding.
On November 8, City of Arlington residents will vote on a
divisive ballot measure to finance the proposed $1 billion Texas Rangers
stadium. Meanwhile, campaign stakeholders have released a series of poorly
designed, automated, low-cost polls to measure public opinion on this
important issue. What’s wrong with this picture?
With such high-stakes consequences, one would assume that
poll sponsors would want to support their campaign advocacy with a high-quality
poll conducted by a polling company with a recognized track record. Poll
sponsors may try to stack the deck to support their campaign objectives, but a
reputable pollster with a good track record would not knowingly bias a study. Not
everyone who conducts opinion polls, however, is a reputable pollster. Indeed, the “shadow” polling industry
includes many telemarketing firms, call centers, and political operatives that
have little or no training in survey practices or ethical conduct and are usually
not active in professional polling organizations.
The sponsored polls have reported different results,
used varying methodologies, and were conducted by polling firms of varying
reputation. Only one of these polls
-- conducted by DHC Data -- has been subjected to critical review by survey
experts in local news stories, and it was judged to be of questionable quality. Interestingly, the polls sponsored by the Say
Yes campaign and WFAA/Fort Worth Star-Telegram have not been critically
analyzed by survey experts in local news reports. Because the results of these polls
are likely to influence the voting behavior of Arlington city residents, I believe
that each of these polls requires some scrutiny as well. The reputation of a
pollster is clearly important, but not as important as their polling methodology
in a particular study.
I reviewed only one online report for the poll sponsored by
WFAA and the Fort Worth Star-Telegram, while relying on published news reports
regarding the methodology of the other three polls. I discovered shortcomings in
all polls, and would like to share my thoughts on their implications for
polling accuracy and voting outcomes. My only objective here is to educate the
public about good and bad polling practices -- topics that I usually address in
classes that I teach on survey research methods, mass communications research,
and statistics. In addition, the information discussed should provide some help
in deciding which poll deserves more of the public’s confidence.
1. Sample Selection: Each of the polls reported that its target audience included likely
voters in the City of Arlington. However, only one of the pollsters -- Public
Opinion Strategies -- sampled both landline and cell phone households, using live
interviewers to dial cell numbers manually as the FCC requires, an approach
likely to capture a more representative sample of voters. DHC Data (for Save Our Stadium), by contrast,
relied exclusively on landline phones while Survey USA (for WFAA/Star-Telegram)
relied primarily (76%) on landline phones and less on mobile phones. Good
survey practice suggests that pollsters rely less on landline telephones
because landline penetration has declined significantly in recent years and
landline samples skew toward older residents. A recent study by the Pew Research
Center explains the wisdom of placing more reliance on cellphone households in
telephone-based surveys:
“Samples of adults reached via cellphone are much more
demographically representative of the U.S. than samples of adults reached via
landline. Consequently, replacing landline interviews with cellphone interviews
reduces the degree to which survey data need to be weighted to be
representative of U.S. adults. This in turn improves the precision of estimates
by reducing the margin of sampling error. Perhaps not surprisingly, one
major survey was recently redesigned to feature 100% cellphone interviewing.” (“The Twilight of Landline Interviewing,” Pew
Research Center, August 1, 2016)
Thus, studies that rely primarily on
landline telephone households may be “stacking the deck” by placing more weight
on the opinions of older residents than the opinions of residents that depend
more on cell phones, such as younger and ethnic minority residents.
2. Exclusion of Demographics: Without demographic
information about the poll respondents, it is difficult to know how well the poll
respondents represented the voting community. There is no good reason to hide
this information other than to avoid scrutiny by other experts. Each of the
studies tells us that its target audience was likely voters in the City of
Arlington, but only one of the polls (WFAA/Star-Telegram) provided demographic information
for the respondents that could influence the survey outcomes – such as race,
gender, and age. For pollsters that do
not disclose demographic information, we are left to wonder whether their polls
over- or under-represented particular segments of the community, which could
misrepresent the polling results. None of the pollsters reported whether their
polling results were weighted or adjusted to reflect the demographics of the voting
community in the City of Arlington.
3. Questionnaire Content: Survey experts interviewed in
news stories had mixed opinions about the one poll they reviewed (Save Our Stadium),
pointing to problems such as leading questions and questions so long they would
tax any respondent’s memory. Campaign representatives on both sides have pointed
to incomplete or misleading descriptions of the ballot measure as well.
4. Data Collection Approach: With the exception of
Public Opinion Strategies (POS), the two other polling firms (DHC Data and
Survey USA) opted to use the cheapest and least credible data collection
approaches to collect opinions on this divisive issue: pre-recorded, automated
telephone calls instead of live telephone interviews. Automated calls have little
credibility in the polling industry because they remove human contact and provide
no opportunity for clarification when respondents are confused.
Automated telephone calls are often rejected by residents because they are associated
with telemarketing firms that often annoy the public. Polling firms employ
automated calls when they have limited time available, have a limited budget to
fund live telephone interviews, or have limited resources to use live
interviewers. Because FCC regulations prohibit automated calls to cell phone
users unless they are manually dialed, polls using automated methods exclude
nearly half of community residents who have only wireless devices but no
landline telephones – a practice that systematically excludes younger residents and
ethnic minority groups.
5. Language Offered: Based solely on news
reports about these polls, it appears that none of the pollsters offered a
language other than English to collect their data. Why is this important? Hispanics comprise 29 percent of Arlington
city residents, while 36 percent of Hispanics are foreign-born and primarily
Spanish-speaking. Our past experience
shows that 50 to 63 percent of Hispanics will prefer a Spanish-language
interview because they find it easier to express their opinions. Unless their
presence in the voting community is minimal, it makes little sense to exclude
this strong base of baseball fans by offering only one language. Indeed, support
for the new stadium is likely understated by this exclusion.
6. Pollster’s Reputation: The reputation of the
polling companies was also discussed in news reports. In my opinion, Public Opinion Strategies utilized
the most credible polling methodology since all interviews were conducted by
telephone with live interviewers, their two polls included landline and cell
phone households, and the company has a long history of public opinion polling. DHC Data, however, was characterized in news
reports as having a low visibility, no web site, and questionable experience as
a pollster. Its owner, however, claims to have conducted several polling
studies in past years. SurveyUSA, which conducted the WFAA/Fort Worth Star-Telegram
poll, was also described as having a solid polling history. Interestingly, survey
experts scrutinized only the poll
conducted by DHC Data, while the polls conducted by the other two polling firms
received praise for their track records but little criticism of the polling
techniques used in the Texas Rangers campaign.
It is risky to exempt a pollster’s practices from scrutiny
simply because the firm has a strong reputation.
In summary, the most recent polling results are summarized below:

· Save Our Stadium poll by DHC Data: 38% support, 46% oppose, 16% undecided
· Say Yes polls by Public Opinion Strategies:
  o Sept. 23-25: 54% support, 40% oppose, 6% undecided
  o Oct. 14-15: 56% support, 37% oppose, 7% undecided
· WFAA/Fort Worth Star-Telegram poll by SurveyUSA: 42% support, 42% oppose, 16% undecided
Ultimately, the election scheduled for Nov.
8 will be the final word on which pollster
provided the best picture of how Arlington residents feel about the Texas
Rangers Stadium issue. Based on the
information evaluated thus far, I believe that the polling results by Public
Opinion Strategies for the Say Yes campaign – 54-56 percent supporting the
stadium referendum – present the most accurate picture of the actual voting
outcome. Why? Primarily because they used human beings to conduct
the interviews and included both landline and cell phone residents in their
study. The poll was not without its own shortcomings: it did not report the
respondents’ demographic attributes, offered no Spanish-language option, and may
have under-represented younger voters to the extent it leaned on landline
households. Nonetheless, I believe that their polling
practices and results are more deserving of the public’s confidence in
comparison to the other polls.