The 2016 presidential election is well under way and understanding “the Latino vote” will be more important than ever. Yet the stories about how Latinos behave politically will often be contradictory, confusing, and outright nonsensical.
The media is always eager to publish stories associated with data because numbers bring a certain authority to their stories. But are these numbers right?
Recent polls have shown great differences in what Latinos think about the candidates and how they will vote. Donald Trump declared victory in Nevada when exit polls reported that 45 percent of Latinos voted for him in the Republican caucus, but researchers said those claims were vastly overstated.
An article touted by the NY Post claimed that Trump was winning over Latino Republicans, citing a poll that turned out to be based on fewer than 100 Latinos. The website Latino Rebels called out the article after discovering just how misleading the poll was.
As you read about Latino political behavior over the next few months, here are a few things you should know about Latino research and polling that will help you evaluate the polls on your own.
How many Latinos are being surveyed?
Making claims about Latinos based on 100 interviews results in a large margin of error, and if those few respondents are not representative, it can also introduce what we political science wonks call bias in the data. There are ways to reduce bias in small samples, such as being particular about who participates in the poll, but these small surveys have to be examined carefully.
A general rule of thumb is that a survey closer to 1,000 participants (and by this we mean 1,000 Latinos, not 1,000 participants with a few Latinos sprinkled in) will generally be more accurate than one that surveys just a couple of hundred Hispanics.
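To see why sample size matters so much, here is a minimal sketch of the textbook margin-of-error calculation for a simple random sample (real polls that use quotas or weighting will have somewhat larger effective margins):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95 percent margin of error for a simple random sample of size n,
    assuming an observed proportion p (p=0.5 gives the worst case)."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 400, 1000):
    print(f"n = {n:5d}: ±{margin_of_error(n) * 100:.1f} percentage points")
```

At 100 respondents the margin is nearly ±10 points; at 1,000 it shrinks to about ±3, which is why the 1,000-respondent rule of thumb holds.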
Yes, language matters
Does a Latino survey participant speak mainly English or Spanish? It matters when polling Hispanics. Surveys that under-sample Spanish-speaking respondents are likely to get different views on topics, since earlier-generation respondents, who are more likely to be foreign-born, can differ in their views of certain policies when compared to third-generation Latinos. This is particularly important when it comes to questions on immigration policy, for example.
About 60 percent of Latinos are bilingual, but other Latino respondents will only feel comfortable answering questions in Spanish or in English. Polling firms whose call centers employ bilingual interviewers have the advantage of not having to call back a respondent when they get a Spanish speaker on the phone. This increases the chance that the interview will be completed, and a common language is more likely to put the person at ease when answering questions.
What about age, gender, nationality?
Other important characteristics to consider when looking at a poll of Latinos are the age of the respondents, their country of origin, what generation they are, and their gender.
Latinos are particularly young; millennials make up almost half the eligible Latino voting population. When it comes to social issues such as gay marriage or key topics such as the environment, it is important to adequately gauge the views of the different Hispanic generations. Similarly, Latinos from different countries and regions of Latin America and the Caribbean will not have the same views on foreign policy or the role of government.
Remember Latinos' Geographic Diversity
A sample's regional mix (Southwest, Northeast, etc.), along with other demographics, should match up with the population. Cubans and Puerto Ricans are more prominent in Florida, Mexicans more so in the Southwest, and Caribbean Latinos have a greater presence in the Northeast.
Even within certain geographical regions, there has to be an awareness of differences based on neighborhood, income, and so on. For instance, Latinos in West L.A. tend to be more affluent and live in more integrated neighborhoods, which makes them less representative of California's Latinos in areas such as East L.A., or of the state as a whole.
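One standard way pollsters correct for a sample that does not match the population is post-stratification weighting: each respondent is weighted by the ratio of their group's population share to its share of the sample. The regions and percentages below are illustrative assumptions, not real figures:

```python
# Post-stratification weighting sketch: a respondent's weight is
# (population share of their group) / (sample share of their group),
# so the weighted sample matches known population proportions.
# These regional shares are made-up numbers for illustration only.

population_share = {"Southwest": 0.60, "Northeast": 0.25, "Florida": 0.15}

# A hypothetical sample of 100 respondents that over-represents the Northeast.
sample = ["Southwest"] * 30 + ["Northeast"] * 50 + ["Florida"] * 20

n = len(sample)
sample_share = {r: sample.count(r) / n for r in population_share}
weights = {r: population_share[r] / sample_share[r] for r in population_share}

for r in population_share:
    print(f"{r}: weight {weights[r]:.2f}")
```

Here the over-sampled Northeast respondents get a weight below 1 and the under-sampled Southwest respondents a weight above 1, pulling the weighted totals back toward the population mix.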
Beware the Questionnaire
Another important source of bias is the survey itself. The questions used can influence the answers given by the respondent. For instance, asking a question about immigration using the term "illegal alien" will yield different results than asking about "undocumented immigrants". Using the word "amnesty" will likewise influence the answer compared to using the term "legal pathway". Words matter, which is why there has been a quiet battle over the words we use, whether in the media or in the Library of Congress.
Not only can questions introduce bias, but the question order can influence answers. For example, if the survey asks questions about terrorism and then asks questions about immigration, the person being interviewed, even among Latinos, may frame their answer on immigration based on the prior information on terrorism. This is called "priming" and experts who understand this effect will do the best they can not to prime respondents for certain questions.
Priming can have other effects, such as raising one's awareness of their group identity. For instance, questions early in an interview that prime a respondent's identity as a Latino can influence their views of candidates and policies. Of course, pollsters can also use this knowledge to get the results they want.
Who is funding the poll?
Reputation matters, and so does the perspective of the researchers conducting the poll. From the construction of the survey instrument to the analysis, knowing who is behind the project is important information for you as you digest the findings of the poll.
Surveys are expensive, and the groups or organizations funding polls and surveys are not without their own interests. That being said, do not dismiss studies on that basis alone; that's simply lazy thinking.
The entity funding the poll should make you ask certain questions about the survey, but that in itself does not answer if the poll is accurate or not. Look at the sample size, the survey, the important demographics, and be suspicious if the outfit is not being open about their methodology.
As academics, we put a lot of trust in the methodology underlying studies and a lack of disclosure about survey methods is a red flag. Conclusions drawn from bad data - even if you agree with those conclusions - are infinitely worse than conclusions you may disagree with but are drawn from good data and methodology.
How are pollsters reaching respondents?
Advances in technology pose challenges to how we gather data on Latino voters specifically. The United States may be racially and ethnically diverse, but it is also very segregated. This means that gathering data on minorities must be deliberate, and researchers who seek to learn more about Latinos need a greater understanding of how to go out and gather data in those specific communities.
Cell phones and the internet introduce challenges of their own. Latinos have less access to the internet than whites, as do less educated populations. Latinos are also heavily reliant on cell phones, which make it easy to ignore unwanted calls from survey companies.
Surveys that rely solely on the internet can yield poorer results than those that use a variety of methods to seek out Latino respondents. Pew Research points out there have been widespread errors in certain kinds of online surveys when measuring Latino and black data and has a useful analysis of what to look for.
Challenges with Surveys
The key to a good poll is that the people being interviewed represent, as accurately as possible, the population the researcher is making claims about. Statistically, researchers quantify this uncertainty by reporting the "margin of error" in the results. In survey results you will notice the ± symbol followed by a number. Generally speaking, the lower the number, say ±3, the more precise the poll will be compared to another poll with a larger margin of error.
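As a rough sketch of what that ± number means in practice, here is the interval a reported result implies, using hypothetical poll figures and assuming the usual 95 percent confidence level:

```python
def confidence_interval(estimate, margin):
    """Range the true population value plausibly falls in,
    at the usual 95 percent confidence level."""
    return (estimate - margin, estimate + margin)

# Hypothetical numbers: a candidate polling at 45 percent support.
tight = confidence_interval(45, 3)   # a poll reporting a ±3 margin
loose = confidence_interval(45, 10)  # a small-sample poll with a ±10 margin

print(f"±3 poll:  support plausibly between {tight[0]}% and {tight[1]}%")
print(f"±10 poll: support plausibly between {loose[0]}% and {loose[1]}%")
```

The ±3 poll narrows the candidate's support to a 6-point window; the ±10 poll leaves a 20-point window, too wide to say much at all.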
Data-gathering is a tedious process. It requires participation by people who don't usually care to answer questions from strangers, and it requires that those doing the interviews are properly trained. It's also important to keep in mind the sensitivity of the topics. Views on immigration and political preferences, as well as personal questions such as household income, are tough for people to discuss openly. We also assume that the person being interviewed actually has good knowledge of what they are being asked about.
Some research has shown that the respondent's perception of the interviewer's race or ethnicity will influence answers. For example, if the interviewer is black and asks a white person a question about race, the answers can change. Similarly, a Latino interviewed in Spanish may answer differently than if asked by someone whom the respondent thinks may disapprove of the answer.
Polling is not easy, but it's got to be done right
Surveys are expensive to do accurately, and politicians often assume that their opinions are shared by everyone. Of course, this isn't a problem with just politicians, but when they are making the decisions about how much money to spend on surveys, they often do not think it's worth the expense to get the answers they are sure they already know.
Many think they already know Latinos based on media representations, stereotypes, or because they have a Latino friend. This only furthers our misunderstanding of Latino voters, and it will take great effort to correct, not only in the media but in how we govern.
As readers are inundated with information and stories about Latinos based on polls, be aware of how this data is being gathered, and keep an eye on the characteristics of the survey that the company is revealing. The more open they are about the data and methods, the more tools you will have in understanding how much you can believe the results of the survey.
Stephen A. Nuño is an Associate Professor in the Department of Politics and International Affairs at Northern Arizona University.