
Methods Matter: Phone vs Web Surveys



Over the past year, Pew Research conducted a large survey in the United States to examine how responses differ between people who answered questions in a telephone interview and people who completed web questionnaires without any personal interaction. You can find the full article HERE, but I will run over the general findings below.

The results were fairly close overall, but across the 60 questions asked there was a mean difference of 5.5 points and a median difference of 5 points between the two methods.
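To make the arithmetic concrete, here is a minimal sketch (with made-up per-question numbers, not Pew's data) of how such summary figures are computed: take the absolute phone/web gap for each question, then take the mean and the median of those gaps.

```python
import statistics

# Hypothetical per-question estimates in percent, NOT Pew's actual numbers:
# (phone_estimate, web_estimate) for each question asked in both modes.
question_estimates = [
    (62, 55),  # e.g. "very satisfied with family life"
    (48, 54),  # e.g. "politician X is doing a poor job"
    (41, 36),  # e.g. "group Y faces 'a lot' of discrimination"
    # ... one pair per question; the real study asked 60
]

gaps = [abs(phone - web) for phone, web in question_estimates]
print("mean mode difference:  ", statistics.mean(gaps))
print("median mode difference:", statistics.median(gaps))
```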

However, there were particular areas where the differences were most noticeable.

1) Web surveys produced more negative responses about politicians.

2) Phone respondents were more likely to say that certain groups of people, e.g., gays and lesbians, Hispanics, and Blacks, faced "a lot" of discrimination.

3) Phone respondents were more likely to say they're happy with their family and home life than web respondents.

Why does this happen?

Pew Research points to something that social researchers have known for a long time: people are more likely to give responses that paint them in a better light when they are face-to-face (or, in this case, voice-to-ear) with an interviewer. We call this "social desirability bias" because respondents give another human being the answers they believe the interviewer will find more socially acceptable. In these interactions between interviewer and respondent, people tend to choose responses that they believe will make the interviewer less uncomfortable. Pew notes, for example, that black respondents are less likely to say they face "a lot" of discrimination when responding to a live interviewer than on a web survey (and, I might add, even on written surveys). Interviewer-administered surveys can lead respondents to self-censor, whereas online and written questionnaires take the personal interaction out of it.

In the end, these results don't point to anything that social researchers haven't known for decades. They simply corroborate the notion that social interaction can alter responses, particularly on sensitive topics where respondents believe their true answers may make the interviewer uncomfortable or may be socially unacceptable. Researchers need to consider the types of questions they are asking when deciding between phone interviews and impersonal online or written surveys, though these methods also carry a host of other considerations not covered here.

Link to comment
Share on other sites


The implicit assumption in the OP is that people are being more 'truthful' when not interacting with a person. I don't see any reason to make such an assumption. It is likely easier for people to lie on web forms than it is for them to tell the lie to a real person.

Link to comment
Share on other sites

The implicit assumption in the OP is that people are being more 'truthful' when not interacting with a person. I don't see any reason to make such an assumption. It is likely easier for people to lie on web forms than it is for them to tell the lie to a real person.

The reason is self-evident. What reason would someone have to lie in a completely anonymous online web form? Face-to-face interactions are not anonymous even when confidential. You're still dealing with another human being, so there's an opportunity for face-work: making the other person feel comfortable by giving the answers you, as the respondent, think they want to hear or would make them less uncomfortable, as well as giving responses that make you, as the respondent, appear more socially acceptable. What's not clear is why someone would lie on an anonymous web form, and what's even less clear is why the numbers show significant differences precisely on the sensitive topics, in line with the theory that people give responses that save face. What you're calling an "implicit assumption" is actually an explanation for the empirical differences between the surveys that's based on existing theories of social interaction. If you have an alternative theory that would explain why the numbers are as they are but argues that people are more likely to lie in a completely anonymous situation and the face-to-face responses are their "true" responses, then I would love to hear it because intuitively and theoretically I can't think of any.

Edited by cybercoma
Link to comment
Share on other sites

The reason is self-evident.

No, you *assume* it is self-evident. That is simply a reflection of your biases and preconceptions.

What reason would someone have to lie in a completely anonymous online web form?

A desire to manipulate the survey by giving untruthful answers that support personal prejudices.

What you're calling an "implicit assumption" is actually an explanation for the empirical differences between the surveys that's based on existing theories of social interaction.

I doubt it. I suspect that if you dug into the research you would find little more than the same assumptions that you make.

If you have an alternative theory that would explain why the numbers are as they are but argues that people are more likely to lie in a completely anonymous situation and the face-to-face responses are their "true" responses, then I would love to hear it because intuitively and theoretically I can't think of any.

As I said: people filling out forms have more time to answer and will likely weigh the impact of different responses. They are more likely to give responses that support their personal prejudices in an online form even if those answers are untruthful.

Edited by TimG
Link to comment
Share on other sites

You've offered really nothing in the way of a suitable competing explanation. You're just saying people will lie on web surveys. Clearly the responses have a distinct pattern here that is explained by theories about "social desirability bias." Rather than provide a competing explanation, you're just saying that people will lie on web forms. Why would it be the web forms that are off, and why would it be this specific pattern?

You write,

They are more likely to give responses that support their personal prejudices in an online form even if those answers are untruthful.

If the responses support their personal prejudices, how are those untruthful?

Nothing you are writing here makes any sense whatsoever. Just give me a clear reason why the web forms would be a lie while the face-to-face ones are the truth, taking into account that the pattern of responses fits perfectly with the social desirability bias theory. If your explanation fits better, then perhaps you have something that overturns Pew's research here. Right now, you're just saying, "Yeah but the opposite could be true!" without actually explaining how or why it would be true. I see no reason to believe that people would be more likely to lie online than offline. They're lying somewhere, since the answers don't match up between the groups, and the mismatches fall on the particular questions predicted by the social desirability bias theory. Give us a better theory that would suggest Pew Research got its conclusion completely backwards.

Link to comment
Share on other sites

Also, I'm not entirely certain what you're arguing. Maybe I'm wrong in assuming that you're arguing that their conclusions are backwards and that the false responses were given in the web forms on sensitive questions, as opposed to in the phone interviews. Pew Research says the methods matter because there's a difference in results. The reason there's a difference can be explained by the social desirability bias theory. Are you instead saying that there is no difference between phone interviews and web forms? Because that difference was shown using empirical evidence. You also seem to be appealing to some sort of respondent bias, but it appears you're suggesting that it's the web responses that are more biased than the phone ones. I'm trying to understand how and why that would be, since Pew Research's explanation is founded on a social theory that has been tested and demonstrated not only in this study, but in others.

Link to comment
Share on other sites

Clearly the responses have a distinct pattern here that is explained by theories about "social desirability bias."

An explanation that says only that more socially desirable responses are given. The theory does NOT show that the more socially desirable responses are necessarily less truthful.

If the responses support their personal prejudices, how are those untruthful?

What is truth, then? It matters when you are dealing with objective, verifiable facts, but when you are asking for people's opinions the "truth" is very malleable. That is why push polls are able to elicit the desired responses by providing context that manipulates people into answering a certain way. If people give different answers when asked in different ways, you cannot assume that any are more "truthful" than others. In many cases, the answers could vary depending on the day, the weather, or numerous other unrelated factors. The simple fact that voice polls are done when someone interrupts you and web polls are done at your convenience could be enough to change the results.

I agree that context matters, and it does not surprise me that a voice survey would get different answers. My only point is that it is wrong to assume that "different" means one is "less truthful" than the other.

Edited by TimG
Link to comment
Share on other sites

An explanation that says only that more socially desirable responses are given. The theory does NOT show that the more socially desirable responses are necessarily less truthful.

Then give me a reasonable explanation for people lying consistently to appear less socially desirable that is at least as likely as the explanation that people in face-to-face interaction attempt to appear more socially desirable.

What is truth, then? It matters when you are dealing with objective, verifiable facts, but when you are asking for people's opinions the "truth" is very malleable. That is why push polls are able to elicit the desired responses by providing context that manipulates people into answering a certain way. If people give different answers when asked in different ways, you cannot assume that any are more "truthful" than others.

Of course you can; it's outlined in the methodology. There's a consistent bias in the responses, and it shows up on particular types of questions. The goal here is to explain the disparity in the responses. They're not both right.

In many cases, the answers could vary depending on the day or the weather or numerous other unrelated factors.

They could, but those factors would affect both modes alike and cancel each other out between the web and phone surveys.

The simple fact that voice polls are done when someone interrupts you and web polls are done at your convenience could be enough to change the results.

It could, but the question is how that would change the results in a way that's consistent with the findings in the OP.

I agree that context matters, and it does not surprise me that a voice survey would get different answers. My only point is that it is wrong to assume that "different" means one is "less truthful" than the other.

Different does mean one of them is less truthful here, since it's the same survey and the differences are consistent and statistically significant.
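To make "statistically significant" concrete, here is a minimal sketch of the kind of two-proportion test that checks whether a single phone/web gap is larger than sampling noise would produce. The sample sizes and percentages are assumptions for illustration, not figures from the Pew article.

```python
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided z-test for the difference between two independent proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)               # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))   # standard error of the gap
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 * upper normal tail
    return z, p_value

# Hypothetical: 62% of 1,500 phone respondents vs 55% of 1,500 web respondents
z, p = two_proportion_z(0.62, 1500, 0.55, 1500)
print(f"z = {z:.2f}, two-sided p = {p:.4f}")
```

With samples that large, a gap of that size is far too big to chalk up to sampling error alone.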
Link to comment
Share on other sites

Of course you can; it's outlined in the methodology. There's a consistent bias in the responses, and it shows up on particular types of questions. The goal here is to explain the disparity in the responses. They're not both right.

A conclusion not supported by the data. Showing a consistent bias does not show that one is more truthful than the other. All it does is show a consistent bias. The implicit assumption that the phone surveys are less truthful is nothing but an assumption that is not supported by the data itself.

Edited by TimG
Link to comment
Share on other sites

A conclusion not supported by the data.

Agreed... the Pew Research Center studied its own methodology and found that only 1 in 10 Americans even bother to respond to such surveys. Telephone response rates have been sliding for years, for lots of reasons, including the loss of landlines. Comparing results to web surveys is problematic at best.

http://www.slate.com/articles/news_and_politics/politics/2012/05/survey_bias_how_can_we_trust_opinion_polls_when_so_few_people_respond_.single.html

Link to comment
Share on other sites

A conclusion not supported by the data. Showing a consistent bias does not show that one is more truthful than the other. All it does is show a consistent bias. The implicit assumption that the phone surveys are less truthful is nothing but an assumption that is not supported by the data itself.

Again, give a reasonable explanation for the consistently biased results. You're pissing in the wind here.
Link to comment
Share on other sites

Agreed... the Pew Research Center studied its own methodology and found that only 1 in 10 Americans even bother to respond to such surveys. Telephone response rates have been sliding for years, for lots of reasons, including the loss of landlines. Comparing results to web surveys is problematic at best.

http://www.slate.com/articles/news_and_politics/politics/2012/05/survey_bias_how_can_we_trust_opinion_polls_when_so_few_people_respond_.single.html

Nate Silver has shown that they're still a reliable measure. What's your argument here?
Link to comment
Share on other sites

People say different things when they are speaking with a real live person than when they are writing/clicking through an anonymous web form... duh?

As to what's more truthful... the web form presumably better reflects what the person would write/think when they are anonymous, while the voice call presumably better reflects what the person would say/do when interacting with other people. Neither is necessarily more truthful; the different cases simply reflect different situations.

Link to comment
Share on other sites

People say different things when they are speaking with a real live person than when they are writing/clicking through an anonymous web form... duh?

As to what's more truthful... the web form presumably better reflects what the person would write/think when they are anonymous, while the voice call presumably better reflects what the person would say/do when interacting with other people. Neither is necessarily more truthful; the different cases simply reflect different situations.

But when they're asking the same questions, there is a right answer for what that individual believes. When the responses are consistently biased, it shows that in one of the methods the answers are less precise. The imprecision grows on questions that may provoke responses that are not socially acceptable. That's all shown in the numbers. If you give a different answer to the same question when you're asked by a person, as opposed to an anonymous, impersonal web poll, then one of those responses is imprecise. Which one? Tim says the phone survey, but can't back that up with any reasonable theory. Pew argues the web form is more precise because the bias grows when the responses might make the interviewer uncomfortable or be more socially unacceptable. That means that people self-censor in face-to-face interaction.

Why Tim is arguing against this is beyond me. It's self-evident that social interactions cause people to self-censor. The way you talk with your buddies at the bar isn't the way you talk to your boss, and isn't the way you talk to your kids. There are social forces at work that actually shape respondents' answers, and that can be seen in the data. It's pretty interesting seeing how this intangible force of social interaction can sway so-called objective numbers.
Link to comment
Share on other sites

Pew is less and less a "reliable measure" because of the methodology used to canvass people, changing demographics, and other biases.

Web based polling introduces its own set of issues. Comparing the two is interesting but ultimately pointless because of the underlying biases.

You're not actually saying anything here, and certainly nothing related to the OP. They measured the difference in responses between the two methods and found a disparity. They've provided an explanation for that disparity grounded in social interactionism.

What you're talking about is reliability based on sampling. Polling methods have adapted over the years and, like I said, Nate Silver's predictions have demonstrated their reliability.

In the OP the methods were compared in a controlled test. You can read more about the way they did this study at the link, though it has nothing to do with the sampling bias issue you're talking about.

Link to comment
Share on other sites

Even if one only wishes to focus on the "social interactionism" aspect of telephone polling or direct interviews, there are still problems with the methodology, access, demographics, etc. "Social" pollsters are often confronted with annoyed respondents who have another agenda, and their impact on results cannot always be detected and compensated for.

Link to comment
Share on other sites

Tim says the phone survey, but can't back that up with any reasonable theory.

I provided a reasonable theory - you just reject it because it is not what you want to believe. In any case, I am saying the same as Bonam: no assumptions can be made about the relative accuracy, only that the results will differ.

It's self-evident that social interactions cause people to self-censor.

Social interactions also encourage people to tell the truth, since it is a lot easier to lie to a computer than to another human, especially if participation in a web poll is completely anonymous. Your assumption that web polls are more accurate is nothing but an assumption. It is not actually supported by the data you quote.

Edited by TimG
Link to comment
Share on other sites

Nate Silver has shown that they're still a reliable measure. What's your argument here?

Nate Silver used to be a fairly credible source, but he caved in to the buffoons who did not like Roger Pielke Jr.'s column on his site. This demonstrates that he thinks reporting good analyses of data is less important than keeping the left-wing politically correct mob happy, and it unfortunately means that everything he says is likely biased by what he thinks that mob wants to hear.

Edited by TimG
Link to comment
Share on other sites

I provided a reasonable theory - you just reject it because it is not what you want to believe. In any case, I am saying the same as Bonam: no assumptions can be made about the relative accuracy, only that the results will differ.

Of course assumptions can be made about the accuracy. The numbers show that people provide different responses to the same questions. That speaks to the accuracy of the measure. This is basic stuff. If you ask the same question but get different responses from a random sample of people, and the only thing that changed was the method, then it's obvious that the method is affecting the responses. That means the responses from one of the methods are inaccurate, in this case by about 5 to 5.5 points on average, with larger disparities for socially contentious topics. Pew Research explains why, but, oddly, you reject that explanation.

Your assumption that web polls are more accurate is nothing but an assumption. It is not actually supported by the data you quote.

You keep saying that, and you're wrong. It's self-evident that people would self-censor around others. You say you've provided a reasonable theory, but you have not. You say that it's just "easier" to lie on a web poll. There's absolutely no evidence for that. It's just as easy to lie to an interviewer who asks you particular questions and asks you to choose among a number of responses. Further still, you've provided no explanatory theory as to why someone would prefer to lie on a web poll versus a phone survey. "It's easier" is not even remotely as convincing as Pew's argument that people self-censor in social situations. So try again. Give me some compelling reason why a person would lie on a web form instead of the phone interview, bearing in mind that the disparities appear on particular kinds of questions. Other questions line up much more closely; it's the socially contentious topics that have the greatest disparity. Arguing that "it's just easier" to lie on the web form doesn't account for that at all. That is to say, it does NOT fit with the data, despite your repeated insistence that Pew's theory doesn't fit the data.

What I want to know is why it's so difficult for you to accept that people censor themselves around others. It explains the patterns in the data and it provides valuable information about how to get people's honest opinions on various topics. Sometimes phone surveys are fine, but other times a web form might be more appropriate. The times that impersonal polling is more appropriate are those situations where people might censor themselves around others. That makes sense, and it's instructive for future research and polling. Your argument that "it's just easier to lie on the web" tells us nothing and doesn't account for the disparities in the responses. Not only is it just as easy to lie to an interviewer, but people are also encouraged to lie to an interviewer by social pressure that doesn't exist to the same extent in an impersonal questionnaire.
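A rough sketch of the kind of check being described here, in case it helps: if social desirability bias is the driver, the phone/web gap should be noticeably larger on sensitive questions than on neutral ones. Every number and every "sensitive" label below is hypothetical.

```python
import statistics

# Hypothetical data: (question, sensitive?, phone %, web %)
questions = [
    ("politician doing a good job",         True,  55, 47),
    ("group faces a lot of discrimination", True,  45, 38),
    ("very satisfied with family life",     True,  88, 80),
    ("own a car",                           False, 71, 70),
    ("follow the news daily",               False, 52, 49),
]

gaps_by_group = {True: [], False: []}
for _, sensitive, phone, web in questions:
    gaps_by_group[sensitive].append(abs(phone - web))

print("mean phone/web gap, sensitive questions:", statistics.mean(gaps_by_group[True]))
print("mean phone/web gap, neutral questions:  ", statistics.mean(gaps_by_group[False]))
```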

Edited by cybercoma
Link to comment
Share on other sites

Nate Silver used to be a fairly credible source, but he caved in to the buffoons who did not like Roger Pielke Jr.'s column on his site. This demonstrates that he thinks reporting good analyses of data is less important than keeping the left-wing politically correct mob happy, and it unfortunately means that everything he says is likely biased by what he thinks that mob wants to hear.

What are you even talking about? What does this have to do with the data and methods? More importantly, what does this have to do with the accuracy of his predictions?

As off topic as it is, why don't you apply the same amount of skepticism to extreme right-wing sources? If you want to check biases, maybe you should start by checking your own.

Link to comment
Share on other sites

TimG, I think, has done this in the past. He's just skeptical of social science, to my memory, especially when it disagrees with his gut feelings about how the world is. Again - this is just my memory of how he operates.

The thing that I think people are missing in this discussion is that web-based polling doesn't necessarily mean an open web poll such as we have here. They use targeted polling, whereby the polling firms email respondents who have signed up to receive consumer surveys.

The trick is, as BC points out, that in the new post-landline world you have to factor in demographics, access and so on. This is no easy task. That's why, from around 2000 onward, polls started having issues. When I took polling and statistics in university, decades ago, we were told about the difficulties in 'the old days' before the white pages provided a great source for random sampling. Once landlines and white pages started dying, statisticians had to relearn how to sample, and it didn't go well.

What I find supremely dumb is that those who do well at it and those who do poorly at it aren't well differentiated by the press or the public. You'd think people would care about such things. Obama's re-election provided the odd spectacle of a Republican 'strategist' - Karl Rove - doubting the results AS THEY WERE REPORTED by Fox News!

This, I guess, means that we're never as smart as we think we are.

Link to comment
Share on other sites

The thing that I think people are missing in this discussion is that web-based polling doesn't necessarily mean an open web poll such as we have here. They use targeted polling, whereby the polling firms email respondents who have signed up to receive consumer surveys.

Yes. Good point. Thanks for clarifying that.

The trick is, as BC points out, that in the new post-landline world you have to factor in demographics, access and so on. This is no easy task. That's why, from around 2000 onward, polls started having issues. When I took polling and statistics in university, decades ago, we were told about the difficulties in 'the old days' before the white pages provided a great source for random sampling. Once landlines and white pages started dying, statisticians had to relearn how to sample, and it didn't go well.

This is handled through weighting techniques that are based on demographics. We understand our country's demographics through the census. That's one of the many reasons Harper's attack on the long-form census is such a huge problem. These polls inform policy, and if we don't have an accurate accounting of demographics, the weighting used in these polls will be inaccurate. But then maybe if you're more interested in governing by ideology, that's the intent. You don't want an accurate picture that contradicts your worldview.
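For anyone unfamiliar with how that weighting works, here is a minimal sketch of post-stratification on a single demographic variable. The age brackets, census shares, and sample composition are assumptions for illustration only.

```python
from collections import Counter

# Assumed census targets and an assumed, deliberately unrepresentative sample.
census_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
sample_ages = ["18-34"] * 20 + ["35-54"] * 35 + ["55+"] * 45

counts = Counter(sample_ages)
n = len(sample_ages)

# weight = population share / sample share for each cell, so respondents from
# under-represented groups count for more in the weighted estimates.
weights = {age: census_share[age] / (counts[age] / n) for age in census_share}

for age, w in weights.items():
    print(f"{age}: weight {w:.2f}")   # 18-34 -> 1.50, 35-54 -> 1.00, 55+ -> 0.78
```

Real polls weight on several variables at once (age, region, education, and so on), which is exactly why accurate census targets matter.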

What I find supremely dumb is that those who do well at it and those who do poorly at it aren't well differentiated by the press or the public. You'd think people would care about such things. Obama's re-election provided the odd spectacle of a Republican 'strategist' - Karl Rove - doubting the results AS THEY WERE REPORTED by Fox News!

Another good point. The media should spend more time discussing methodologies and what makes for more and less accurate polls. However, most people would fall asleep at any public discussion about methods. They just want the answer in a sentence or two, not the discussion, the implications, and comparisons.

The Pew study here is important because it shows that web polling can be an effective tool in certain circumstances, which goes against the often-held belief that web polls are entirely useless and biased. It shows that they can capture beliefs that people hold but might censor publicly. It also gives us a reason to be skeptical of phone-based survey results that ask the kinds of questions people would self-censor in public discussions. We can look at the results of this study and ask ourselves how the polling might differ if it were done through an impersonal method. IMO, it's important to understand how the methods affect the responses.

Link to comment
Share on other sites
