I just happened to google “facebook and sociology” and I couldn’t help but click on this. In this class we spent quite a bit of time discussing surveys: how to make them, the best questions to ask, and how to ask questions so that we collect the data we really want. It seems that making surveys is quite fashionable nowadays…this article talks about “DIY surveys.” If nothing else, it is interesting to see how what we are learning in the classroom is used every day…


Scroll down:


How To Make People Tell The Truth

Wed Apr 27, 2011
DIY survey platforms make constructing questionnaires easy, but the results could be biased, contradictory, or deeply misleading.
Online surveys often have to compete for attention against the backdrop of Netflix, Gmail alerts, and 25 open browser tabs. The minimal cognitive effort given to answering questions may exacerbate all the problems that lead to biased or outright distorted results.

As Facebook adds polling features and SurveyMonkey acquires Wufoo, a popular document form builder, amateur surveying has a big future. So we asked a survey expert at the University of Michigan’s Institute for Social Research, Professor Michael Traugott, how to make questionnaires that get at precisely the data we’re digging for.

Leading Questions

Perhaps the biggest survey no-no is poor wording. Minor adjustments in a question can often produce enormous differences. For example, one study found that for the question “Should divorce in this country be easier to obtain, more difficult to obtain, or stay as it is now?” placing “more difficult” at the end of the question caused an 11% difference in responses.

In another study, for the question “Do you think the United States should forbid public speeches against democracy?” replacing the word “forbid” with “allow” caused a 26% increase in respondents’ support for free speech (because individuals, on average, have an aversion to forbidding rights). In other words, because respondents don’t take the time to think about the substance of a question, wording matters.
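The forbid/allow finding comes from what survey researchers call a split-ballot experiment: each respondent is randomly shown one of the two wordings, so any gap in the answers can be attributed to wording alone. As a rough illustration (not from the article — the function names and data layout here are hypothetical), the mechanics of such an experiment can be sketched in a few lines of Python:

```python
import random

# Two wordings of the same question, paraphrasing the forbid/allow example.
WORDINGS = {
    "forbid": "Should the United States forbid public speeches against democracy?",
    "allow": "Should the United States allow public speeches against democracy?",
}

def assign_wording(rng=random):
    """Randomly pick which wording a respondent sees (random assignment
    is what lets us blame any difference on the wording itself)."""
    return rng.choice(sorted(WORDINGS))

def tally(responses):
    """Count yes/no answers per wording variant.

    `responses` is a list of (variant, answer) pairs, e.g. ("forbid", "yes").
    Returns {variant: {"yes": n, "no": m}}.
    """
    counts = {v: {"yes": 0, "no": 0} for v in WORDINGS}
    for variant, answer in responses:
        counts[variant][answer] += 1
    return counts
```

Comparing the “yes” share across the two variants after enough respondents would reveal a wording effect like the 26% gap described above.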

One way to get around bias, says Traugott, is to use balanced wording: “Some people think A, while some people think B; how about you?” Be as explicit as possible about the wide range of beliefs that exist; otherwise, an individual’s sheep-like proclivities kick in.

Second, try not to attach an authority’s name to a question, such as, “The Supreme Court recently decided X; do you agree?” Individuals, especially lazy ones happy to pass off the heavy thinking to someone else, will give extra weight to authority figures who may know more than they do.

Pre-test, Pre-test, Pre-test

Traugott says it’s a mistake for people to believe “that they can write these questions and get them correct the first time.” Testing out a survey on a few close friends may reveal enormous gaps in understanding. More sophisticated pretesting may require iteratively improving a question with different sources until a string of unique testers gives the same interpretation to the same question. If that’s too cumbersome, asking someone in a nearby cubicle or over Facebook chat may still lead to big improvements.

In pre-testing, one of the red flags to look out for is response categories that don’t allow respondents to answer how they truly feel. For instance, Traugott recommends adding a “Don’t Know” option if the question relates to an issue for which a concrete opinion doesn’t exist. When a respondent agrees to an interview they may feel a sort of “social contract” to answer questions, “even if they haven’t thought very much about it.” If a pre-test is done correctly, a respondent who does not have a solid opinion will tell you so and the questionnaire can be adapted accordingly.

Stand On The Shoulders Of (Survey) Giants

Fortunately, you don’t need a PhD in qualitative sociology to make a good survey; Traugott recommends looking at survey archives to scout out what others have tried. Specifically, iPoll and the Pew Research Center have solid archives. For more academic questions, or more in-depth organizational, social, or political research, try the Interuniversity Consortium for Political and Social Research (ICPSR).

Hypothesize Through Questions

Ultimately, all questions begin with a hypothesis about the world. Pollsters ask about President Obama’s approval ratings after the State of the Union because they suspect an eloquent speech might boost his likability among conservatives. A manager may ask workers if they enjoy their job because he or she fears low organizational morale.

Therefore, Traugott recommends adding in questions that unearth the cause of an answer. Pollsters should ask which party a respondent is affiliated with; a manager might ask a worker how long they’ve been at their job. Without these additional variables, we’re left in the dark, unable to prove why the results turned out a certain way.

SurveyMonkey, Facebook, and other DIY survey platforms are digital siren songs, tempting us to bang out a quick survey over a lunch break simply because it’s possible. But, as the old statistics adage goes, “garbage in, garbage out.” Collecting accurate data takes revision, investigation, and forethought.

Follow Greg Ferenstein on Twitter. Also, follow Fast Company on Twitter.


About kelirosa

I am finishing up my degree in Sociology at Hunter College, and I signed up for WordPress because my professor wants me to. Truth be told, I have no idea what this is...

One response

  1. Binh says:

    A perfect subject for the blog- good work!