By Tom Isler | October 01, 2009

How satisfied are meeting planners with the feedback they get from pre- and postmeeting surveys? Choose one: Extremely satisfied, mostly satisfied, neutral or not at all satisfied.

Survey experts would immediately spot at least five things wrong with that question. Planners who can't identify those five common survey errors -- or who are simply dissatisfied with their ability to elicit actionable feedback from meetings-related surveys -- should brush up on survey best practices. That's particularly true now, given the emphasis placed on measuring meeting results and proving the value of meetings, which often is determined by attendee surveys.

Stumped? First, the question asks respondents to speculate about how "meeting planners" in general feel about surveys, rather than about their own firsthand experience. Questions that rely on such hearsay typically yield bad data and are discouraged by researchers.

Second, the question is vague. What is meant by being "satisfied" with survey feedback? Is the question asking whether planners receive positive feedback that gives them professional satisfaction (i.e., the meeting was a success)? Or is it asking if they're satisfied with the usefulness of the feedback, be it positive or negative? Questions in mail or Internet surveys in particular cannot be ambiguous, because there is no opportunity to clarify a question once the survey has been sent out.

Third, the question is "double barreled," meaning it asks about two things at once: pre- and postmeeting surveys. Imagine that a respondent is satisfied with premeeting surveys but dissatisfied with postmeeting surveys; how can he or she answer the question accurately? This should be two separate questions.

Fourth, the response scale is skewed toward positive responses. Respondents do not have the same gradation of choices to express dissatisfaction as they do to register satisfaction (there's no "mostly dissatisfied" to counterbalance "mostly satisfied"). Response scales should be balanced to get the most objective data.

Finally, the question doesn't allow respondents to indicate that they don't know the answer or the question is not applicable to them. If the question doesn't exhaust the entire spectrum of responses, some respondents will be forced to choose a disingenuous answer, spoiling the data.
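To make the fixes concrete, here is a minimal sketch (in Python, with hypothetical wording and structure) of how the opening question might be rebuilt to avoid the last three errors: it is split into two single-barreled questions, each uses a balanced scale with as many negative gradations as positive ones, and each offers "Don't know" and "Not applicable" escape options.

```python
# Hypothetical rewrite of the flawed opening question. The wording and the
# simple balance check below are illustrative assumptions, not a standard.

BALANCED_SCALE = [
    "Extremely satisfied",
    "Mostly satisfied",
    "Neither satisfied nor dissatisfied",
    "Mostly dissatisfied",
    "Extremely dissatisfied",
]
ESCAPE_OPTIONS = ["Don't know", "Not applicable"]

# Two separate questions instead of one double-barreled question.
questions = [
    {
        "text": "How satisfied are you with the usefulness of the feedback "
                "you receive from your premeeting surveys?",
        "options": BALANCED_SCALE + ESCAPE_OPTIONS,
    },
    {
        "text": "How satisfied are you with the usefulness of the feedback "
                "you receive from your postmeeting surveys?",
        "options": BALANCED_SCALE + ESCAPE_OPTIONS,
    },
]

def scale_is_balanced(scale):
    """An odd number of substantive options implies a neutral midpoint
    with equal gradations of satisfaction and dissatisfaction."""
    return len(scale) % 2 == 1 and len(scale) >= 3

for q in questions:
    substantive = [o for o in q["options"] if o not in ESCAPE_OPTIONS]
    assert scale_is_balanced(substantive)
```

The flawed original scale ("Extremely satisfied, mostly satisfied, neutral, not at all satisfied") has four substantive options and no "mostly dissatisfied," so it would fail the same balance check.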

"We find that planners, generally speaking, are not research experts," says Ira Kerns, managing director of MeetingMetrics in New York City, which offers survey and other measurement services to clients. That's not a criticism, he quickly adds, it's reality and unrealistic to expect otherwise. But planners who don't hire experts like MeetingMetrics or Red Bank, N.J.-based Exhibit Surveys often end up creating their own paper surveys to hand out on-site, or they use free or low-cost web tools, like or In those cases, Kerns says, planners' lack of knowledge of survey science results in "a certain amount of ineffectiveness" of the endeavor.