Survey Design: Testing, Monitoring, and Revising

Michael W. Duttweiler
Assistant Director for Program Development and Accountability
Cornell Cooperative Extension
mwd1@cornell.edu

Monica Hargraves
Manager of Evaluation for Extension and Outreach
Cornell Office for Research on Evaluation
mjh51@cornell.edu

Thus far in our four-phase process we have been looking ahead: anticipating specific information needs, teasing out the types of inquiry that would address those needs, and applying design principles to craft and present queries according to best survey practices. Before you step back to gaze upon your amazing creation, remember to heed the prescribed but often shortchanged step of pretesting the survey.

Pretesting: Many authors suggest using two types of pretest: one in which participants know they are pretesting an instrument and one in which they do not. The former is an interactive process in which participants can share interpretations and suggestions with the researcher; beyond clarifying individual questions, it yields insights on ease of completion, interest level, question sequence, and the like. In the second type, participants complete the survey exactly as it will be implemented in your actual evaluation, and the sample should resemble your actual sample as closely as possible. Careful review of the information generated will tell you whether you are on track to gather the information needed to address your evaluation questions. Narins (1999) provides practical hints for pretesting; DeMaio et al. (1998) provide a more formal introduction.

Monitoring: Especially for a large-scale survey, a surprising amount of information may be available during survey implementation. In web surveys, for example, respondents often will zing an e-mail to survey contacts expressing frustration or satisfaction with the survey instrument and/or offering additional information. The latter, in particular, can reveal additional questions that might have been useful or indicate that existing questions miss the mark. It may also be appropriate to include a “debriefing” question in the instrument itself, such as “Do you have any comments about your experience with this survey?” It can be challenging to balance respondent observations against what you know to be sound instrument design that generates the information you need. Narins (1999) said it well:

Remember that your participants are the experts when it comes to understanding your questions. But, you are the ultimate authority. There are times when suggestions made by participants are either impractical or run contrary to the rules of sound methodology. Keep the balance in mind.
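
On the quantitative side of monitoring, even partial responses gathered mid-implementation can be revealing; many web survey platforms can export in-progress data. The minimal Python sketch below shows one way to spot where respondents abandon an instrument. Everything in it is illustrative rather than prescribed by this post: it assumes a hypothetical export named responses.csv whose columns follow the survey’s question order, with blank cells for unanswered items.

    import csv
    from collections import Counter

    # Tally the last question each respondent answered. Assumes the CSV's
    # columns appear in the same left-to-right order as the survey's questions.
    drop_off = Counter()
    with open("responses.csv", newline="") as f:
        reader = csv.DictReader(f)
        questions = reader.fieldnames or []
        for row in reader:
            answered = [q for q in questions if (row.get(q) or "").strip()]
            drop_off[answered[-1] if answered else "(no answers)"] += 1

    # A question where many respondents stop, or the question immediately
    # after it, is a natural candidate for review.
    for question, count in drop_off.most_common():
        print(f"{count:4d} respondents stopped after: {question}")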

Review and Modify: The real proof of your design comes with assessing how useful the resulting information is for addressing your evaluation questions. Response patterns to individual questions can suggest needed improvements: poor response to open-ended questions, frequent “don’t know” responses, multiple write-in comments on scaled questions, and incomplete forms. Of course, the bottom-line assessment will be whether the data generated allow you to address your evaluation questions with confidence.
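
As a concrete illustration of reading those response patterns, a short script can flag items with heavy nonresponse or frequent “don’t know” answers once fielding closes. This sketch makes the same illustrative assumptions as the one above (a hypothetical responses.csv export); the “don’t know” labels and the 20% flag level are placeholders to adjust for your own instrument.

    import csv

    # Labels that count as "don't know" -- adjust to your instrument's wording.
    DONT_KNOW = {"don't know", "dk", "not sure"}
    THRESHOLD = 0.20  # arbitrary flag level, for illustration only

    with open("responses.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    for question in (rows[0].keys() if rows else []):
        answers = [(row.get(question) or "").strip().lower() for row in rows]
        blank_rate = sum(1 for a in answers if not a) / len(rows)
        dk_rate = sum(1 for a in answers if a in DONT_KNOW) / len(rows)
        if blank_rate > THRESHOLD or dk_rate > THRESHOLD:
            print(f"Review '{question}': {blank_rate:.0%} blank, {dk_rate:.0%} don't know")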

Summary: These four posts aimed to promote a comprehensive view of survey instrument design, grounded in clear evaluation purposes and information needs, application of established survey design principles, pretesting, monitoring, and revision. Perhaps the one essential ingredient left unstated throughout is humility. It’s unlikely that anyone who has done extensive survey work has avoided the experience of a carefully crafted instrument occasionally missing the mark. The approach we outlined here helps the evaluator anticipate needs and likely responses and establishes a pattern of continual improvement. What approaches have worked best for you?

Sources:

Narins, P. (1999). Get Better Info from All Your Surveys: 13 Important Tips for Pretesting. SPSS, Inc. http://www.uoguelph.ca/htm/MJResearch/ResearchProcess/PretestingTips.htm

DeMaio, T. J., Rothgeb, J., & Hess, J. (1998). Improving Survey Quality Through Pretesting. Washington, DC: U.S. Bureau of the Census. Accessed September 19, 2012. http://www.census.gov/srd/papers/pdf/sm98-03.pdf