Flexible Systematic Approaches Build Evaluation Capacity for Program Staff

By Celeste Carmichael, Program Development and Accountability Specialist, Cornell Cooperative Extension Administration

“Systematic approaches with flexibility built in to meet local needs”—that is how I would describe ideal program development resources for Extension programs.  Most of our Extension Educators are busy with field responsibilities.  To support implementation of best practices, resources need to be applicable to broad goals and easy to find, use, and adapt.

For Cornell Cooperative Extension (CCE), Qualtrics has proven to be a systematic yet flexible resource for supporting needs assessments and program evaluations.  There are other options for survey development, but Qualtrics is supported at Cornell for faculty, staff, students, and CCE educators.  We have also found Qualtrics to be a good match for any job, from very simple to highly complex surveys, and it provides substantive data protection for respondents through secure servers.  Another feature that makes Qualtrics very attractive is the ability to create a library of sample Cooperative Extension evaluation forms and questions to help Extension Educators get started with survey development.

Staff have reported that, because of time limitations, evaluation measures are sometimes developed in haste just before a face-to-face event.  Questions created in a hurry might not reflect the intended program outcomes, and the resulting responses may be less useful than they could have been.  Staff also report that survey development can stall over simple details that feel overwhelming when a survey must be built on short notice.  Challenges noted include:

  • Getting the right survey look and feel
  • Developing questions and question order
  • Pilot testing questions
  • Understanding the overall evaluation questions for the program

To give the more common programs a leg up on building evaluation forms, draft surveys that ask questions connected to how programs reach statewide outcomes are being developed and shared in the Qualtrics Cornell Cooperative Extension survey library.  The draft surveys have a Cooperative Extension header and footer, appropriate question logic for typical programs, questions and blocks of questions that have been piloted, and questions related to behavioral aspirations and outcomes.  Surveys from the library can be saved into a user’s personal library and adapted as needed.  Additionally, individual survey questions can be found in the question bank library.

On using the libraries:


Qualtrics users will note that “Library” is a tab in the Qualtrics menu where surveys can be saved into a user’s personal account and adapted.  The data collected belong to a user’s personal account, not to the library.  A benefit of Qualtrics is its online documentation about using the features, including libraries.
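For educators who prefer to script around their account, the same surveys are also reachable through the Qualtrics v3 REST API; the point-and-click Library workflow above needs no code at all.  Below is a minimal sketch, assuming you have generated an API token and looked up your datacenter ID under Account Settings (both values are placeholders):

    # Minimal sketch: list the surveys in your personal Qualtrics account
    # (copies saved from the CCE library appear here once adapted).
    import requests

    DATACENTER = "yourdatacenter"      # placeholder; see Account Settings > Qualtrics IDs
    API_TOKEN = "your-api-token-here"  # placeholder; generated under Account Settings

    BASE = f"https://{DATACENTER}.qualtrics.com/API/v3"
    HEADERS = {"X-API-TOKEN": API_TOKEN}

    resp = requests.get(f"{BASE}/surveys", headers=HEADERS)
    resp.raise_for_status()
    for survey in resp.json()["result"]["elements"]:
        print(survey["id"], survey["name"])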

Similar options for a systematic approach exist beyond Qualtrics, of course.  The idea is simple—provide a starting point that gives all staff a baseline set of questions for collecting data around programs.  When the starting point is adaptable, it builds capacity for the program practitioner to grow into the evaluator, adapting the questions to meet local needs.  Where Qualtrics or another survey tool is not available, a virtual folder of adaptable documents can help local educators who are doing similar types of programs build around common program outcomes and indicators.
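As one illustration of that virtual-folder idea, a shared question bank could be kept as a plain CSV file that educators filter by statewide outcome.  The file name, column names, and outcome tag below are hypothetical examples, not an existing CCE resource:

    # Minimal sketch of a shared question bank kept as a CSV file.
    # question_bank.csv columns (assumed): outcome, indicator, question_text
    import csv

    with open("question_bank.csv", newline="") as f:
        bank = list(csv.DictReader(f))

    # Pull every piloted question tied to one statewide outcome tag.
    nutrition_items = [row["question_text"] for row in bank
                       if row["outcome"] == "healthy-eating"]
    for q in nutrition_items:
        print("-", q)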

The POWER of Sticky Notes in Evaluation

by Zena Edwards, Washington State University Extension

 

I recently completed an 18-month Western Evaluation Capacity Training with thirty of my Extension colleagues from across the western United States.  Issues with response rates to evaluations and surveys were a recurring theme during our culminating presentation session.

For the past few years I have been studying and reading about Robert Cialdini’s Six Persuasion Principles.  In the book “Yes! 50 Scientifically Proven Ways to Be Persuasive,” Cialdini and his co-authors describe several low-cost, low-effort ways to increase the likelihood that program participants will respond to our evaluation requests.  The one that fascinates me most is the innovative use of a common, inexpensive technology already available in every Extension staff person’s tool kit: the humble sticky note.

In 2005 Garner published research on how using the simple technology of a sticky note can dramatically increase response rates for paper surveys.  He conducted four studies to investigate the effect of attaching a sticky note to survey packets on the likelihood of completing surveys.

The first study compared response rates among: a) a typical printed cover letter sent with the 5-page survey, b) the identical letter with the handwritten message “Please take a few minutes to complete this for us. Thank you!” in the upper right-hand corner, and c) the printed cover letter with the handwritten message on a yellow 3 x 3 sticky note affixed to the upper right-hand corner.  The printed cover letter alone had a response rate of 36%; this increased to 48% when the handwritten note was added.  But using the sticky note more than doubled the response rate, to 75%!

The second study was designed to ask whether the sticky note in itself is enough to increase response rates.  When the results were tallied, cover letters with the handwritten request on a sticky note again had the highest return rate (69%), while there was no significant difference between a blank sticky note (43%) and the typical cover letter alone.

The third study found that more participants responded when they received the cover letter with the written message on an affixed sticky note (64%) than when they received just the letter (42%).  They also returned surveys sooner, and more of them answered the open-ended questions, with greater completeness and detail.  Interestingly, the response to the sticky note message may be subconscious; none of the respondents who returned follow-up surveys mentioned the sticky note message as a reason for responding.

In the fourth study, individuals received either a shorter 5-page questionnaire or a 24-page survey on the same topic with 150 items asking more open-ended and detailed questions.  For the shorter survey, there was a significant difference between no sticky note (33% response rate) and either sticky note condition, but no difference between a standard sticky note request (70%) and a personalized sticky note request using the individual’s first name (77%).

For the long survey, people were more likely to respond if they received a personalized sticky note (67%) compared to the standard sticky note request (40%). There was only a 14% response rate when no sticky note request was attached to the cover letter.
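Garner reports these contrasts as statistically significant, though his group sizes are not repeated in this summary.  As a back-of-the-envelope check, a two-proportion z-test shows how decisive the first study’s gap would be even at a modest sample size; the 150 recipients per condition below is an illustrative assumption, not Garner’s actual n:

    # Check whether two response rates differ, using an assumed group size.
    from statsmodels.stats.proportion import proportions_ztest

    n = 150                                    # assumed recipients per condition
    returned = [int(0.36 * n), int(0.75 * n)]  # plain cover letter vs. sticky note
    stat, pvalue = proportions_ztest(returned, [n, n])
    print(f"z = {stat:.2f}, p = {pvalue:.4f}")  # well under .05 at this sample size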

What can we take away from this research?  Technology is an essential and important tool, but it cannot replace “high touch” if we want to improve response rates.  The small cost of some added effort can dramatically increase our evaluation response rates.  Some things to consider if you want to take advantage of this low-cost technology available at your fingertips:

  • Just adding a sticky note to evaluations can slightly increase response rates, although not significantly.
  • Adding a request to complete the survey on a sticky note increases response rates further.
  • Writing the same message directly on the evaluation is not likely to improve response rates.
  • Adding a sticky note request on a program evaluation can increase response rates, increase quality and quantity of response to open-ended questions and decrease response time.
  • A personalized sticky note message can increase response to more onerous surveys or evaluation requests.
  • The extra effort of personalization may not be needed for typical surveys or requests.
  • Conducting an online survey?  Consider mailing or distributing a postcard with a pre-printed sticky note using a handwriting font.
  • Consider using an electronic sticky note when sending email evaluation requests (a sketch of one way to build such a note follows this list).
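For that last idea, here is a minimal sketch of an “electronic sticky note” in an emailed evaluation request, using Python’s standard email and smtplib modules.  The addresses, SMTP host, and survey URL are placeholders, and the yellow, handwriting-styled block is simply inline-styled HTML standing in for a note:

    import smtplib
    from email.message import EmailMessage

    msg = EmailMessage()
    msg["Subject"] = "One quick favor after today's workshop"
    msg["From"] = "educator@example.org"        # placeholder sender
    msg["To"] = "participant@example.org"       # placeholder recipient
    msg.set_content("Please take a few minutes to complete our evaluation: "
                    "https://example.org/survey")  # plain-text fallback

    # The inline-styled div stands in for the sticky note; 'Comic Sans MS'
    # is a widely available stand-in for a handwriting font.
    msg.add_alternative("""\
    <html><body>
      <div style="background:#fff475; width:220px; padding:16px;
                  font-family:'Comic Sans MS', cursive;
                  box-shadow:2px 2px 6px rgba(0,0,0,.3);">
        Please take a few minutes to complete this for us. Thank you!
      </div>
      <p><a href="https://example.org/survey">Start the evaluation</a></p>
    </body></html>
    """, subtype="html")

    with smtplib.SMTP("smtp.example.org") as server:  # placeholder SMTP host
        server.send_message(msg)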

 

This article, The POWER of Sticky Notes in Evaluation, was originally published Tuesday, March 26, 2013, on the Evaluation Community of Practice blog, a part of eXtension.   This work is licensed under a Creative Commons Attribution 3.0 Unported License.

 

Using Optical Mark Recognition Software to Expedite Data Entry

by Teresa McCoy, University of Maryland Extension

The last thing I want to do, or subject anybody else to, is manual data entry.  It’s time consuming, requires a lot of attention to detail, and is just not the way to get others interested in the field of evaluation as a career.  I almost feel apologetic when I have to hand over a box of surveys to undergraduate students and tell them their job is to enter all of the data.

Of course, with the Internet and survey software, we can often bypass the paper route and go to electronic surveys.  But not always.  Extension still has audiences that require the traditional route of paper, pen, envelopes, and stamps.  That’s why I have found the Gravic, Inc. product, Remark®, to be quite a handy tool.  If you have audiences that aren’t reachable by electronic surveys, it’s well worth your time to check out this type of technology.  I will get into more details below, but the bottom line is that you can use paper-based surveys (developed in MS Word) that require only a quick pass through a scanner, and your data are ready to check and clean.
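As a sketch of that “check and clean” step: Remark® can export recognized data in common formats such as CSV, and from there a few lines of pandas go a long way.  The file name and column names below are my assumptions for illustration, not Remark® output conventions:

    import pandas as pd

    df = pd.read_csv("workshop_eval_export.csv")  # hypothetical export file

    # Flag forms the reader could not resolve -- blank or multiple marks
    # often come through as missing values, so review those rows by hand.
    unresolved = df[df["q1_satisfaction"].isna()]
    print(f"{len(unresolved)} of {len(df)} forms need manual review")

    # Quick frequency check on a 1-5 rating item before analysis.
    print(df["q1_satisfaction"].value_counts().sort_index())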

Any word processing software (or even Excel) can be used to create survey/evaluation instruments whose answers are read from fill-in-the-bubble marks.  These bubbles are like the ones you might have seen and filled in on standardized test forms.  However, your document retains a pleasing look without the machine-readable-form appearance.  There are certain parameters in instrument design that need to be adhered to, which I will discuss in more detail later in this post.
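Remark®’s recognition engine is proprietary, but the underlying idea of optical mark recognition is simple enough to sketch: threshold the scanned page and call a bubble “filled” when enough of its pixels are dark.  The following is an illustrative toy in Python with OpenCV, not Remark®’s actual method; the image file and bubble coordinates are hypothetical:

    import cv2

    scan = cv2.imread("scanned_form.png", cv2.IMREAD_GRAYSCALE)
    # Invert so dark pencil marks become white (nonzero) pixels.
    _, binary = cv2.threshold(scan, 128, 255, cv2.THRESH_BINARY_INV)

    # (x, y) centers of one question's answer bubbles, known from the form layout.
    bubble_centers = [(100, 300), (160, 300), (220, 300), (280, 300)]
    RADIUS = 10
    FILL_THRESHOLD = 0.5  # fraction of dark pixels that counts as "filled"

    for choice, (x, y) in zip("ABCD", bubble_centers):
        region = binary[y - RADIUS:y + RADIUS, x - RADIUS:x + RADIUS]
        fill = cv2.countNonZero(region) / region.size
        if fill > FILL_THRESHOLD:
            print(f"Marked answer: {choice}")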

Gravic, Inc. provides special bubble fonts that can be downloaded and used for free.  The primary direct cost to get started is in licensing the software.  A single-user license is listed at $995; a three-pack license is listed at $2,535.  You will need an automatic document feed (ADF) scanner as well, and there are plenty of good scanners around the $400 price mark.  So, for a one-person setup, you can be ready to go for an investment of around $1,400.

A couple of items are important to consider if you have to buy a scanner.  The price points for ADF scanners are structured around pages per minute (PPM), the speed at which the scanner feeds pages through.  I use an Epson scanner that is rated at 25 PPM, can duplex, and has a 75-page loader.  That is plenty of speed for the projects I am involved in.  An important note: Remark® requires a TWAIN-compliant scanner.  I just returned a Fujitsu ix500 scanner because it did not support TWAIN (lesson: do not assume).  If your scanner is not TWAIN compliant, you can still scan your images into a PDF and read them from that file.  That is not my preference, however, because I don’t want the extra step involved.  Remark® doesn’t recommend any particular scanners, but there is information on the website, as well as from other users in the Remark® Knowledgebase.

There are some indirect costs that you need to consider if you want to invest in this technology.  There is a learning curve associated with any new software, of course.  With Remark®, there is also an investment in learning how to design instruments/surveys that scan well.  For example, the white space between the bubble marks is important so that the machine can get a good, clean read.  Lines, shading, tables, and other types of formatting that you may want to employ to help the user experience may cause trouble in the scanning process.  Read the manual carefully to avoid mistakes in your form design.  A nice feature that Remark® offers is having your instrument reviewed by one of their design experts.  I have taken advantage of this service, and they offer a quick review turnaround.  In addition, I am willing to share surveys that I have designed for Remark®.

As with any other technology, this software does not replace thoughtful evaluation planning, survey design, and data analysis.  However, if you have audiences that still require paper-and-pen surveys/evaluations, this software is a great option for you to consider.  After all, who wants to do manual data entry?  Not me.