Online Focus Groups

By Sarah Baughman, eXtension

Evaluation & Research Leader

 

Focus groups can be a great way to gather research or evaluation data, especially when it is beneficial to have participants interact and hear the opinions or views of others. Traditionally, focus groups are conducted face to face, with a moderator in a room with a small number of participants.  New technologies make it possible to conduct focus groups online.  Online focus groups typically take place in a virtual meeting space where the moderator and participants are all in different physical locations.


As part of the eXtension State and Local Value Enhancement project, we conducted a number of online focus groups over the last year.  Our goal was to emulate a traditional in-person focus group as much as possible.  We used the Adobe Connect system in conjunction with a conference phone; there are many other platforms available that could work as well.  Some systems are free if your institution does not subscribe to an online conferencing service.

Here are some benefits, tips and pitfalls we learned during our process:

Benefits:

  • Cost effective
  • Modeling of technology use (a key value of our organization)
  • Able to reach a broad range of participants across the country
  • Able to give participants a choice in times/dates
  • Recordings and transcripts are available immediately for analysis
  • Participants have the option of speaking or typing responses

Tips:

  • Use the technology to its fullest by incorporating “chat pods” and “polls” to keep the participants engaged
  • Follow the usual best practices for focus groups (see Focus Groups: A Practical Guide for Applied Research, Krueger & Casey, 2009)
  • Have a moderator, a note taker, and one extra person for additional support as needed
  • Preparation is key!  Practice with the entire team ahead of time.
  • Carefully consider whether to use VoIP or a phone line.  We found a phone line to be less problematic, but it can increase costs.

Pitfalls:

  • Technology and bandwidth issues.  While the evaluation team can practice with the technology ahead of time, not all participants will be able to.
  • Moderating can be a bit more challenging when you can’t see body language.
  • It can be easier for participants to lose “focus” in web-based sessions.

I would love to hear about your experiences conducting online focus groups.  What has worked well for you?  What have been some challenges?  What questions do you have about using this technique?  Where do you see opportunities for incorporating online focus groups in your evaluation practice?

 

 

This article, Online Focus Groups, was originally published Friday, April 5, 2013, on the Evaluation Community of Practice blog, a part of eXtension.   This work is licensed under a Creative Commons Attribution 3.0 Unported License.


The POWER of Sticky Notes in Evaluation

by Zena Edwards

Washington State University Extension

 

I recently completed an 18-month Western Evaluation Capacity Training with thirty of my Extension colleagues from across the western United States.  Issues with response rates to evaluations and surveys seemed to be a recurring theme during our culminating presentation session.

For the past few years I have been studying and reading about Robert Cialdini’s Six Persuasion Principles.  In the book “Yes! 50 Scientifically Proven Ways to Be Persuasive,” Cialdini and his co-authors describe several low-cost, low-effort ways that can be used to increase the likelihood that program participants will respond to our evaluation requests.  The one that fascinates me the most is the innovative use of a common, inexpensive technology already available in every Extension staff person’s tool kit: the humble sticky note.

In 2005 Garner published research on how using the simple technology of a sticky note can dramatically increase response rates for paper surveys.  He conducted four studies to investigate the effect of attaching a sticky note to survey packets on the likelihood of completing surveys.

The first study compared response rates between: a) a typical printed cover letter sent with the 5-page survey, b) the identical letter with the handwritten message “Please take a few minutes to complete this for us. Thank you!” in the upper right-hand corner, and c) the printed cover letter with the handwritten message on a yellow 3 x 3 sticky note affixed to the upper right-hand corner.  The printed cover letter alone had a response rate of 36%; this increased to 48% when the handwritten note was added. But using the sticky note more than doubled the response rate, to 75%!

The second study was designed to ask the question “Is the sticky note in itself enough to increase response rates?” When the results were tallied, there was a 69% return rate for cover letters with the handwritten request on a sticky note. There was no significant difference between a blank sticky note (43%) and the typical cover letter.

The third study found that more participants responded when receiving the cover letter with the sticky note and written message affixed (64%) compared to just the letter (42%).  They also returned surveys sooner, and more of them answered the open-ended questions, with greater completeness and detail.  Interestingly, the response to the sticky note message could be subconscious; none of the respondents who returned follow-up surveys mentioned the sticky note message as a reason for their response.

In the fourth study, individuals received either the shorter 5-page questionnaire or a 24-page survey on the same topic with 150 items asking more open-ended and detailed questions.  For the shorter survey, there was a significant difference between no sticky note (33% response rate) and either sticky note condition, but no difference between a standard sticky note request (70%) and a personalized sticky note request using the individual’s first name (77%).

For the long survey, people were more likely to respond if they received a personalized sticky note (67%) compared to the standard sticky note request (40%). There was only a 14% response rate when no sticky note request was attached to the cover letter.
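Several of these comparisons are reported as significant or not significant.  If you want to run the same kind of check on response rates from your own evaluations, the sketch below shows one way to do it with a chi-square test on a 2 x 2 table.  The sample sizes are hypothetical placeholders (Garner’s article reports the actual group sizes), so treat this as an illustration, not a re-analysis of his studies.

```python
# Hypothetical example: is the difference between two response rates statistically
# significant? The sample sizes below are placeholders, NOT Garner's actual data.
from scipy.stats import chi2_contingency

def compare_response_rates(returned_a, total_a, returned_b, total_b):
    """Chi-square test on a 2 x 2 table of returned vs. not-returned surveys."""
    table = [
        [returned_a, total_a - returned_a],  # condition A: returned, not returned
        [returned_b, total_b - returned_b],  # condition B: returned, not returned
    ]
    chi2, p_value, dof, expected = chi2_contingency(table)
    return p_value

# e.g., 36 of 100 returned (plain cover letter) vs. 75 of 100 (sticky note added)
p = compare_response_rates(36, 100, 75, 100)
print(f"p-value: {p:.4f}")  # a small p-value suggests the gap is unlikely to be chance alone
```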

What can we take away from this research?  Technology is an essential and important tool, but it cannot replace “high touch” if we want to improve response rates. The small cost of some added effort can dramatically increase our evaluation response rates. Here are some things to consider if you want to take advantage of this low-cost technology available at your fingertips:

  • Just adding a sticky note to evaluations can slightly increase response rates, although not significantly.
  • Adding a request to complete the survey on a sticky note increases response rates further.
  • Writing the same message directly on the evaluation is not likely to improve response rates.
  • Adding a sticky note request on a program evaluation can increase response rates, increase quality and quantity of response to open-ended questions and decrease response time.
  • A personalized sticky note message can increase response to more onerous surveys or evaluation requests.
  • The extra effort of personalization may not be needed for typical surveys or requests.
  • Conducting an online survey?  Consider mailing or distributing a postcard with a pre-printed sticky note using a handwriting font.
  • Consider using an electronic sticky note when sending email evaluation requests.

 

This article, The POWER of Sticky Notes in Evaluation, was originally published Tuesday, March 26, 2013, on the Evaluation Community of Practice blog, a part of eXtension.   This work is licensed under a Creative Commons Attribution 3.0 Unported License.

 

Using Optical Mark Recognition Software to Expedite Data Entry

by Teresa McCoy, University of Maryland Extension

The last thing I want to do, or subject anybody else to, is manual data entry.  It’s time consuming, requires a lot of attention to detail, and is just not the way to get others interested in the field of evaluation as a career.  I almost feel apologetic when I have to hand over a box of surveys to undergraduate students and tell them their job is to enter all of this data.

Of course, with the Internet and survey software, we can often bypass the paper route and go to electronic surveys.  But not always.  Extension still has audiences that require the traditional route of paper, pen, envelopes, and stamps.  That’s why I have found the Gravic, Inc. product, Remark®, to be quite a handy tool.  If you have audiences that aren’t reachable by electronic surveys, it’s well worth your time to check out this type of technology.  I will get into more details, but the bottom line is that you can use paper-based surveys (developed in MS Word) that require only a quick pass through a scanner, and your data is ready to check and clean.
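To give a concrete sense of that “check and clean” step, here is a minimal sketch of the kinds of checks it might involve once the scanned responses have been exported to a data file.  The file name, column names, and 1-to-5 rating scale are assumptions for illustration, not part of Remark® itself.

```python
# Hypothetical sketch: basic checks on scanned survey data exported to a CSV file.
# The file name, column names, and 1-5 rating scale are illustrative assumptions.
import pandas as pd

df = pd.read_csv("scanned_responses.csv")  # exported data file (path is illustrative)

# How many forms were scanned, and how many blanks are there per question?
print(f"{len(df)} forms scanned")
print(df.isna().sum())

# Flag values outside the expected 1-5 scale (often stray or double-filled bubbles)
rating_cols = [c for c in df.columns if c.startswith("Q")]
needs_review = df[rating_cols].apply(lambda col: col.notna() & ~col.between(1, 5))
print(df[needs_review.any(axis=1)])  # rows to verify against the paper forms
```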

Any word processing software (or even Excel) can be used to create survey/evaluation instruments whose answers are read from fill-in-the-bubble marks.  These bubbles are like the ones you may have seen and filled in on standardized test forms.  However, your document retains a pleasing look without the machine-readable form appearance.  There are certain parameters in the design of your instruments that need to be adhered to, which I will discuss in more detail later in this post.

Gravic, Inc. does provide special bubble fonts that can be downloaded (free) to use.  The primary direct cost to get started is in licensing the software.  A single-user license is listed at $995; a three-pack license is listed at $2,535.  You will need an automatic document feed (ADF) scanner as well, and there are plenty of good scanners around the $400 price point.  So, for a one-person setup, you can be ready to go for an investment of around $1,400.

A couple of items are important to consider if you have to buy a scanner.  The price points for ADF scanners are structured around pages per minute (PPM), the speed at which the scanner feeds through the pages.  I use an Epson scanner that is rated at 25 PPM, can duplex, and has a 75-page loader.  That is plenty of speed for the projects I am involved in.  An important note is that Remark® requires a TWAIN-compliant scanner.  I just returned a Fujitsu ix500 scanner because it did not support TWAIN (lesson: do not assume).  If your scanner is not TWAIN compliant, you can still scan your images into a PDF and read them from that file.  That is not my preference, however, because I don’t want that extra step involved.  Remark® doesn’t recommend any particular scanners, but there is information on the website, as well as from other users in the Remark® Knowledgebase.

There are some indirect costs that you need to consider if you want to invest in this technology.  There is a learning curve associated with any new software, of course.  With Remark®, there is also an investment in learning how to design instruments/surveys that scan well.  For example, the white space between the bubble marks is important so that the machine can get a good, clean read.  Lines, shading, tables, and other types of formatting that you may want to employ to help the user experience may cause trouble in the scanning process.  Read the manual carefully to avoid mistakes in your form design.  A nice feature that Gravic offers is having your instrument reviewed by one of their design experts.  I have taken advantage of this service, and they offer a quick review turnaround.  In addition, I am willing to share surveys that I have designed for Remark®.

As with any other technology, this software does not replace thoughtful evaluation planning, survey design, and data analysis.  However, if you have audiences that still require paper-and-pen surveys/evaluations, this software is a great option for you to consider.  After all, who wants to do manual data entry?  Not me.