Worthy and Effective Public Value Narratives

By Scott Chazdon, University of Minnesota Extension

In 2014, my evaluation colleagues and I began gathering stories about the impact Extension programs have on individuals and communities. Based initially on the Most Significant Change method (Dart & Davies, 2003), the project aimed to promote ongoing dialogue about Extension programming and help staff and stakeholders explore the changes that occur because of it.

Methodological Underpinnings

The Most Significant Change methodology is a participatory, story-based approach to evaluate Extension’s impact on participants and the public. We piloted the project across programming in the central region of Minnesota—15 counties that include the Twin Cities metropolitan area and surrounding suburban and rural counties.

The project uses a dialogical process; each submitted story is reviewed against a rubric. The project drew on Brinkerhoff’s Success Case Method (2002) to show evidence of program impact through rich and verifiable descriptions. This method does not replace other evaluation efforts, but stories can be powerful communication tools, especially when combined with other evaluation methods.

The Rubric

As a result of the central region project, we strengthened the rubric to ensure that public value is deeply ingrained in impact narratives. The rubric is based on our learning from the project as well as my own work on Evaluation for Public Value (Chazdon & Paine, 2014).

The elements of the new rubric are:

  1. Story demonstrates behavior changes that resulted from Extension programming. A strong narrative must incorporate evidence that the program has achieved its intended behavioral outcomes.
  2. Story demonstrates the trust and respect Extension has established with key audiences. Extension has built trust through long-standing relationships with key stakeholders. These aspects of programs are often overlooked and need to be incorporated into impact narratives.
  3. Story demonstrates Extension programs, staff, and volunteers meeting the needs of underrepresented populations. Part of the public value of a program is determining the audiences that most need the programming. These should be audiences that cannot otherwise receive the content through private sources.
  4. Story demonstrates Extension adapting to meet the changing needs of its key audiences. Public value also resides in staying current with traditional Extension audiences (farmers, youth, and conservation professionals) by addressing changing needs. This can include changing content due to new economic, environmental, or political contexts.
  5. Story demonstrates ways that Extension leverages organizations or partnerships to expand the delivery of research and education beyond initial program participants. Public value resides in the way Extension leverages its partnerships and collaborations to reach beyond its direct participants.
  6. Story demonstrates ways that Extension programming led to positive social, economic, environmental, cultural, health, or civic effects for public-serving organizations or communities. Public value resides in the “so what?”—the positive things that happen in families, organizations, and communities that can be attributed, at least in part, to Extension education. It is challenging to quantify these types of impacts, but systematic qualitative methods, such as Ripple Effects Mapping, can be very useful for documenting these effects.

These six aspects of public value are easy to teach and provide a useful framework for thinking about the public value of Extension education.

Moving Forward

As evaluators move into public value narratives, we must tread carefully with the communications staff in our organizations. Typically, they write impact stories, and they do it well! But they may not employ evaluative frameworks, such as this rubric, in doing so. To distinguish our work, we describe it as public value “narratives” rather than “stories.”

Moving forward, we continue to develop tools and training resources to support the writing of impact narratives. We have developed the following quick guide to composing a narrative:

  1. What was the presenting issue?
  2. Who was the target audience, and why?
  3. Why Extension? Credible information, research-based, trusted resource?
  4. What changes in behavior or action occurred as a result of the program? Include evaluation evidence.
  5. What were the broader impacts? Evidence of spillover, leveraging, ripples, return on investment, benefit-cost analysis?

We are also working to train new Extension educators in this narrative writing process. We hope to have a public value narrative contest as part of our annual professional development conference and use narratives for our reporting to the national land grant impacts database.

I am happy to share more information on our process and can be reached at schazdon@umn.edu.

Key References:

Brinkerhoff, R. O. (2002). The success case method. San Francisco: Berrett-Koehler.

Chazdon, S. A., & Paine, N. (2014). Evaluating for public value: Clarifying the relationship between public value and program evaluation. Journal of Human Sciences and Extension, 2(2), 100-119. Retrieved from http://media.wix.com/ugd/c8fe6e_8b2458db408640e580cfbeb5f8c339ca.pdf

Dart, J., & Davies, R. (2003). A dialogical, story-based evaluation tool: The most significant change technique. American Journal of Evaluation, 24(2), 137-155.

Franz, N. (2013). Improving Extension programs: Putting public value stories and statements to work. Journal of Extension, 51(3). Retrieved from http://www.joe.org/joe/2013june/tt1.php

Kalambokidis, L. (2011). Spreading the word about Extension’s public value. Journal of Extension, 49(2). Retrieved from http://www.joe.org/joe/2011april/a1.php.

Knowledge Sharing Toolkit. (2014). Most Significant Change. Retrieved from http://www.kstoolkit.org/Most+Significant+Change

Flexible Systematic Approaches Build Evaluation Capacity for Program Staff

By Celeste Carmichael, Program Development and Accountability Specialist, Cornell Cooperative Extension Administration

“Systematic approaches with flexibility built in to meet local needs”—that is how I would describe ideal program development resources for Extension programs. Most of our Extension Educators are busy with field responsibilities. To assist with implementation of best practices, resources need to be applicable to broad goals and easy to find, use, and adapt.

For Cornell Cooperative Extension (CCE), Qualtrics has proven to be a systematic yet flexible resource for supporting needs assessments and program evaluations. There are other options for survey development, but Qualtrics is supported at Cornell for faculty, staff, students, and CCE educators. We have also found Qualtrics to be a good match for everything from very simple to highly complex surveys, and it provides substantial data protection for respondents through secure servers. Another feature that makes Qualtrics very attractive is the ability to create a library of sample Cooperative Extension evaluation forms and questions to help Extension Educators get started with survey development.

Staff have reported that, because of time limitations, evaluation measures are sometimes developed in haste just prior to a face-to-face event. Questions written in a hurry might not reflect the intended program outcomes, and the resulting responses may not be as useful as they could have been. Staff also report that survey development can stall over simple details that feel overwhelming when a survey must be built in short order. Challenges noted include:

  • Getting the right survey look and feel
  • Developing questions and question order
  • Pilot testing questions
  • Understanding the overall evaluation questions for the program

To give common programs a leg up on building evaluation forms, draft surveys with questions connected to how programs reach statewide outcomes are being developed and shared in the Qualtrics Cornell Cooperative Extension survey library. The draft surveys include a Cooperative Extension header and footer, question logic appropriate for typical programs, questions and blocks of questions that have been piloted, and questions related to behavioral aspirations and outcomes. Surveys from the library can be saved into a user’s personal library and adapted as needed. Additionally, individual survey questions can be found in the question bank library.

On using the libraries:


Qualtrics users will note that “Library” is a tab in the Qualtrics menu where surveys can be saved into a user’s personal account and adapted. The data collected belong to the user’s personal account, not to the library. Another benefit of Qualtrics is its online documentation on using these features, including libraries.

Similar options for a systematic approach exist beyond Qualtrics, of course. The idea is simple: provide a starting point that gives all staff a baseline set of questions for collecting data around their programs. When that starting point is adaptable, it builds capacity for the program practitioner to grow into the evaluator, adapting the questions to meet local needs. Where Qualtrics or another survey tool is not available, a virtual folder of adaptable documents can help local educators who are doing similar types of programs build around common program outcomes and indicators.
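To make the idea concrete, here is a minimal sketch of what such a shared, adaptable question bank might look like when kept as a plain structured file rather than in a survey platform. The outcome areas, question wording, and helper function below are hypothetical illustrations, not CCE's actual library:

```python
# Minimal sketch of a shared question bank kept as structured text.
# Outcome areas, wording, and IDs are illustrative only, not CCE's actual library.
import json

QUESTION_BANK = {
    "behavior_change": [
        {"id": "b1", "type": "likert_5",
         "text": "As a result of this program, I plan to change at least one practice."},
        {"id": "b2", "type": "open",
         "text": "Which practice(s) do you plan to change, and why?"},
    ],
    "program_quality": [
        {"id": "q1", "type": "likert_5",
         "text": "The material presented was relevant to my needs."},
    ],
    "demographics": [
        {"id": "d1", "type": "open",
         "text": "What is your county of residence?"},
    ],
}

def draft_survey(outcome_areas, bank=QUESTION_BANK):
    """Assemble a draft instrument from the shared bank; local educators
    then adapt wording and add locally specific questions."""
    return [q for area in outcome_areas for q in bank.get(area, [])]

if __name__ == "__main__":
    # A local educator starts from the piloted baseline and adapts it locally.
    draft = draft_survey(["behavior_change", "program_quality"])
    print(json.dumps(draft, indent=2))
```

Because the bank is plain text, it can live in a shared virtual folder, be reviewed the same way the library surveys are, and still leave every local educator free to adapt the piloted baseline to local needs.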

No shoes? No shirt? No problem!

By Karen Ballard, Professor, Program Evaluation, University of Arkansas & NAEPSDP President-Elect

WHAT? Be a part of the FREE virtual Program Evaluation Summer School, July 21st – 24th, 2015.

No travel funds?  Lots of questions? We have you covered.

The National Association of Extension Program and Staff Development Professionals (NAEPSDP) and the PSD/Southern Region Program Leadership Network are co-sponsoring this four-day webinar series. The free live interactive sessions will consider some of Extension’s big issues . . .

Want to know how to produce webinars with wow?

Want to consider what the future may hold for Extension?

Want to know where to even start with program evaluation?

Want to know how to understand what really matters with social media?

Pull up a chair . . . in your office or on the beach . . . Register and join us next week.

You can register for one or all of the educational sessions, Tuesday-Friday, July 21st – July 24th.

For more information related to topics and speakers, see the detailed program descriptions and registration links below,  or visit https://naepsdp.tamu.edu/ to register.

 


Program Schedule

 

Tuesday, July 21st

Session Title:  Oh, What a Tangled Web…inar We Weave!

Presenters: Mary Poling and Dr. Julie Robinson

Session Description:

This session will look at the intricacies and continuous development of best practices for webinars and blended courses based on user feedback, instructor experiences, and evaluation results.

Participants will learn:

  • best practices for hosting a webinar.
  • best practices for conducting a webinar.
  • best practices for delivering a blended course.

Registration Link:  https://uaex.zoom.us/webinar/register/e0f3ccfc80a2c5c0d746f627e8486654

 

Wednesday, July 22nd

Session Title: The Art and Science of Environmental Scanning: Staying Real During Rapid Change

Presenters: Dr. Nancy Franz and Dr. Karen Ballard

Session Description:

This session examines trends and disruptive technologies that currently exist and/or are on the horizon for Extension. To plan responsively in this environment, Extension workers must anticipate these new developments. This session will engage participants in exploring strategies and methods Extension may need to adopt to ensure relevance and support from stakeholders. Participants will be invited to join the discussion to stimulate actions supporting the future of Extension.

Registration Link:  https://uaex.zoom.us/webinar/register/dc80b86017278299cde7dc3c8da9331e

 

Thursday, July 23rd

Session Title:  When Is a Program Ready for Replication and Rigorous Evaluation?

Presenters: Dr. Donna J. Peterson and Dr. Laura H. Downey

Session Description:

This session will explain the Systematic Screening and Assessment Method (SSA; Leviton, Khan, & Dawkins, 2010) and how it can be applied to Extension programs.  SSA includes environmental scanning methodology as well as evaluability assessment.  Participants will:

  • Learn the step-by-step process of conducting an environmental scan and evaluability assessment
  • Understand criteria used in an evaluability assessment
  • Be asked to apply the evaluability assessment method to a program of their own

Registration Link:  https://uaex.zoom.us/webinar/register/2c11f52eaef207f2d746f627e8486654

 

Friday, July 24th

Session Title:  Evaluation of Social Media Platforms for Extension Outreach and Education

Presenter: Amy Cole

Session Description:

This session will address whether, and which, social media tools may assist with effective Extension outreach and education for target audiences. Participants will learn what research shows about audience demographics for key social media sites and the implications for Extension educators. Strategies and successful current practices from multiple organizations will be shared to help participants identify effective social media methods that can be replicated.

Registration Link:  https://uaex.zoom.us/webinar/register/f2ab575c6aab73f47510d14dfea9e911