AEA2014: To Join or Not to Join AEA

By Pennie Crinion, Director of Program Planning and Evaluation, University of Illinois Extension

Ever find yourself wondering whether you should renew an Extension professional association membership or join another one? As an administrator for most of my career, I've watched three new national professional associations for Extension emerge, bringing the total to seven, and I've felt the pressure of deciding how many to join.

Then, when I assumed my current position, which included leadership for evaluating programs, my predecessor impressed upon me the need to join the American Evaluation Association (AEA). So I registered for the Summer Evaluation Institute held in Atlanta but didn't formally commit to AEA membership and national conference participation until 2013.

This year I found the conference theme, "Visionary Evaluation for a Sustainable, Equitable Future," particularly interesting in light of concerns about protecting the environment, an issue with global impact. Bob Willard, a leading expert on quantifying and promoting the business value of corporate sustainability strategies and a core faculty member of the International Society of Sustainability Professionals, gave the opening presentation.

His efforts to engage the business community in proactively avoiding risks and capturing opportunities through smart environmental, social, and governance strategies were insightful and reassured me that corporations are increasingly recognizing their place at the intersection of global economic, environmental, and equity issues. As Bob shared the business context for corporate social and environmental responsibility, he stressed the importance of standards and benchmarks in linking the corporate world to the field of evaluation. He highlighted standards that encourage organizations to create positive environmental, social, and economic value so that we have the possibility of sustaining a global economy, society, and ecosystem. I left convinced that the world-renowned evaluation experts and AEA members in attendance would rise to the opportunity he described.

As always, I also appreciated the conference's poster session and the hundreds of choices offered across 15 concurrent session segments, supported by 53 evaluation topical interest groups including the Extension Education Evaluation group. Camaraderie with Extension colleagues, reasonable registration fees, and opportunities to visit with others in the evaluation field are other great features.

So you may be asking: what other benefits would you reap by joining AEA? Here's a list for your consideration.

  • Coffee Break demonstrations: 20-minute webinars designed to introduce audience members to new tools, techniques, and strategies in the field of evaluation.
  • Professional Development eStudy courses: three- or six-hour webinars led by evaluation's top presenters in multiple 90-minute sessions, allowing for in-depth exploration of hot topics and questions from the audience.
  • AEA's active listserv, EVALTALK, with more than 2,000 subscribers from around the world who welcome questions and discussion on any topic related to evaluation.
  • AEA365, a blog dedicated to highlighting Hot Tips, Cool Tricks, Rad Resources, and Lessons Learned, with the goal of a daily post from and for evaluators around the globe.
  • Printed or online copies of the American Journal of Evaluation and New Directions for Evaluation.
  • Job posting opportunities.

So visit www.eval.org and explore AEA membership.

AEA2014: Internal v. External Evaluator Perspectives

By Brigitte Scott, Evaluation and Research Specialist for the Military Families Learning Network

Greetings! I’d like to use this blog post to introduce myself as the new Southern Region rep for the AEA Extension Education Evaluation Topical Interest Group (TIG) Board, and to share some thoughts from #eval14 as well.

I have been working with the Military Families Learning Network (MFLN) as the evaluation and research lead for almost a year now. Outside of my “intro” evaluation work during my post-doc, this is my first professional evaluation position. My graduate degrees are in curriculum and instruction, and I have worked extensively with qualitative research methodologies. As an educator and a researcher, I am hard-wired for an ethic of care in my evaluation work, and I have a very high tolerance for complexity and “messiness.” All of these things greatly inform my work with MFLN.

I am an internal evaluator, and as such, I’m a part of the MFLN leadership team, have some supervisory responsibilities, and have a heavy hand in processes across the network. Evaluation permeates all that I do within the MFLN. And while I have often been referred to as the “one-(wo)man evaluation pop-up-shop” for the MFLN, my goal is to support evaluation and evaluative thinking as nested activities within the daily processes of all the individuals who are a part of our learning network. I know I speak for my MFLN colleagues when I say we are each committed to our work, to the communities we serve, and to the MFLN mission. We want to do the best we can do as professionals because we believe in our work and we believe in the value of Cooperative Extension.

With all this in mind, one notable impression #eval14 made on me relates to the different perspectives internal and external evaluators bring to our work, and what this means for evaluation in a Cooperative Extension context. Although I'm aware that perhaps the majority of evaluators work from an external perspective, this was the first time I had a firm sense of what the external evaluation perspective looks and sounds like "on the ground." I heard a lot of terms like "treatment," "randomized control," and "intervention." Of course, these aren't new terms for me in the context of education research or evaluation, and certainly these terms and their methodologies and paradigms have their place in evaluation work. But the thought of using these terms (and their accompanying methodologies) in my own work felt . . . jarring.

I know that not all Extension evaluators are internal to the programs they evaluate. But I'm thinking that as Extension professionals, we all have a certain level of "buy-in" and belief in the Cooperative Extension System, what it stands for, and what we do collectively for and within our communities. I would venture to say that as we work within our communities, we have a certain involvement that transcends, even complicates, our evaluation work. Though this external "rhetoric" prompted a certain type of reaction in me, it also reminded me of how susceptible I can be to (positive) bias, as an internal evaluator, toward work that I believe in and am invested in intellectually and programmatically. It was a strange new entrée for me into reflexive thinking on my own evaluation practice. But there it was.

And I wonder, from all of you: What do you think? How does your commitment to the mission and values of Extension "flavor" your evaluation work? How does your disciplinary background, and even your own professional "positioning," impact your methodologies? Your findings? Your evaluation reports?

It was great meeting some of you at the Extension Education Evaluation (EEE) TIG meeting, and as I understand, it won’t be long before I’ll be helping to review EEE submissions for #eval15. I look forward to getting to know you all and the work that you do not only through our AEA TIG, but also in our collective work as evaluators within Cooperative Extension. Feel free to connect at any time, and if you are attending NAEPSDP (the National Association of Extension Program and Staff Development Professionals) in December in San Antonio, please look for me! We’ll grab a coffee and wax reflexive…!


Until then,

Brigitte Scott

brigit2@vt.edu

@4ed_eval


AEA2014: Evaluation Ideas that are Ready to Retire

By Teresa McCoy, Assistant Director of Evaluation and Assessment, University of Maryland Extension

Did you ever have a pair of old shoes or jeans that you just could not bear to part with—no matter how tattered or worn out? I know that I have and bet you have, too. At the 2014 American Evaluation Association annual conference in Denver, I attended a session given by Michael Quinn Patton entitled, “Lack-of-vision evaluation ideas that should be retired to realize visionary evaluation for a sustainable, equitable future” where he presented his suggestions for evaluation ideas that we should discard (like those old shoes or jeans).

If you have heard Michael speak, you know he generates thought-provoking engagement with his audience, and this session title certainly sparked my interest. I wasn't disappointed. In the new edition of Qualitative Research & Evaluation Methods: Integrating Theory and Practice (4th edition, due out in November and coming in at over 800 pages), he discusses the 10 outdated evaluation ideas and approaches that he thinks should be retired, including such classics as anecdotal evidence, the gold standard, best practices, and site visits. You will have to buy the book to see all of his explanations of why these practices are outdated.

After presenting his ideas, Michael asked the audience for further nominations. There were lots of suggestions! I had my own: cost-benefit. I think that is a term borrowed from the business world that does not translate well to the not-for-profit or government sectors. For example, how do I calculate the costs of not knowing how to prepare healthy meals for children versus the benefits? Is there a way we can calculate the costs of a bridge, such as the Chesapeake Bay Bridge in Maryland, versus the benefits? We know the costs to build the Bay Bridge and to maintain it each year, but what about the benefits? When I drive over the Bay Bridge, I am amazed at the beauty of the Chesapeake Bay. I benefit by being able to see my Extension colleagues on the Shore within one to two hours. However, what is that benefit actually worth?

With costs and benefits, the question has to be raised: costs and benefits to whom? The State of Maryland benefits from the tolls I pay each time I cross the bridge. Yet the State also incurs costs in highway and bridge maintenance, air pollution, and disruptions to the ecology of the Bay because I drive across to the Eastern Shore.

Another nomination from an evaluation expert in the room was the idea of logic models. This person suggested we move away from the term "logic model" in favor of "program map." I agree. The best way to clear a room of Extension people is to say, "I'm here from the Evaluation Department to teach you about logic models." I often start trainings with that line and always get a laugh. In my own practice, I have moved away from teaching logic models to teaching program theory and program maps. I advise people to forget about the logic model form and use whatever tool works best for them to figure out what outcomes their program is designed to accomplish. When I showed a group an image of a tree (roots, trunk, limbs, leaves) as a logic model, one woman said, "OK, I understand that now. I wish someone would have told me this earlier."

I would like to hear from other Extension evaluators what old ideas and approaches you think we should leave behind like those old shoes and jeans. Perhaps this discussion could help us move our practices and our profession ahead in the next few years.