Newsletter: May 2013 | Vol 13, Issue 5
Learning More About Theory and Practice Is Always Exciting

Dear Colleagues,
In mid-April, I attended the annual meeting of the Eastern Evaluation Research Society (EERS). As president of AEA, I was scheduled to give a talk, but, being a Westerner, I wanted to experience the whole conference. What a delight! I attended many wonderful sessions and realized, again, how lucky I am to have found evaluation. The issues raised in sessions never fail to excite me. In fact, I struggle to restrain myself from asking too many questions because I'm just so curious about the choices evaluators make in their practice.
One of the highlights of the conference was the first annual Eleanor Chelimsky Forum, funded by The Robert Wood Johnson Foundation (RWJF) and organized by Laura Leviton, senior advisor on evaluation at RWJF and a past president of AEA. As many of you know, Chelimsky directed the Program Evaluation and Methodology Division of the U.S. General Accounting Office (GAO) from 1980 to 1994 as the GAO began to advise Congress on the effectiveness of government programs and policies. Her impact on the visibility of evaluation and the field itself was immeasurable. She served as president of the Evaluation Research Society in 1980 and the American Evaluation Association in 1994. She was also elected a Fellow of the National Academy of Public Administration.
The forum sprang from Chelimsky's speech at EERS in 2012, where she cited the frequent tension between "principle" and "context" and argued that we need more discussion about the links between evaluation theory and practice to allow both "to stretch, to bend a little, and to grow."
Michael Quinn Patton and Tom Schwandt inaugurated the forum with their remarks on the links between theory and practice, and Laura Leviton provided commentary. Schwandt started by defining "practice" as "artful doing" or "experimenting with possible solutions." He also said evaluation theory is our "repertoire of concepts, insights, analytic frames, and generalizations to be used as heuristics to 'think with.'"
Theory gives us "aids to the evaluation imagination." Theory helps us figure out what to do and how to proceed. Without this theoretical repertoire, Schwandt noted, our professional practice is rudderless. Consider what concepts and frames guide your own work. In my book with Blaine Worthen and Jim Sanders, Program Evaluation: Alternative Approaches and Practical Guidelines, we describe many historic and current evaluation approaches, arguing that evaluators should be aware of the different approaches or theories and draw on different ones as the context and evaluation purpose permit.
Patton elaborated on the link between theory and practice. Research on expertise shows that expertise derives from experimentation with different heuristics and from practice. Much practice. Experts, in the end, are hardly aware of their heuristics because their practice has sensitized them to situations and characteristics that a novice would not even notice. Experts have developed "astute and sophisticated situation recognition." Patton proposed several sensitizing concepts that should influence evaluators' choices, including context or situation contingencies, user contingencies, and the nature and purpose of the evaluation.
As we talk about evaluation practice at Evaluation 2013, Oct. 14-19, in Washington, D.C., think about how you make use of theory in your own practice. What sensitizing concepts are prominent? Which became prominent in your most recent evaluation? Why are these concepts the most prominent? I learned much from Schwandt and Patton's dialogue and Leviton's comments. You can learn more from their postings under "Conference Forms and Handouts" on the EERS website. RWJF is planning further dissemination of the forum proceedings and continued discussion. I look forward to continued learning and dialogue!
Sincerely,
Jody
Jody Fitzpatrick
AEA 2013 President
Thank You from Executive Director Susan Kistler
AEA colleagues,
I wanted to take this opportunity to thank you for the chance to serve you for almost 15 years as AEA's executive director. I officially stepped down on May 20 and am excited now to be working behind the scenes with Denise Roosendaal, your interim executive director. I'll be phasing out through June 30; after that, I will continue in a very limited staff capacity working on the program for the fall conference.
AEA has changed quite a bit since I started working with the most amazing volunteer leaders. They had the patience to mentor me as I learned and matured with the association. When I took over membership management, AEA had fewer than 3,000 members. Today, we are approaching 8,000. Financial challenges have turned into fiscal solvency and the ability to undertake new projects and partnerships. We've moved increasingly toward becoming a diverse, international community of practice, striving to serve the members and the field throughout the year. During the past 15 years, the association has built on the strong foundations of AEA's journals and annual conference to offer a Summer Evaluation Institute, year-round eStudy and Coffee Break Webinars, the aea365 Tip-a-Day blog, a public eLibrary, and expanded topical interest group (TIG) programming. I can't wait to see what the next 15 years bring.
There are too many people to thank; if I were to name them all, this article would grow very long very quickly. Coupled with a true love for evaluation, the volunteers who lead AEA have kept me at my post this long. In particular, I have benefited from knowing every AEA president with whom I have had the privilege of serving. I learned how to be more patient from one and more inquisitive from another; a third improved my writing and a fourth my organizational skills; a fifth consoled me when my daughter was ill; a sixth reminded me to see the value and joy in the challenges of life. Each one taught me about the intricacies of the field and about what it means to serve selflessly. I am proud and honored to count them, and so many others from the membership, among my professional colleagues and personal friends.
I'm not going away. Before working with AEA, I worked as a teacher, trainer and consultant, and I look forward to returning to those passions. I'll be continuing on with the association as a volunteer and will be working on a range of projects with some of the colleagues I have had the opportunity to meet through my time as AEA's executive director.
Thank you all, from my head and from my heart. I look forward to seeing many of you at Evaluation 2013 this fall in Washington, D.C.
New AEA Office Opened May 20!
On May 20, AEA phones were ringing at its new office in Washington, D.C.! At the same time, AEA's new management team assumed their new roles. Now, when you call AEA, you will hear new voices, but they will have the same commitment to serving the association. Please help welcome your new management team and bear with them during the transition. If you find new people at the other end of the line and they don't immediately have the answer you need, they will find it and get back to you ASAP!
Additionally, you may notice a slightly different look to eval.org. AEA moved the website to a more robust platform that will allow AEA to add new functionality as it moves forward.
To learn more about the management company transition and what it means for AEA, read the Transition FAQ.
Sincerely,
Susan Kistler
Executive Director Emeritus
B. Denise Roosendaal
Interim Executive Director
AEA's Values - Walking the Talk with Jim Rugh
Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.
AEA's Values Statement
The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.
i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.
ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.
iii. We value a global and international evaluation community and understanding of evaluation practices.
iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.
v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.
vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.

Hello, fellow AEA members. I am Jim Rugh, currently serving as the coordinator of the EvalPartners global collaborative initiative, which promotes the role of Voluntary Organizations for Professional Evaluation (VOPEs), i.e., formal associations, societies, or informal networks of evaluators, in enhancing the capacities of evaluators and strengthening the enabling environment for evaluation in countries around the world.
I've been involved in international development for 49 years and have considered myself a professional evaluator for 33 of those years. My engagements within AEA since 1986 have included leadership in the International and Cross-Cultural TIG, service on the previous Nominations and Elections Committee and now, again, on the Nominations Working Group. I also led the International Listening Project for AEA and served for four years as the AEA representative to the International Organization for Cooperation in Evaluation (IOCE), the "United Nations of evaluation associations."
As I look back at AEA's values statement from the perspective of my role in promoting the evaluation profession globally, I can certainly see how it clarifies just what evaluators and evaluation associations should be, not only for AEA members but for anyone who would call themselves a professional evaluator, whether they conduct evaluations themselves or are responsible for planning evaluations and contracting evaluators.
There has been an interesting discussion recently on the XCEval and other listservs with reference to Patricia Rogers and Jane Davidson's GenuineEvaluation blog, which has a good list of evaluation-specific methodologies: criteria that distinguish evaluation from other forms of research. These criteria have a lot to do with the values that guide (or should guide) those who call for evaluations and those who conduct them. That discussion reinforces the relevance of many of the values in AEA's values statement.
In my engagements with VOPEs around the world, I am frequently reminded of the value of such statements in guiding the philosophies, paradigms, strategies, methodologies, and operating practices of all of us who consider ourselves evaluators.
In their lead article in the Voluntary Organizations for Professional Evaluation book just published by UNICEF and EvalPartners, Natalia Kosheleva and Marco Segone cite a 1915 essay by Flexner in which he set forth the criteria for defining a profession, referring then to social work. "Professional groups have more and more tended to view themselves as organs contrived for the achievement of social ends rather than as bodies formed to stand together for the assertion of the rights or the protection of [their own] interests and principles."
Surely such values as those articulated by AEA help guide us as evaluators, and, collectively, as members of VOPEs.
Policy Watch - Evaluation in GPRAMA 2010
From Cheryl Oros, Consultant to the Evaluation Policy Task Force

In 2010, Congress passed the GPRA Modernization Act (GPRAMA 2010), updating the 1993 Government Performance and Results Act (GPRA). The new act stemmed, apparently, from congressional desire for greater use of data in federal decision making. You may wonder how GPRAMA 2010 will be interpreted and implemented. In this column, I point to several sources of information and discussion on this and other issues.
First, the White House Office of Management and Budget (OMB) is disseminating its interpretation of the updated law. One point of emphasis for OMB is to move from a focus on evaluations of individual programs to an understanding of the results of larger strategies. OMB is also building an interactive federal evaluation community, the Evaluation Working Group, with a distribution list of more than 300 federal staff members. Federal executive branch employees can register for the Evaluation Working Group list. Through the group's site, OMB provides evidence and evaluation guidance based on its interpretation of GPRAMA 2010; group activities, such as meetings with speakers; an OMB blog; a discussion board; and other resources and links about evaluation and performance measurement.
Second, the Government Accountability Office (GAO) sponsors Federal Evaluators (FedEval), an informal network of federal evaluation officers with a listserv of about 900 members. The FedEval website provides information about federal evaluation activities, plus links to evaluation-related resources. To join the listserv, federal employees should send their contact information to ShipmanS@gao.gov.
Third, a paper from the IBM Center for the Business of Government, The New Federal Performance System: Implementing the GPRA Modernization Act, describes the new requirements of the law. Based on a December 2012 joint forum with the National Academy of Public Administration, the report also offers recommendations on how agencies can better integrate evaluation with the performance measurement required by GPRA. The report recommends that agencies help create a better understanding of evaluation; incorporate evaluation expertise into performance discussions; link performance goals to evaluation outcome variables; and link evaluations to funding.
Fourth, Kathryn Newcomer of George Washington University and Clinton Brass of the Congressional Research Service have a recent paper, Reconceiving 'Performance Management': Situating Performance Measurement within Program Evaluation, and Program Evaluation as a Mission-Support Function. Newcomer and Brass argue that — unlike much common practice — evaluation, policy analysis, and performance measurement should be employed as a function that genuinely supports agency mission, with performance measurement situated firmly within the broader field of evaluation.
You can share your thoughts on these papers and resources with the AEA EPTF discussion group by signing up at the EPTF Discussion List.
Face of AEA - Meet Chris Lysy
AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights Chris Lysy.
Name: Chris Lysy
Affiliation: Westat
Degrees: M.A. Sociology, B.A. Criminology and Criminal Justice
Years in the Evaluation Field: 5 years
Joined AEA: 2010
AEA Leadership Includes: Presenter at annual conference and contributor to AEA's newsletter as illustrator
Why do you belong to AEA?
"I belong to AEA because of the personal relationships I have developed with fellow members. Only a few years in and I feel like a part of the community. I originally joined AEA because of aea365 and Susan Kistler. I had a budding interest in social media and new technology but was disappointed by the lack of effort made by another association I was involved with at the time. AEA's approach, which helps spread the expertise of its members, was a breath of fresh air."
Why do you choose to work in the field of evaluation?
"I fell into evaluation after five years in contract research. The field fits my personality. I am good at math, suffer from an insatiable curiosity, and want to help change the world (not just say the words). Resources are often finite and good intentions are almost always insufficient. As an evaluator, I have the potential to help inform decisions and improve projects."
What's the most memorable or meaningful evaluation that you have been a part of?
"My first experience in evaluation was as a data specialist for Smart Start, North Carolina's early childhood initiative. Large quantitative datasets can have an intangible quality, that is until you see the datasets scoured by local programs who know the names of the children the data represent. The position gave me a new perspective on the meaningfulness of data and a newfound respect for quality control. The position also helped me reassess my own value. There was ample opportunity to make a contribution and the results of my work were visible. I could see the importance of my work and that motivated me to step up my game."
What advice would you give to those new to the field?
"Start a blog. AEA has more than 7,000 members. Thousands of those members will present at some point during the year, many at the annual conference. My educated guess is that less than a hundred will blog on a related topic with any level of regularity. This, I believe, is a terrible imbalance. So, start a blog. Find a topic that you love, write honestly, and be yourself. Always aim to write something you know will be of value to someone. If you need help, I will help."
eLearning Update - What Are People Saying About the eStudy Workshops?
From Stephanie Evergreen, AEA's eLearning Initiatives Director
"The eStudy training is a cost-effective way to get just-in-time training in the current resource-constrained environment. Thank you again." — Michele Bonhomme, Washington, D.C., area
As Michele Bonhomme knows, today's economy has meant smaller professional development budgets for many of us, which makes our Professional Development eStudy courses great opportunities for training. Bonhomme was able to stay at home while getting six hours of education on Monitoring & Evaluation Planning from Scott Chaplowe, who is based in Geneva. eStudy courses reduce travel costs related to professional development.
Scott Miller was in the same eStudy on Monitoring & Evaluation Planning. He commented, "It was a good balance for someone like me, who has experience with these concepts and applications but no formal training in evaluation per se. Materials will be especially helpful for explaining the evaluation process to clients."
eStudy courses offer just the boost you need to deepen your understanding and sharpen your evaluation practice, and course materials remain great references even after a course is over.
Kurt Wilson, who attended Jeff Wasbes' eStudy on Causal Loop Diagrams, wrote, "Great content ... the experience will certainly increase my participation in AEA webinars in the future!"
eStudy opportunities are important to people like Wilson, a graduate student and independent consultant currently living abroad. eStudy courses are a great way to stay connected to advances in the field, even when working under resource constraints.
Like all eStudy presenters, Wasbes assigned a bit of homework to help participants practice and reinforce the concepts introduced during the live webinar. Jeanne Ortega noted its value, saying, "I thought it was an appropriate topic for a three-hour course. I was also glad to have an opportunity to do the homework. The problem was interesting and helped to explain how these diagrams could be usefully applied. I'm looking forward to participating in similar training options."
If you'd like to participate in similar training options, too, check out the latest lineup. AEA is excited to welcome back Kylie Hutchinson in June for training on effective evaluation reporting; note that registration closes June 10. AEA is also happy to announce that Michelle Kobayashi will be expanding her Creating Surveys eStudy to a full six hours in June.
The Coffee Break lineup is another way to get bite-sized professional development every week.
Diversity - Culturally Competent Evaluation Practices and Policies: What's on the Horizon?
From Karen Anderson, AEA's Diversity Coordinator Intern

Just last year, the Cultural Competence Dissemination Working Group (CCWG) sent a letter to Hillary Clinton, then U.S. Secretary of State, asking her to review the AEA Cultural Competence Statement and to consider applying it in State Department offices to encourage culturally competent evaluation at the federal level. Since then, the letter has been passed on to the Office of Planning, Performance and Evaluation (OPPE) for dissemination and use, and the CCWG has continued to make connections to further the goals of the group.
For example, after speaking with AEA member Celeste Richie of the U.S. Department of Labor, the CCWG learned of her interest in guidance on looking for cultural competence in requests for proposals (RFPs) and on the practical application of the statement. This is where the idea for a condensed version of the Cultural Competence Statement was born.
The Competence in Evaluation one-page document was developed by members of the CCWG earlier this year, and it has been reviewed by an advisory group of leaders in culturally competent evaluation. This document will be ready for dissemination before the end of the summer. The one-page document is a condensed version of the Cultural Competence Statement and includes thought-provoking questions at the end. This is a must-have document for evaluators interested in culturally competent evaluation.
Special thanks to George Grob, members of the AEA Evaluation Policy Task Force (EPTF), and the Cultural Competence Statement one-page document reviewers.
Potent Presentations - Announcing Revised Poster Guidelines
From Stephanie Evergreen, Potent Presentations Initiative Coordinator

One of the highlights of the AEA annual conference is the Wednesday night poster session. Hundreds of posters explain new developments in evaluation while presenters are on standby to elaborate. Somewhere in the mix, a panel of judges also wanders through the crowd, assessing the posters and selecting a competition winner.
Even if you aren't in it to win it, a great poster can draw a large crowd and leave attendees better informed about the presenter's work. So what goes into making a great poster?
Check out the revised poster guidelines. Updates include more specific guidance around:
- The design of the poster, such as which fonts to use;
- Advice from past poster competition judges, like where to put your references; and
- Valuable resources for poster development, including video tutorials.
Also, be sure to browse AEA's analysis of some past poster award winners — from AEA and beyond. Read through what makes them rise above the rest, spot the quality in the examples provided, and then apply the best practices to your next poster entry.
Emerging Practices in International Development Evaluation
AEA members Stewart Donaldson, Tarek Azzam, and Ross Conner are editors of a new book, Emerging Practices in International Development Evaluation, published by Information Age Publishing.
From the Publisher's Site:
"The impetus for this volume comes from reflecting on many years of experience, successes, and failures in development evaluation in Asia and Africa and from recent work supported by the Rockefeller Foundation on Rethinking, Reshaping, and Reforming Evaluation. The concepts, frameworks, and ideas presented in this volume are a useful contribution to the ongoing efforts at rethinking, reforming, and reshaping international development evaluation. They come from thought-leaders and practitioners in development, evaluation, research, and academia, who have recognized that development evaluation must evolve if it is to respond to the challenges of the 21st century and play a meaningful role in social and economic transformation. This volume will be of great interest to evaluation scholars, practitioners, and students, particularly to those interested in international development projects, programs, and policies. This book will be appropriate for a wide range of courses, including introduction to evaluation, international development evaluation, program evaluation and policy evaluation as well as evaluation courses in international development, international relations, public policy, public health, human services, sociology, and psychology."
From the Editors:
"This volume challenges the assumptions behind simply using traditional evaluation approaches to evaluate in developing countries," says Donaldson. "It provides in-depth advice and guidance for how to evaluate organizational performance, capacity development, policy influence, networks and partnerships, coalitions, sustainable development, and innovation as part of the effort to improve the quality of international development evaluation practice."
About the Editors:
Stewart I. Donaldson is dean, professor, and director of the Claremont Evaluation Center at Claremont Graduate University. He has published numerous evaluation articles and chapters and 10 books, including The Future of Evaluation in Society: A Tribute to Michael Scriven (forthcoming), Advancing Validity in Outcome Evaluation: Theory and Practice (2011), Social Psychology and Evaluation (2011), What Counts as Credible Evidence in Applied Research and Evaluation Practice? (2008), and Program Theory-Driven Evaluation Science: Strategies and Applications (2007).
Tarek Azzam is assistant professor and associate director of the Claremont Evaluation Center at Claremont Graduate University. His research focuses on the impact of contextual variables on evaluation. He is the co-founding chair of AEA's Research on Evaluation TIG.
Ross F. Conner is professor emeritus in the Department of Planning, Policy, and Design at the University of California, Irvine. He has done evaluation work in the areas of health, education, criminal justice, and leadership. He is a past president of the American Evaluation Association and the International Organisation for Cooperation in Evaluation.
Visit the publisher's site.
2015 Is the International Year of Evaluation
We are pleased to inform you that the United Nations Evaluation Group (UNEG) has decided to join EvalPartners in declaring 2015 the International Year of Evaluation (EvalYear).
The decision was taken by UNEG Heads at the 2013 UNEG Annual General Meeting, after Natalia Kosheleva, EvalPartners co-chair and IOCE president, together with the UNEG Task Force on National Evaluation Capacity Development, presented the initiative.
EvalYear will position evaluation in the policy arena, serving as a catalyst for important conversations and thinking, at the international, regional, national, and sub-national levels, on the role of evaluation in good governance for equitable and sustainable human development.
2015 was identified as a strategic year because EvalYear seeks to mainstream evaluation into the development and implementation of the forthcoming Sustainable Development Goals, as well as other critical locally contextualized goals, at the international and national levels.
For additional information about EvalYear, please contact Marco Segone.
Best regards,
Marco Segone and Natalia Kosheleva, EvalPartners co-chairs
New Member Referrals & Kudos - You Are the Heart and Soul of AEA!
Last January, AEA began asking on its new member application how each person heard about the association. It's no surprise that the most frequent response is a friend or colleague. You, our wonderful members, are the heart and soul of AEA, and we can't thank you enough for spreading the word.
Thank you to those whose actions encouraged others to join AEA in April. The following people were listed explicitly on new member application forms:
Jessica Bizub * Erin Blake * Sorrel Brown * Penny Burge * Brad Cousins * Desiree Crevecoeur-MacPhail * Jerome DeLisle * Jean Eells * Molly Engle * Laura Feldman * Kristi Fuller * Annette Ghee * Jean Haley * Eileen Harwood * Doreen Hauser-Lindstrom * Monica Hunter * Sandra Johansen * Barbara Kinne * Tom Klaus * John Lawrence * Jim Lindstrom * Charles Maddix * Carrie Markovitz * Karen McDonnell * Erin McGuire * Kristin Mmari * Corey Newhouse * Debra Rog * Ruth Sando * Scott Scrivner * Ramesh Singh * Karen Wilkins * Julie Zajac
New Jobs & RFPs from AEA's Career Center
What's new this month in the AEA Online Career Center? The following positions have been added recently:
- Public Health Evaluator at St. David's Foundation (Austin, Texas)
- Mid-Term Evaluation of The Education for Employment Foundation (EFE) Maroc at EFE (Casablanca, Morocco)
- Assistant Director of Assessment and Program Evaluation at ASU's University Office of Evaluation and Education (Tempe, Ariz.)
- Senior Epidemiologist/Biostatistician at City Harvest, Inc. (New York)
- Consultant Professional Competencies at AAAOM (New York)
- Monitoring and Evaluation Associate at FHI 360 (Washington, D.C.)
- Monitoring and Evaluation Manager at Splash (Seattle)
- SLE Consultant at FSG (Seattle, San Francisco, Boston)
- Evaluation Specialist at Van Andel Education Institute (Grand Rapids, Mich.)
- SLE Associate at FSG (Seattle, San Francisco, Boston)
- Senior Research Managers at InterMedia (Washington, D.C.)
- Qualitative Research Associate at InterMedia (Washington, D.C.)
- Director of Evaluation and Learning at First Place for Youth (Oakland, Calif.)
- M&E Senior Manager at National Democratic Institute (Washington, D.C.)
- Senior Program Officer at Bill & Melinda Gates Foundation (Seattle)
Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. According to Google Analytics, the Career Center received approximately 3,750 unique visitors over the last 30 days. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.
About Us
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The American Evaluation Association's mission is to:
- Improve evaluation practices and methods
- Increase evaluation use
- Promote evaluation as a profession and
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only)