Newsletter: April 2013
|Vol 13, Issue 4|
|2013 Conference News|
As spring comes, so do submissions for the AEA fall conference. This year, we have received 1,262 submissions, up 25% from last year! Submissions are now out to reviewers, and we expect some exciting presentations. Thank you to our TIGs and to individuals for this tide of submissions!
Meanwhile, the Presidential Strand and Plenary Program Task Force has been working on identifying speakers and panels for our theme "Evaluation Practice in the Early 21st Century." They have focused on speakers working in a variety of different areas. From education, we'll have John Easton, the Director of the Institute of Education Sciences in the U.S. Department of Education, talking about evaluation in schools, commenting on our current efforts and the future. In the area of environmental evaluation, we'll have Paul Ferraro, an economist from Georgia State who works on environmental evaluation and policy. In the international arena, we have two great panels - one with representatives from Mexico, South Africa, and the U.S. talking about the practice of evaluation in their countries and how their political context influences that practice, and another with representatives from non-governmental organizations including UNICEF, the World Bank, and the Rockefeller Foundation talking about their practices in evaluation.
We also have representatives from other disciplines that conduct evaluation or evaluation-like activities. These include Jackie Copeland-Carson, an anthropologist and Director of African Women's Development-USA, talking about how anthropologists approach evaluation, as well as Paul Decker, president of the Association of Public Policy Analysis and Management (APPAM), and Gary Henry, a well-known member of AEA and APPAM, who will join me in contrasting policy analysis and evaluation. We have developed other panels on evaluation findings and use in foundations, preK education, medicine, and more!
We're hoping the conference will help all of us learn from the big field of people collecting information for judgments and decision-making. By learning how those in different settings and different disciplines work to perform evaluations and achieve use, we will learn more about how to improve our own practice. Through inclusion of all perspectives, we hope to broaden our own.
My thanks go to the marvelous group of evaluators - Katherine Dawes, Leslie Goodyear, George Julnes, Tom Schwandt, and Marco Segone - and to my program co-chairs, Kathy Newcomer and Jonny Morell, for their wonderful ideas and hard work in putting this program together.
And thank you for submitting your own thoughts on our theme! I'm looking forward to learning about the diversity of practice in our organization, from schools and human services to newer wings of evaluation, such as disaster and emergency management and arts and culture.
Can't wait to see you there!
AEA 2013 President
|AEA's Values - Walking the Talk with John Gargani|
Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the Association's values.
AEA's Values Statement
The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.
i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.
ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.
iii. We value a global and international evaluation community and understanding of evaluation practices.
iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.
v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.
vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.
My name is John Gargani. I own an evaluation consulting firm - Gargani + Company - located in Berkeley, California. I'm a member of the AEA board of directors, the Director of the San Francisco Bay Area Evaluators (a local AEA affiliate), and a section editor for the American Journal of Evaluation. I've been working in the field for about 20 years and blogging at EvalBlog.com for about three.
I, like many of us, stumbled upon evaluation by accident, but my decision to become an evaluator was quite deliberate. I remember realizing one day how much evaluation can improve the lives of others and that I should become an evaluator. Since that time, I have greatly enjoyed the evaluations I have conducted and the people with whom I have worked, but what continues to excite me is the good that evaluation can do. To my great pleasure, I have found that most evaluators I meet feel the same way.
The AEA Values Statement articulates this shared perspective. It defines us as a community dedicated to the public good, not individuals with similar training, disciplinary interests, or methodological perspectives. Moreover, it argues that our contribution to the public good depends on the values we promote.
Of course, we need technical skills of various sorts to conduct evaluations and we spend years developing and refining them. These capacities, however, do not define us. Neither are they sufficient to promote the public good. We tread in a messy world where stakeholders clash, politics confuse, and interests - public and private - often diverge. If values do not guide our steps, what should? Our interests? Our politics? Our power?
As a community, we choose to be guided by a nobler purpose. To listen to voices that too often remain unheard. To respond to the people, places, and environments that we seek to benefit. To create change not only with our work but the way we conduct our work.
This is why 20 years on, I am still excited to be an evaluator. And why I am proud to be a part of our community.
|Policy Watch - Evaluation in the President's Budget|
From Cheryl Oros, Consultant to the Evaluation Policy Task Force
Why are evaluation policies important? You may wonder how they can affect your evaluation work, influence government programs, or help ensure societal issues are addressed. For insight into how the Obama Administration is working to integrate evaluation into federal programs, review the recently released President's Budget for Fiscal Year 2014. It includes a section on Performance and Management (pp. 77-114) that addresses evaluation policies.
Of interest in this year's budget document are descriptions of several initiatives that focus on the use of evaluation in making grant funding decisions (see Chapter 7, Delivering a High Performance Government, p. 87). These initiatives were introduced in an Office of Management and Budget (OMB) memo last year, Use of Evidence and Evaluation in the 2014 Budget (May 2012). OMB "encouraged a broad-based set of activities to better integrate evidence and rigorous evaluation in budget, management, and policy decisions, such as adopting more evidence-based structures for grant programs, building evaluation capacity, making better use of data within government agencies, and developing tools to better communicate what works." OMB stated: "Where evidence is strong, we should act on it. Where evidence is suggestive, we should consider it. Where evidence is weak, we should build the knowledge to support better decisions in the future."
In the initiatives described in the FY 2014 document, OMB would provide tiered funding based on a review of not just the description of a project but also its evaluation results. OMB plans to drive improvement strategies and minimize the role of politics in decision making while maximizing the use of evaluation results. It proposes that the highest grant funding be allocated to concepts proven via high-quality evaluations, less funding to proposed projects with less evidence, and the least funding to development grants for new concepts that do not yet have evidence but for which evaluation will be required.
OMB is setting up a clear incentive to conduct high quality evaluations and use the results in program decision making. To buttress this increased reliance on evaluation, OMB wants to eventually develop consistent evaluation quality standards, but provides leeway to agencies in selecting evaluation approaches: "Among the most important analytical tools is program evaluation, which can produce rigorous evidence about program effectiveness. For example, evaluations using experimental or quasi-experimental methods can identify the effects of programs in situations where doing so is difficult using other tools" (see pp. 91-95).
You can share your thoughts on the FY2014 document with the EPTF discussion group by signing up at the EPTF Discussion List.
Go to the Evaluation Policy Task Force Web Page
|Face of AEA - Meet Sara El Choufi|
AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties and interest areas. Join us as we profile a different member each month via a short Question and Answer exchange. This month's profile spotlights Sara El Choufi, who became an active contributor to AEA immediately upon her arrival.
Name, Affiliation: Sara El Choufi, The Global Environment Facility (GEF)
Degrees: BS Biology/Computer Science, MS Environmental Policy and Planning
Years in the Evaluation Field: 2
Joined AEA: October 2012
AEA Leadership Includes: Environmental Program Evaluation TIG, communications lead
Why do you belong to AEA?
"I work on Results-based Management and when my supervisor told me about AEA, the conference, and the study sessions - I was very interested. I went to my first AEA conference not knowing what to expect and I learned that a lot of the work I have done throughout the years was directly tied to evaluation; I just never thought about it as such."
Why do you choose to work in the field of evaluation?
"As a graduate student I did numerous research projects and wrote a number of papers using Contingent Valuation Methods, and somehow it just stuck. I really enjoyed playing around with statistics, but more than that, I enjoyed associating meaning to my numbers and telling the story behind them. I guess what I am trying to say is I stumbled upon the field and I loved it!"
What's the most memorable or meaningful evaluation that you have been a part of - and why?
"One of my earliest introductions to evaluation was a research paper I wrote for a course on Social Change, Development, and the Environment. I studied the effects that biodiversity enclaves (such as natural reserves) have on the development of the rural areas surrounding them. I did this research on the "Al-Shouf Cedar Reserve" in Lebanon - the largest enclave in the country and a highly regarded space for biodiversity conservation. I got the chance to go out in the field, interview people, reconstruct the story of the land and how it relates to the residents of the area. I was able to review the development programs implemented by the reserve management (specifically rural development initiatives) and assess whether or not they were meeting the objectives set out as well as identify what issues they might have faced during implementation. It was exhilarating!"
What advice would you give to those new to the field?
"I learned a great deal in the year I have been at AEA. The most important lesson is be critical of your work as well as others'; ask the questions you dread to answer about your work; and try to read as much as you can about the theory (and practice) of evaluation; you might not always agree with it, but it will add new perspective to your work."
If you know someone who represents The Face of AEA, send recommendations to AEA's Communications Director, Gwen Newman, at email@example.com.
|eLearning Update - Please, Do Not Disturb|
From Stephanie Evergreen, AEA's eLearning Initiatives Director
Removing distractions may be the most difficult hurdle to a great webinar experience. The same advice I give presenters to help them tame distractions also works well for audience members. For presenters and attendees alike, distractions interrupt the flow of the webinar and make it more difficult to regain engagement. Whether presenting or just listening in, here are three things you can do to improve your webinar experience.
- Post a Do Not Disturb sign on your office door. In many workplaces, it is customary for colleagues to walk into each other's offices to ask a question or follow up on something. It is easy to forget about this habit when sitting down to a webinar, but interruptions from coworkers can derail your attention. Tape a quick sign to your door or cubicle entrance to help others recognize that you need silent time.
- Temporarily pause all popups. When you get a new Outlook message or another friend logs into Skype, you usually want to hear about it. But these popup messages are distracting for presenters and attendees alike. Popups on a presenter's screen will fluster the presenter; for attendees, they are a siren song to click out of the webinar and into the inbox. Disable these popups for the duration of the webinar.
- Occupy your hands. Presenters often have their hands full, negotiating their speaking notes and clicking through their visuals. If not, I recommend keeping a pen in the dominant hand. When something you want to say later pops into your head, jot it down. Audience members can also benefit from holding onto something: it keeps the hands occupied and reduces the chances of clicking around the web instead of giving the webinar your full attention.
Together, these tips can help you keep the temptations of your coworkers and your inbox at bay.
|Diversity - Coming to a City Near You|
From Karen Anderson, AEA's Diversity Coordinator Intern
Have you heard the news? AEA is now looking for Graduate Education Diversity Internship (GEDI) Program host sites for the 2013-2014 academic year! As a GEDI alumna and Program champion, I can attest that the experience is mutually beneficial to host sites and interns.
The GEDI Program actively engages Masters and Doctoral level students from groups traditionally underrepresented in the field of evaluation and works to expand this group of students who have extended their research capacities to evaluation. Evaluative thinking concerning underrepresented communities and culturally responsive evaluation is also an integral part of the GEDI Program. Interns are challenged to expand the capacity of their site, as well as the profession, to work in racially, ethnically and culturally diverse settings.
What I enjoyed the most about my GEDI experience:
- Every cohort gets to choose a name (how cool is that?)
- Camaraderie was developed between me and my fellow Evolution cohort members and we became like a family. Following our GEDI experience, we have continued to work on conference presentations and publications
- Amazing program chairs - and having the opportunity to learn the basics of evaluation and culturally responsive evaluation from them
- Wonderful, resourceful GEDI alumni
- Having Dr. Rodney Hopson, the Program's founder, as well as other culturally responsive evaluators to help guide us through the continuous learning process
- Having the opportunity to give back to AEA via service
To learn more about how your organization can become a host site, click here
|Potent Presentations - Plan to be a Bit Informal|
From Stephanie Evergreen, Potent Presentations Initiative Coordinator
Even in high-pressure presentation situations, a dash of informality can go a long way. By "informality" I mean a quick bit of humor or self-disclosure.
Whether or not to insert jokes is largely up to your own comfort level with being funny in front of a crowd. Our group of expert presenters - AEA's Dynamic Dozen - was somewhat split on this issue. Some were naturally funny people, so they planned scripted levity.
Others aren't as comfortable with humor and found that, when they tried, it totally flopped and became an unwanted distraction. So, don't force it if it doesn't come naturally. Instead, just get more personal.
When we break from what appears to be the formal talk track to say things like "This third point here is the most critical" or "my favorite," we pique audience interest and actually boost learning. It's a bit informal, but you can plan ahead to sprinkle in a few personal touches like that.
The research has shown that adding in personal touches can be particularly effective in conference settings where you are trying to persuade a somewhat skeptical audience. The audiences at AEA are friendly, but outside of here you might find that personal touch to be helpful in disarming an audience predisposed to be defensive or tense, like with certain groups of unhappy stakeholders. (See Rowley-Jolivet, E. & Carter-Thomas, S. (2005). The rhetoric of conference presentation introductions: Context, argument and interaction. International Journal of Applied Linguistics, 15(1), 45-70.)
Read more about the advice of our Dynamic Dozen and other research on great presentation delivery.
|Nominations for AEA's 2013 Awards Due Friday, June 7|
Nominations are now being accepted for the seven American Evaluation Association Awards. Please take this opportunity to acknowledge outstanding colleagues and outstanding work. Through identifying those who exemplify the very best in the field, we honor the practitioner and advance the discipline. Aside from the Ingle Award, all awards are open to non-AEA members as a way to recognize contributions to the field. Self-nominations are accepted, but should also be supported by a recommendation from an AEA member.
All nominations must be completed and received in the AEA office by the deadline, Friday, June 7, 2013 in order to be considered.
AEA awards recipients will be recognized at Evaluation 2013, to be held October 14-19 in Washington, DC. Recipients are announced in the American Journal of Evaluation and each winner will receive a complimentary year of membership to AEA.
Learn more online at: http://www.eval.org/aboutus/awards.asp
|2013 Summer Institute - June 2-5 in Atlanta, Georgia|
Mark your calendars for this year's Summer Evaluation Institute, co-sponsored by the Centers for Disease Control and Prevention (CDC) in Atlanta.
Join colleagues for two keynote addresses, five rotations of three-hour training sessions, and two group lunches that foster networking with fellow professionals. Hear from experts who have conducted evaluations in a variety of settings, nationally known authors, those working on the cutting edge, and outstanding trainers.
You choose your topics and customize your experience. For a rundown of sessions, click here.
Join CDC's Kathleen Ethier as she talks about Building and Supporting Evaluation and Evaluation Capacity in Large Organizations and AEA's President-Elect Beverly Parsons as she talks about Evaluation and the Triple Bottom Line as it relates to economic viability, social responsibility and environmental sustainability.
When and where:
June 2-5, Crowne Plaza Atlanta Perimeter at Ravinia Hotel, 4355 Ashford Dunwoody Road
Go to the Summer Institute Page to Learn More
|Adaptive Action: Leveraging Uncertainty in Your Organization|
AEA members Glenda Eoyang and Royce Holladay are authors of a new book, Adaptive Action: Leveraging Uncertainty in Your Organization, published by Stanford University Press.
From the Publisher's Site:
"Rooted in the study of chaos and complexity, Adaptive Action introduces a simple, common sense process that will guide you and your organization into reflective action."
"This elegant method prompts readers to engage with three deceptively simple questions: What? So what? Now what? The first leads to careful observation. The second invites you to thoughtfully consider options and implications. The third ignites effective action. Together, these questions and the tools that support them produce a dynamic and creative dance with uncertainty. The road-tested steps of adaptive action can be used to devise solutions and improve performance across multiple challenges, and they have proven to be scalable from individuals to work groups, from organizations to communities."
"In addition to laying out the adaptive action framework and clear protocols to support it, Glenda H. Eoyang and Royce J. Holladay introduce best practices from exemplary professionals who have used adaptive action to meet personal, professional, and political challenges in leadership, consulting, Alzheimer's treatment, evaluation, education reform, political advocacy, and cultural engagement-readying readers to employ this new toolkit to meet their own goals with a sense of ingenuity and flexibility."
From the Authors:
"Our nation -indeed, our world - is stuck," says Eoyang. "Consider the patterns you see: Climate change threatens the planet. Economic instability blocks prosperity. Our generation cannot afford to retire, yet our skills aren't valued in the new economy. Violence is random and frequent. We lose our children to drugs, bullies and lethargy. What's worse, our twentieth century solutions don't work on our twenty-first century problems. Evaluators try to use outcome measures to understand emergent systems. Managers try to plan and control systemic transformation. Politicians rely on insults instead of inquiry to create public policy. Education, healthcare, and community development agencies continue practices and policies that cost more and do less. Everywhere, complexity and uncertainty frighten and frustrate people, institutions, and communities."
"The good news is, we've known for about a decade that the tools of human systems dynamics could help people see, understand, and influence intractable problems," notes Holladay. "The HSD models and methods described in Adaptive Action have helped individuals and groups make sense of chaos and uncertainty. Research and practice have made our concepts, models, and methods more accessible even as systemic conditions have made the need more urgent. This is why we're so hopeful about our new book - it completes the bridge between overwhelming challenges people face today and the ability to use Adaptive Action to leverage uncertainty and fear into possibility and action."
About the Authors:
Glenda H. Eoyang is the founding Executive Director of the Human Systems Dynamics Institute and author of Facilitating Organization Change: Lessons from Complexity Science.
Royce J. Holladay is the Director of the Network for the Human Systems Dynamics Institute.
For more information about the book, the authors and adaptive action, go to adaptiveaction.org.
Go to the Publisher's Site
|In Memoriam - Barry MacDonald|
From AEA Member Ernie House
Barry MacDonald, one of the most original and influential pioneers of evaluation, died April 16 in Norwich, England. He was 80. MacDonald was among the very first to use evaluative case studies, develop a conception of democratic evaluation, and endorse an ethics for involving study participants. For many years he headed a top evaluation group at the University of East Anglia that conducted several high profile evaluations. His work is well known around the globe. In 2010, I wrote a personal tribute:
Whose friendship, ideas, and democratic ideals have enriched my personal and professional life; whose courage and integrity in the face of political pressures and payoffs have inspired, and whose wit, wordplay, and originality epitomize style and eloquence. My evaluation novel is a tribute to forty years of our friendship.
A charismatic, charming, and (sometimes) controversial personality, he owed much of his influence to his ability to read people. His striking insights about people and politics were unsurpassed. With most scholars, it's easy to anticipate what they will say, but MacDonald's originality was such that you were often surprised by his observations, and sometimes startled. Later, thinking it over, you realized that he might well be correct.
Some influence was exercised through his written works, which colleagues considered far too few. He was a superb writer by any standard. At its best his writing reached a level of eloquence not associated with the academic world. However, much of his influence was exercised in person through long conversations and discussions of projects. He was noted for his sharp wit, which was subtle, understated, and frequently acerbic. Of a late life romance between an elderly pair, he said, "She acts like a reptile that has captured a small mammal." He spent considerable effort creating witticisms to insert into conversations.
He was highly principled about how participants in studies should be treated and how their personal information should be protected. His harshest criticisms were reserved for abuses of power, like bullying or forcing others to do something through the power you had over them. And this ethic played into his principles about how evaluations should be conducted. Things were to be accomplished by persuasion, not force. He valued the autonomy of individuals highly. All in all, he was one of the most brilliant, flamboyant, and unusual characters most of us have encountered.
(For a more personal memoir of the man, contact firstname.lastname@example.org. For his written papers, see the University of East Anglia website. Noted former students, including Helen Simons, Saville Kushner, and Nigel Norris, have carried on his work.)
His surviving family includes two daughters, Tracey and Shelley, and four grandchildren.
|Evaluation Humor - Songs with an Evaluative Bent?|
Every issue, we include a light-hearted feature designed to generate a laugh. Recently, on LinkedIn, Susan Kistler asked for input regarding songs - and dance - with an evaluative bent. And she got quite a response. Take a look at the video below and then visit her aea365 post for more fun ditties!
|Cognitive Bias VideoSong|
If you have an illustration or graphic you'd like to share, feel free to forward it to Newsletter Editor Gwen Newman at email@example.com.
|New Member Referrals & Kudos - You Are the Heart and Soul of AEA!|
Last January, we began asking, as part of the AEA new member application, how each person heard about the Association. It's no surprise that the most frequently offered response is from friends or colleagues. You, our wonderful members, are the heart and soul of AEA and we can't thank you enough for spreading the word.
Thank you to those whose actions encouraged others to join AEA in March. The following people were listed explicitly on new member application forms:
Australasian Evaluation Society * Linda Bol * Capella University Library * Tessie Tzavaras Catsambas * Centers for Disease Control and Prevention * Tom Chapel * Diane D. Chapman * Fanie Cloete * Cindy Collins * Leilani Francisco * George Washington University * Andrea Giron * Maggie Grieve * Debbie Hamm * Doreen Hauser-Lindstrom * Laura G. Hill * Robert Johnson * Randi Korn * Leah Goldstein Moses * Kristina Gorbatenko-Roth * Kathleen D. Kelsey * Bohdanna Kynasevych * Gwen Lee-Thomas * Louisiana Public Health Institute * Chris Lovato * Leslee Martin * M&E Listserv * Julianne Manchester * James McMillan * Massey University New Zealand * Alyeta Meyer * Katina Mortenson * Johan Mouton * Ioana Munteanu * Kathy Newcomer * Clare Nolan * Emma Norland * Rita O'Sullivan * Winn O'Toole * Oregon Program Evaluators Network (OPEN) * Organizational Research Services * Antigoni Papadimitriou * Hallie Preskill * Robert Wood Johnson Foundation * Liliana Rodriguez-Campos * Ruth Saunders * Patrick Shields * Stephanie Shipman * Society for Applied Anthropology * Southern Illinois University Carbondale * Stiles & Associates * University of North Carolina * University of Stellenbosch * Nicolae Toderas * Judah Viola
|New Jobs & RFPs from AEA's Career Center|
What's new this month in the AEA Online Career Center? The following positions have been added recently:
- Nicaragua Youth Leadership Academy Evaluation Consultant at National Democratic Institute (Managua, Nicaragua)
- Online Course Development Evaluator at J Sargeant Reynolds Community College (Goochland, VA, USA)
- Research & Evaluation Associate Fellowship at EnCompass LLC & EDC (Lusaka, Zambia)
- Assistant Director of Clinical Training at Hathaway-Sycamores Child and Family Services (Pasadena, CA, USA)
- Director of Research and Evaluation at New Jersey Community Development Corporation (Paterson, NJ, USA)
- Evaluation and Planning Officer at Lumina Foundation (Indianapolis, IN, USA)
- Research Associate at National Council of Juvenile & Family Court Judges (Reno, NV, USA)
- Health Science Evaluator at Food and Drug Administration (Rockville, MD, USA)
- Health Services Planners/Evaluators (temp) at Contra Costa County (Martinez, CA, USA)
- Rollins Professor and Department Chair at Rollins School of Public Health, Emory University (Atlanta, GA, USA)
Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. According to Google Analytics, the Career Center received approximately 3,750 unique visitors over the last 30 days. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The American Evaluation Association's mission is to:
- Improve evaluation practices and methods
- Increase evaluation use
- Promote evaluation as a profession and
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275