Newsletter: January 2014
Vol. 14, Issue 1
Message from the President: 2014 - A Year of Visionary Evaluation
Greetings, my esteemed AEA colleagues! It is a pleasure to serve as our association's 2014 president. Like many of you, I consider AEA as much a family as a professional affiliation. Like family members, we have diverse and even conflicting perspectives. Yet we have a common bond - in this case, our commitment to evaluation and its service to society.
Each year, the AEA president provides a theme that gives us a lens through which to view our work. In recent years, the themes have included Evaluation and Context (2009, Deb Rog), Evaluation and Quality (2010, Leslie Cooksy), Evaluation and Values (2011, Jennifer Greene), Evaluation in Complex Ecologies: Relationships, Responsibilities, and Relevance (2012, Rodney Hopson), and Evaluation in the Early 21st Century (2013, Jody Fitzpatrick).
The 2014 theme is Visionary Evaluation for a Sustainable, Equitable Future. It builds on these past themes and seeks to propel us into the future. The Call for Proposals for the 2014 conference elaborates on the theme and encourages you to express, through your proposal submissions and conference participation, what visionary evaluation means for you and your work.
The theme urges us all to tap relationships and systems thinking to connect evaluation to a premier challenge of our time — nurturing a diverse and interconnected world with enough for all for generations to come. It calls on us to envision evaluation as fundamental to a sustainable, equitable future. Evaluation needs your visionary integration of sustainable and equitable living, systems thinking, and relationships.
Please join me in using 2014 to take a visionary step beyond our current evaluation approaches toward ones that contribute just a little more to a sustainable, equitable future for all. Mark your calendars now for the AEA 2014 conference in Denver, Oct. 15-18, 2014, with professional development sessions preceding and following the conference.
Now let me turn your attention to the work of the AEA Board. First join me in welcoming our five new AEA board members: Stewart Donaldson as president-elect; Susan Tucker as treasurer; and our three members-at-large, Melvin Hall, Robin Miller, and Donna Podems. I also would like to thank our five members whose terms ended in 2013 — Rodney Hopson (past president); Brian Yates (treasurer); and members-at-large Tina Christie, Jenny Jones, and Victor Kuo — for their many contributions to AEA.
I'm looking forward to our Feb. 7-8 board meeting, at which we will engage in a strategy session with an outside facilitator to review AEA's strategic direction. Now that we have completed the transition to our new association management company (SmithBucklin) and executive director (Denise Roosendahl), we are well positioned for an intense review of our strategic direction and priorities. We will be drawing on a wide range of available data, including recent surveys of Topical Interest Group (TIG) leaders and leaders of our Local Affiliates.
Before closing, let me call your attention to a new item on the AEA website home page — What is Evaluation? It's a blog based on the thinking of a working group led by Michael Patton. They developed this statement as a resource for members. It's not an official statement but rather a stimulus for comments and an exchange of views. Join the conversation about what evaluation is.
Please feel free to email me at president@eval.org with your thoughts about AEA. I'd love to hear from you.
With best wishes and warm regards for 2014,
Beverly Parsons
AEA 2014 President
Important Note
To ensure this newsletter reaches you every month, add info@eval.org to your email contacts!
Call for Proposals for Evaluation 2014 Now Open
AEA invites all who are involved in the field of evaluation to share their best work in evaluation theory or practice at Evaluation 2014, the annual conference of the American Evaluation Association (AEA), held in Denver on Oct. 15-18, 2014.
The conference is divided into topical strands that examine the field from the vantage point of a particular methodology, context, or issue, along with a strand devoted to this year's theme, Visionary Evaluation for a Sustainable, Equitable Future. Presentations may explore the theme or any aspect of evaluation theory, practice, management, or consulting.
Proposal submissions must be received by 11:59 p.m. ET on March 17, 2014.
If you have an eval.org account, please log in to submit a proposal.
AEA Evaluation 2013 Wrap-Up
The 27th Annual Conference of the American Evaluation Association was held in Washington, D.C., Oct. 16-19, 2013, and, by all accounts, it was a huge success! The theme of the conference was "The State of Evaluation Practice in the Early 21st Century."
Conference Theme
AEA's 2013 president, Jody Fitzpatrick, explained the theme as follows: "Evaluation is not only a transdiscipline, but those of us in it come from many different disciplines and evaluate different types of programs in arenas as different as education and transportation. ... many people conducting evaluations are not members of AEA, were not trained in evaluation, and may not even call themselves evaluators. But we are all collecting information to help others make judgments about programs. I would like us to learn more about this 'big tent' of evaluation. Bringing this knowledge - and these people - together will not only make us more inclusive, but can improve the overall quality of evaluation."
In keeping with the theme, the conference boasted evaluators from all disciplines, fields, and experience levels. Forty-three percent of attendees were first-timers, and 11 percent were students. Attendees experienced four days of education sessions, four days of professional development workshops, and multiple networking events, including an Awards Luncheon; Meet the Authors, Poster Exhibition & Reception; Reception & Silent Auction; Birds of a Feather lunch; and informal time during coffee breaks, TIG meetings, and sessions.
Conference Highlights
AEA Evaluation 2013 included 18 Presidential Strand sessions addressing the conference theme and four Presidential Strand plenaries:
- The Breadth of Evaluation Practice Today: Arenas, Disciplines, and Influences, presented by Kathryn Newcomer and Jody Fitzpatrick
- Which Evaluations Should We Believe? Origins of Credibility and Legitimacy in Politicized Environments, presented by Arthur Lupia
- The Practice of Educational Evaluation Today: A Federal Perspective, presented by John Easton
- The State of Evaluation Practice in the Early 21st Century: How Has the Theme of Evaluation 2013 Influenced Our Beliefs?, presented by Len Bickman, Leslie A. Fierro, Rakesh Mohan, Michael Morris, Anne Vo, and Sally L. Bond
The conference also offered two-day, one-day, and half-day professional development workshops. These workshops gave evaluators at all phases of their careers an opportunity to explore a topic or skill set of interest under the tutelage of one or more expert facilitators. View all the professional development workshops.
Conference education opportunities also included more than 800 sessions for evaluators in many different disciplines, across all experience levels. Session formats included: Birds of a Feather gatherings; demonstrations; expert lectures; ignite sessions; panels; paper presentations; poster presentations; roundtables; skill-building workshops; and think tanks.
In addition to the great educational and networking highlights, 36 exhibitors showcased their evaluation-related products and services. The conference's sponsors included Gravic Inc. - Remark Software, SAGE, and Westat. View a full list of exhibitors.
AEA Announces 2013 Award Winners
The American Evaluation Association honored six individuals at its 2013 Awards Luncheon in Washington, D.C. This year's recipients, honored in six categories, are involved with cutting-edge evaluation and research initiatives that have had an impact on citizens around the world. We've spotlighted two award winners thus far and will continue the series in upcoming issues. Today we extend our congratulations to Rebecca Campbell.
Rebecca Campbell, Ph.D., Professor of Psychology and Evaluation Science, Michigan State University, 2013 Outstanding Evaluation Award

For the past 25 years, Rebecca Campbell has been conducting victimology research and evaluation, with an emphasis on violence against women and children. Her work examines how the legal, medical, and mental health systems and rape crisis centers respond to the needs of adult, adolescent, and pediatric victims of sexual assault. Campbell has published more than 80 scientific papers and two books on these topics, and has conducted more than 200 presentations at state, national, and international conferences. Throughout her career, she has received more than $8 million in research and evaluation funding from the National Institute of Mental Health, the Centers for Disease Control and Prevention, and, most recently, the National Institute of Justice.
"I am honored to receive this award for a project that brought together evaluators and practitioners to find ways to build evaluation capacity in local violence against women and sexual assault service organizations," Campbell stated. "We are very proud of our collective efforts to help these organizations develop evidence-based programming."
AEA Values Rewind - Walking the Talk with President-Elect Stewart Donaldson
Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.
AEA's Values Statement
The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.
i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.
ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.
iii. We value a global and international evaluation community and understanding of evaluation practices.
iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.
v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.
vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.
Editor's Note: In July 2012, AEA interviewed Stewart Donaldson for this section of the newsletter. Donaldson is AEA's president-elect for 2014; as such, we are re-running his feature.
I'm Stewart Donaldson, Dean and Professor of Psychology at Claremont Graduate University. I have had a wonderful experience with my colleagues at Claremont developing and implementing Ph.D., master's, certificate, and professional development programs in evaluation.
I am proud I was on the board when AEA's new values statement was developed and adopted. This statement promises to guide our association and members as we face and take on controversial issues and evaluation challenges in the months and years ahead. I am especially pleased that we have made it explicit for future leaders in AEA that they should strive to be responsive to all of our diverse members, and to lead our organization in a transparent and socially responsible manner. In the future, I'm optimistic that AEA will be known for its values of inclusiveness and diversity and for welcoming evaluators from all backgrounds and points of view to engage the key issues of the day.
Another dimension of my service to the evaluation profession has been supervising more than 50 Ph.D. students, teaching hundreds of evaluation master's and certificate students, and providing trainings for thousands of professionals. I have tried to inspire these often enthusiastic participants to reach for the stars, to think deeply about their ethics, social responsibilities, and awareness of cultural diversity and competency, as well as to work hard to develop cutting-edge evaluation knowledge and technical skills. This past year has been particularly rewarding as I have lived my values of inspiring and educating new evaluators from traditionally underrepresented groups in the U.S. through the GEDI program, as well as improving understanding of good evaluation practices in the international evaluation community through my collaborative capacity development projects with the Rockefeller Foundation, UNICEF, UN Women, and other international partners.
Finally, the AEA values stated above have challenged me to think more critically about the evaluations I conduct, as well as the evaluations I supervise for my students. At the end of the day, these values make us aware that it is our responsibility in this profession to ensure that our evaluations are rigorous and of the highest quality possible, taking into account the context. Our work should lead to better evidence-based decision making, help with policy formulation, promote social betterment and justice, and/or improve programs, organizations, communities, and developing societies across the globe. What a wonderful set of values and profession to be part of!
Face of AEA - Meet Dayna Albert
AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights Dayna Albert.
Name: Dayna Albert
Affiliation: Independent Evaluation Consultant
Degrees: B.S. nutritional sciences; M.A. adult education (University of Toronto, Toronto, Canada); credentialed evaluator (CE) through the Canadian Evaluation Society
Years in the Evaluation Field: Eight
Joined AEA: 2008
Why do you belong to AEA?
I belong to AEA for three main reasons: 1) continuing professional education, 2) networking with some great people, and 3) opportunities to volunteer and contribute to the field.
Why do you choose to work in the field of evaluation?
I fell into the field of evaluation almost accidentally. In 2005, I was hired to be the coordinator of the Towards Evidence-Informed Practice program at the Ontario Public Health Association. The program evolved to become the major provider of evaluation capacity building for public health professionals in the province of Ontario. It's so true that you learn what you teach!
I greatly enjoy helping people not only overcome their fear of evaluation but also come to value its benefits. To illustrate this point, I often tell the story of how I got my yellow belt in karate. This is how it goes: I had been a white belt for over a year and kept getting passed over for promotion. I didn't mind so much, but when higher belts stopped "criticizing" me because they said I was doing well enough for a white belt, I had to take action. In karate, you learn to value good criticism (aka evaluation) because that is how you improve your technique. So I approached the Sensei and said, "Sensei, you have to promote me because no one will criticize me anymore." I did get my yellow belt, but the most important lesson I learned was the value of good evaluative feedback.
What's the most memorable or meaningful evaluation that you have been a part of?
My most memorable or meaningful evaluation is yet to come. Instead, let me embarrass myself by describing my most thrilling moment in evaluation. As chair of the Canadian Evaluation Society's 2013 Conference Workshop Committee, I was in frequent touch with Susan Kistler, now AEA executive director emeritus. Susan was extremely generous in sharing her advice and expertise. Later, she invited me to participate on the workshop selection committee for the upcoming AEA conference so that I could further learn from AEA processes. I readily accepted. To be brief, Susan was so impressed with my work on this committee that she asked if I would agree to participate the following year. I'm a bit bashful to admit it, but being praised by Susan Kistler was a huge highlight for me.
What advice would you give to those new to the field?
I would recommend that those new to the field seek out interesting volunteer work. I still enjoy volunteer work and find that it can be more rewarding than paid work. It's a great way to make professional connections and to learn new skills. For example, as the volunteer coordinator of the Evaluation Stories project, I'm now connected to evaluators across the globe. I recently learned to launch and manage a professional blog and to use several new software programs to record, edit, and post videos to YouTube. It's been so much fun!
Policy Watch - Roadmap Update: What's New?
From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)
What is the Roadmap?
Written in 2010, the Evaluation Roadmap for a More Effective Government provides guidance on evaluation policies. In addition to defining evaluation policy, the Roadmap addresses requirements, methods, implementation, ethics, budgets and resources for evaluation that can help improve government. It promotes high-quality evaluations and their use in federal decision-making.
Why an Update?
The EPTF aims to be responsive and to ensure the timeliness and relevance of its guidance. The recent revision was designed to respond to issues raised by AEA members who asked that the Roadmap:
- Better reflect alternative evaluation approaches, especially forms of formative or developmental evaluation
- Expand the original emphasis on the independence of evaluators to better account for the role of internal evaluators seeking to improve government programs.
The revision includes other, generally minor updates.
What's New?
Here are the main additions in the 2013 version:
Page 4 - Agencies should re-examine program relevance and effectiveness over time.
Pages 5, 8, and 9 - Evaluators should safeguard the independence of evaluation while encouraging consultation with, and input from, agency and other stakeholders. The level of independence needed for public accountability is not necessarily the same as that required for formative or developmental evaluation aimed at organizational learning and program improvement.
Page 6 - In the discussion of factors to help determine which methods to use, both the circumstances under which the program is implemented and the available budget have been added as considerations. In addition, the Roadmap now explicitly notes that various approaches can complement each other, giving the example of performance measurement systems that can describe what is happening in programs, while evaluation studies can determine whether these observed changes were due to the program, why they occurred, and whether they were beneficial.
Page 7 - The discussion of resources now notes that, when severe budget constraints exist, funds should be prioritized according to both the greatest needs of the program and the potential of evaluations to provide needed insights.
The material on professional competence has been expanded to call for expertise on the evaluation team in the subject area under examination and to refer explicitly to skill in developing evaluation study designs and writing actionable reports.
Page 8 - A new section has been added to emphasize the importance of tracking and re-examining the use made of evaluation findings over time, including sharing this information with agency leaders and stakeholders.
How Can You Use the Roadmap?
With the implementation of the Government Performance and Results Act (GPRA) Modernization Act of 2010, there may be "reachable moments" as agencies work to improve their performance and evaluation systems. The last Policy Watch described how the National Institutes of Health (NIH) developed guidelines for the evaluation of the Clinical and Translational Science Awards (CTSA) program by drawing, in part, on guidance from the Roadmap. Others have used the Roadmap to illustrate to their agencies what the evaluation field recommends in order to support innovations. If you would like examples of policies and evaluation plans that agencies have developed, contact me at EvaluationPolicy@eval.org.
Please Give Your Feedback
Let us know your reactions to the Roadmap, how you have used it in your organization (e.g., any policies, guidance, or new plans), and what results, if any, can be traced to the changes your organization has made.
Diversity - The Most Wonderful Time of the Year: The Perfect Proposal
From Zachary Grays, AEA Headquarters
What is the perfect proposal? It's that time of year again, when our TIG volunteers eagerly await the opportunity to review your proposals for our annual conference. This year's conference, hosted in Denver, carries the theme Visionary Evaluation for a Sustainable, Equitable Future, and, for the first time, proposals will be reviewed for relevance to the conference theme.
Last year, AEA received 2,013 proposal submissions and accepted no fewer than 950. Without question, curating that volume would be impossible without the unwavering work of our Topical Interest Groups. TIG leaders enlist the many experts within their membership to review proposals and make the tough but thoughtful decisions about which will make the cut for the conference. Reviewers take many elements into consideration during their rigorous review, but they put a special emphasis on including topics that add variety, are diverse in subject background, and will propel the field of evaluation forward.
So what does all this mean, and how does it relate to diversity within the association? Taking center stage this year is evaluators' responsibility to be thoughtful about the good of society and about the impact evaluation practice has on our ever-changing global ecology. As you begin preparing your proposals, I want to turn your attention to the importance of reflecting diversity and inclusiveness in your submissions. It is paramount that the impact of evaluation work always be in the interest of improving communities and creating unity. It is AEA's vision to foster an inclusive, diverse, and international community of practice positioned as a respected source of information for and about the field of evaluation. The AEA annual conference plays host to more than 3,000 attendees from across the globe and across disciplines. There may be no more perfect platform to showcase, celebrate, and endorse inclusivity and to promote this vision for the future of evaluation and the greater good.
As AEA's Statement on Cultural Competence in Evaluation reads: "Evaluators have an ethical obligation to ensure stakeholders in all aspects of the evaluation process fully understand their rights and inherent risks ... Vigilance to securing the well-being of individuals and their communities is essential." This statement can indeed be applied to leading the charge for ensuring diversity and thoughtful impact in 2014. The deadline for proposal submission is March 17, 2014. We're waiting anxiously for your submissions and hope to see them come to life this October in Denver! More importantly, we can't wait to see how your submissions change the face of evaluation and contribute to the evolution of the field.
eLearning Update - Discover Upcoming eStudy Courses and Coffee Break Demonstrations
From Alexa Schlosser, AEA Headquarters
Our eStudy program is made up of in-depth virtual professional development courses. Below are February's eStudy offerings:
eStudy 038: Reality Counts: Participatory Methods for Engaging Vulnerable and Under-Represented Persons in M&E - Mary Crave, Kerry Zaleski, and Tererai Trent
Feb. 5 and Feb. 12
1-2:30 p.m. ET
Join us to learn several participatory methods for engaging vulnerable or under-represented persons who often do not have a voice in needs assessments or program monitoring and evaluation. This eStudy will occur in two 1.5-hour sessions and will include a brief task before the first session.
Read more and register.
eStudy 039: Practical Strategies for Managing Problem Behavior in Small Groups - Bob Kahle
Feb. 18 and Feb. 20
3-4:30 p.m. ET
This eStudy will describe methods of managing dominant, cynical, and other problem behavior in focus groups, planning sessions, and meetings of all types. It will occur in two 1.5-hour sessions and will include preparation materials sent before, between, and after the sessions.
Read more and register.
____________________________________________________________________________________
Our Coffee Break Webinars are short, 20-minute presentations of tools or tips we think evaluators will find helpful in their work lives. Let's take a look at what's in the pipeline for February:
Thursday, Feb. 6
2-2:20 p.m. ET
Jeff Wasbes, the SUNY Charter Schools Institute's Director of Performance and Systems Analysis, will share the institute's method for using regression analysis within its school performance accountability framework. After briefly discussing the development of the analysis from a theoretical framework to a practical measure of school performance, Jeff will focus the balance of his discussion on the complications that real-world data and context present in interpreting and using the analysis.
Thursday, Feb. 13
2-2:20 p.m. ET
Stan Capela will share with participants a variety of techniques that he has learned in his 30-plus years working in the evaluation field. Stan will speak about his experience as an internal evaluator in a large nonprofit agency, as well as ideas he picked up as a peer reviewer for the Council on Accreditation in more than 30 states, Canada, Germany, Guam, and Japan. By the end of the session, participants will come away with techniques for communication and engagement, as well as approaches for adapting to changing situations and creating a culture that fosters positive results.
CBD172: Embracing Paper-Free Evaluation: Applications for Tablet-Based Data Collection - Caren Oberg
Thursday, Feb. 20
2-2:20 p.m. ET
Caren S. Oberg, owner of Oberg Research, LLC, will highlight four tablet apps that streamline and strengthen data collection: TrackNTime, QuickTap Survey, Sticky Notes, and Story Kit.
CBD173: Doing It Virtually: Online Tools for Managing Multisite Evaluation - Audrey Rorrer
Thursday, Feb. 27
2-2:20 p.m. ET
Audrey Rorrer is lead evaluator for the College of Computing and Informatics Center for Education Innovation at the University of North Carolina at Charlotte, where she has designed and built online evaluation toolkits for computer science education programs. She will demonstrate Google tools as a user-friendly and cost-effective option for multisite evaluation, and she will share lessons learned in using them to streamline the evaluation process, addressing their benefits and challenges.
You can pre-register for these webinars by clicking the links above!
Potent Presentations Initiative - Now Craft Your Message
From Stephanie Evergreen, Potent Presentations Initiative Coordinator

Whether you've been thinking about your 2014 presentation since the 2013 conference or you're the type to wait until the day before the deadline, the Potent Presentations Initiative can help your proposal.
The proposal requirements aren't steep — just a title, an abstract, and a relevance statement. However, that title and abstract will (hopefully!) be published in the conference program nine months down the road, which means now is the time to think carefully about your message.
A crisp, concise message lets your future audience know what they are getting into when they sit down in your session room. In fact, mismatches between the title/abstract and the actual session content tend to be one of the larger annoyances for conference attendees. I know, I know, your content evolves over time, but the boiled-down essence of your message really won't change, so long as you know what it is. To find out, try this exercise: The Six-Word Presentation Story.
The idea of a six-word memoir comes via Smith Magazine, which asked folks of all stripes to summarize their lives in six words. What a challenge! It was inspired by the six-word story famously attributed to Ernest Hemingway: "For Sale: baby shoes, never worn." We aren't all Hemingways, but we can follow his lead in getting to the core of our proposed content under the constraint of just six words.
Here are a few others for your motivation:
Secret to life: marry an Italian. —Nora Ephron
Well, I thought it was funny. —Stephen Colbert
My life story - spay or neuter. —Bob Barker
Now it's your turn. What six words (more or less) can you use to distill your presentation? That's your title. What's the backstory behind those six words? That's your abstract. Those pieces are the basic story upon which you will elaborate when you present in October. Can't wait to see you there!
New Jobs & RFPs from AEA's Career Center
- Long-Term Evaluation Specialists, Various Countries at Management Systems International (Washington, D.C.)
- Monitoring and Evaluation Manager at UMCOR (New York City)
- Evaluation Senior Consultant at TCC Group (New York City)
- Higher Education Analyst II at Duke University (Durham, N.C.)
- Monitoring & Evaluation Specialist at The Asia Foundation (San Francisco)
- Project Director II - Research & Evaluation at Bryan Building/CMHSR (Worcester, Mass.)
- Sr. Research Manager at Walter R. McDonald and Associates Inc. (Rockville, Md.)
- Senior Administrator, Program Evaluation at Orange County Public Schools (Orlando, Fla.)
- Senior Research Specialist at Texas Higher Education Coordinating Board (Austin, Texas)
- Monitoring & Evaluation Associate, Forests Initiative at World Resources Institute (Washington, D.C.)
Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.
About Us
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The association's mission is to:
- Improve evaluation practices and methods.
- Increase evaluation use.
- Promote evaluation as a profession.
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
Phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only)