Newsletter: June 2013 | Vol 13, Issue 6
A Serendipitous Route to Evaluation

Dear colleagues,
It's summer! And, for some of us, summer prompts us to think back to those wonderful childhood years — riding bikes, swimming, and long, leisurely days of just doing nothing. I'm in this reminiscent mode partly because I'm packing up my house to move — downsizing for a new stage of my life. But it's also prompted me to think of how and why I became an evaluator as well as how I've been lucky enough to have a career that largely reflects my interests and values.
Here's my story: I went to college in the late 1960s when the United States was changing quite a bit. I couldn't decide on a major. I skipped from French to biology to general liberal arts and, finally, to political science with quite a bit of history and sociology along the way.
I had no idea of what I was going to do with it. So, when I graduated, I continued with what I was good at: school! I attended graduate school for political science with an interest in American studies. Then, it happened! I took a course in political socialization, and, while writing a paper on how children develop their political beliefs, my professor said: "Some people are coming from Washington to discuss government programs with children over in the psychology and education departments. You should go hear them." I went. They talked about the evaluation of Head Start, and I was sold! Wow! It was refreshing to be able to do something in government that could make a difference!
I began to investigate and found that the emerging field of evaluation had a Ph.D. program in the educational psychology department. I went — focusing on child development and evaluation — though never having had an undergraduate course in education or psychology. Nevertheless, it was the perfect combination for me — being involved in improving government policies and programs, using my writing and analytic skills, and getting out of the university into the "real world" of programs, clients, deliverers, managers, and decision makers.
I'm writing about my experience because my theme asks us to examine how our backgrounds or those of evaluators from other disciplines influence our own theories and approaches to evaluation. Although I earned my Ph.D. in educational psychology, my political science background and my teaching in public administration for many years have had a major effect on my views about evaluation and my practice. I was aware of the complexity of policymaking, the incremental nature of change in the United States, and the competing responsibilities of elected officials and public administrators. I worked with city governments, state agencies, and federal programs mostly concerned with economic progress for women and children.
I'm very passionate about this topic and will talk more about disciplinary influences in my presidential address at Evaluation 2013. But, having studied career choice, I know that my serendipitous route to evaluation is no different from that of most evaluators. No one dreams of becoming an evaluator as a child! How did you become an evaluator? How do your history, academic background, talents, and skills influence your approach to evaluation?
One more note: AEA just finished a great board meeting in Atlanta, coinciding with the Evaluation Institute. It was also our first board meeting with our new association management company, SmithBucklin, and its representatives. Denise Roosendaal, our interim executive director, will introduce herself elsewhere in this newsletter. We're very happy to be working with Denise and her team, which is now our team.
Sincerely,
Jody
Jody Fitzpatrick
AEA 2013 President
Message from the Executive Director: A Chance to Create New Opportunities
Dear AEA colleagues,
I am thrilled to be serving as your interim executive director. By all accounts, AEA is a vibrant and growing organization with strongly held values, reflecting the profession and the academic roots of its individual members. The members whom I have had the pleasure to meet or interact with have all expressed a similar perspective of AEA: a solid member value proposition that emphasizes high-quality programs at reasonable prices, diversity and inclusion, high member engagement, and an environment that fosters individual and professional growth. You should be proud of your leadership for the path the organization has taken thus far.
Rest assured that the transition between management companies is proceeding nicely. The phones, emails, and website all transitioned the week of May 20 with relatively little impact on members. July 1 is the next milestone, when the final transition pieces come together. We have posted a list of staff members and their primary areas of responsibility, but if you are ever in doubt about whom to contact, feel free to reach out to our customer service representative, Zachary Gray.
As with any change, this transition is a chance to honor the past while creating new opportunities for the future. AEA is financially strong and in a good position to navigate this change. You can expect the same high level of customer service and strong programming. The website will continue to be maintained as a terrific resource for members and professionals in the field, even if a few areas are temporarily under reconstruction, and high-quality educational programming will be maintained throughout the transition and beyond.
I look forward to serving the organization and its members. Please feel free to contact me with any questions about the transition, the new management team, or the continuation of member benefits.
Sincerely,
B. Denise Roosendaal
AEA Interim Executive Director
New AEA Office Opened May 20!
On May 20, AEA phones were ringing at its new office in Washington, D.C.! At the same time, AEA's new management team assumed their new roles. Now, when you call AEA, you will hear new voices with the same commitment to serving the association. Please help welcome your new management team and bear with them during the transition. If the new people at the other end of the line don't immediately have the answer you need, they will find it and get back to you as soon as possible.
Additionally, you may notice a slightly different look to eval.org. AEA moved the website to a more robust platform that will allow AEA to add new functionality as it moves forward.
To learn more about the management company transition and what it means for AEA, read the Transition FAQ.
AEA's Values - Walking the Talk with Nicole Vicinanza
Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.
AEA's Values Statement
The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.
i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.
ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.
iii. We value a global and international evaluation community and understanding of evaluation practices.
iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.
v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.
vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.
I'm Nicole Vicinanza, current AEA board member and senior principal with JBS International. I have been an AEA member since 1990 and have served as job bank coordinator, conference policy committee member, conference chair, eLearning working group member, occasional AEA365 contributor, and all-purpose volunteer at the conferences. In my work with JBS, I oversee evaluation, research, and evaluation technical assistance supporting state- and federally funded programs.
Though I have been fortunate enough to work with the same company and core group of evaluation and research professionals since 1996, the exact content of the work I do, and the skills I need, never seem to be the same two days in a row. I often find myself having to make decisions in new situations with little time to reflect, doing the work that the time and circumstances will allow, and working with stakeholders and clients for whom evaluation is only one of many competing demands. The concepts underlying the AEA guiding principles serve as a foundation for me to use in making decisions about my practice as an evaluator, a yardstick against which to measure my work, and a tool to help me explain to non-evaluators the perspective that underlies our profession.
The values of contributing to decision-making processes, program improvement, policy formation, and enhancing the public good are particularly important when I need to make choices about whom to engage in planning, what evaluation questions to ask, what measures to use, and how to identify important issues in the data I collect. Of course, when I'm just trying to get things done, I don't usually think about the values as they're stated — usually the questions I consider are less lofty: "Who can make things change?" or "Who has a different perspective that needs to be heard?" or "Why would anyone care about this question (or result, or issue)?" However, if the answers to these questions seem to point back to the AEA core values, then I feel reassured that I am making a good decision.
I work with both new evaluators and non-evaluators, and my role at JBS often involves evaluation, research, and technical assistance projects that engage participants and evaluators from many cultures. The AEA core values of inclusiveness, diversity, and development of evaluation professionals are important to me, as they inform how I organize my work and whom I engage for which evaluation tasks. When I need to decide how to include a new (or newer) evaluator in a task, two key factors I consider are the extent to which the task will contribute to their development as a professional and whether they will bring diversity (of culture, background, perspective, or methodological approach) to the project. Considering these values in my day-to-day decisions strengthens the teams I work with and makes the results of the evaluations we produce more useful and more used.
Policy Watch - Looking for Policy in All the Right Places - 2013 Update
From Cheryl Oros, Consultant to the Evaluation Policy Task Force

It is the season for Congress to field draft legislation, and at the end of May, AEA endorsed provisions of the proposed Foreign Aid Transparency and Accountability Act of 2013 that address steps to enhance the conduct and quality of the evaluation of United States foreign aid. AEA applauded efforts to ensure that:
- guidelines for the conduct of evaluations are created;
- evaluation agendas are set;
- appropriate measures are established;
- evaluation questions are articulated;
- appropriate evaluation methods are identified to answer those questions;
- professional evaluation standards are incorporated when conducting evaluation studies;
- plans are made for dissemination of evaluation findings to foreign aid staff and the public; and
- sufficient resources are made available for quality evaluation.
The EPTF is looking for other opportunities to encourage integrating evaluation into federal programs as an essential feature of good government. One of AEA's most valuable resources is the knowledge base of its members, and you can help by letting us know where opportunities exist. It is best to contact legislative staff when they are just beginning the bill drafting process, so if you know of relevant federal legislation in the works, please email me directly at evaluationpolicy@eval.org.
Some of you may hear about legislation in the making from your agency or your colleagues. While we will not be able to pursue every opportunity, the more we know about where those opportunities lie, the more we can make thoughtful and strategic decisions about next steps and can work with potential partners in pursuit of policy influence.
For those who are somewhat new to the legislative process, the following summary may help you understand where to look for legislatively based evaluation policy. Authorization laws establish federal programs, stipulating what must/will/may be done and how much money may be spent. Thus, major authorization bills are our first targets of opportunity for making evaluation an integral part of the programs themselves. However, the amount of money actually available for a program is usually limited to the amount appropriated for it in appropriations legislation. Thus, if we want evaluation to be part of program administration, we have to make sure the appropriations legislation provides funding for it. Funding for evaluation is often only implicitly part of funds appropriated for general management, which leaves evaluation funding decisions up to executive branch officials.
However, congressional authorization and appropriations committees can provide additional guidance in what is known as "report language," expressions of congressional intent in the reports issued by the committees at the time bills are sent to the floor for voting or to the president for signature. While non-binding, such guidance carries significant weight in the minds of executive branch officials responsible for program implementation.
Face of AEA - Meet John M. LaVelle
AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights John M. LaVelle.
Name: John M. LaVelle
Affiliation: Claremont Graduate University's School of Behavioral and Organizational Sciences
Degrees: M.S., A.B.D.
Years in the Evaluation Field: 10 years
Joined AEA: 2003
AEA Leadership Includes: First curator of the AEA365 Tip-A-Day Blog; Graduate Education Diversity Internship (GEDI) Program Liaison; Co-Convener of the Southern California Evaluation Association
Why do you belong to AEA?
"I belong to AEA because it is a vibrant and welcoming community filled with people who want to make a positive difference through their work practicing and teaching evaluation. I think the many topical interest groups (TIGs) reflect the diverse ways in which evaluators can impact programs and communities worldwide — from teaching of evaluation to exploring approaches for inquiry to the diversity of evaluation contexts. I also appreciate how accessible the AEA leadership and thought leaders are to members and how willing they are to support and engage with new and developing evaluators."
Why do you choose to work in the field of evaluation?
"When I first learned about evaluation, I was drawn to the variety of contexts in which the skill set could be applied. The first evaluations I worked on spanned topic areas from health care to psycho-social interventions to education to local government. One aspect I really enjoyed was working with clients to better understand their programs, develop their questions, and then answer those questions with good evidence. As I got more involved in the field, I became interested in evaluation at the profession level. How are evaluators trained and educated? In what ways can we outreach to potential evaluators - people who could be evaluators but have not heard of the field."
What's the most memorable or meaningful evaluation that you have been a part of?
"One of my favorite evaluations was for a small private college in Southern California. It had been awarded a large grant to develop programs related to social responsibility and social justice. I was fortunate enough to be brought on early in the project when the college was beginning to plan its various initiatives and programs. This allowed me to help include an evaluation component in the program design, as well as work with to identify intended outcomes and use the data that was collected for mid-course program modifications. I was also able to integrate social science theory into the evaluation and program design to help foster discussions about reasonable outcomes for the programs (e.g., changing attitudes, perceived group norms, community engagement, etc.)."
What advice would you give to those new to the field?
"My advice would be to talk to everyone and get involved in AEA and the other professional organizations early in your evaluation career. There are many ways to be engaged: the AEA 365 blog, TIGs, the GEDI program, and presenting at the annual conference, to name a few. Also, consider writing a personal statement of evaluation so you can describe the field and its intentions to people who are unfamiliar with the field (and maybe your family will finally understand exactly what you do!). Above all, be bold, stay curious, and have fun!"
eLearning Update - Who Has Been Participating in the eStudy Program?
From Stephanie Evergreen, AEA's eLearning Initiatives Director
Two years ago this month, AEA launched the Professional Development eStudy program. The program has grown so much in that time, adding new content that fits your needs and providing recordings for attendees to view at their convenience. So who has been participating?
Enrollment patterns are consistent and strong. Each eStudy course draws a mix of familiar and new faces, often attracting people who are not AEA members but who join when they sign up for the eStudy. Naturally, the specific course content may influence how many nonmembers sign up: broad topics such as quantitative methods attract more nonmembers than more evaluation-specific topics do. Likewise, our marketing efforts to those outside of AEA could also influence this rate.
AEA also attracts a substantial and growing proportion of international attendees, averaging a 21 percent international audience. eStudy courses save participants the expense of travel for professional development. Now that the courses are recorded, we've seen participation from people in time zones where the live session occurs during non-business hours; these participants watch the recordings and engage when the time is more reasonable for them.
Of course, AEA would love to see you participate, too. AEA is excited to announce two new eStudy courses:
- Twelve Steps to Data Cleaning with Jennifer Morrow, starting July 29 (six hours of content); and
- Intermediate GIS with Tarek Azzam and David Robinson, Aug. 14 and Aug. 21 (three hours of training).
But don't wait long to sign up. Registration closes a week before the first session.
AEA's Coffee Break lineup is another way to get doses of professional development every week without even leaving your office. Find out what's coming up next.
Diversity - AEA Celebrates 2013 GEDI Program Graduates
From Liz Grater, AEA Headquarters
AEA was proud to celebrate another year of its Graduate Education Diversity Internship (GEDI) Program, hosting a graduation luncheon at the 2013 Summer Institute for this year's cohort. GEDI is designed to bring graduate students from underrepresented communities into the field of evaluation. Interns meet throughout the academic year, both virtually and by attending live events, including the Claremont Evaluation Center's Professional Development Workshop Series in Evaluation and Applied Research Methods, the AEA Annual Conference, the CREA Inaugural Conference on Repositioning Culture in Evaluation and Assessment, and the AEA/CDC Summer Institute.
This ongoing interaction encourages peer support among the interns as well as mentorship from GEDI's program chairs. Throughout the program, interns gain hands-on experience by working with placement sites around the country that are doing outstanding work in the field of evaluation.
This year's cohort was quite impressive and covered a broad range of topics and training in the field of evaluation, all through the lens of how to use culturally responsive evaluation practices. During the graduation luncheon, each intern shared an overview of his or her internship projects and lessons learned about evaluation.
Stewart Donaldson, dean, professor, and director of the Claremont Evaluation Center at Claremont Graduate University, was once again the chair of the GEDI Program. Dr. Ashaki M. Jackson of Girls & Gangs served her first year as director. John LaVelle, Ph.D. candidate and director of External Affairs in the School of Behavioral and Organizational Sciences at Claremont Graduate University, supported the GEDIs as the program liaison.
"I was honored to be asked to co-chair and am so proud to be a part of the GEDI Program," Jackson said. "I've been so impressed by the caliber of this year's interns. I'm excited to continue into my second year and cannot wait to see what next year has to bring."
Congratulations to AEA's 2013 GEDI Program graduates, including:
- D. Pearl Barnett, University of Oklahoma, Urban League of Greater Oklahoma City
- Nnenia Campbell, University of Colorado-Boulder, Spark Policy Institute
- Michelle Corbett, Medical College of Wisconsin, Planning Council for Health and Human Services
- Kwamé Macintosh, Howard University, National Oceanic and Atmospheric Administration
- Saúl Maldonado, University of California-Santa Cruz, JBS International
- Faheemah Mustafaa, University of Michigan, Public Policy Associates
Learn more about the GEDI Program.
Potent Presentations - Begin Preparing for Your Presentation Now
From Stephanie Evergreen, Potent Presentations Initiative Coordinator

It's prep time!
Session notices will be delivered in a matter of days, and, even though you won't actually deliver your presentation for another three months, now is the time to begin your preparation so that you aren't furiously working on the plane.
Right now, get started on these tasks:
- Choose one to three key content points to be conveyed, and then develop notes regarding what you wish to share relating to each key point. Hone your message through the Messaging Model handout.
- Gather photos or images for use in your slideshow.
- Check in with co-presenters on key content points and preparation timeline.
- Expect to hear from your session chair by email.
- Ask about the length of time for your presentation, the discussion time to be reserved for audience questions and a discussant, and the sequence of those events during your session. Papers have about 15 minutes. If you are part of a panel, demonstration, think tank, etc., determine with your chair and co-presenters how much time is to be devoted to what content.
- Ask about your colleagues' presentations and coordinate content to limit overlap and respond to one another's work.
If you are looking for stock images, try Flickr (free with sign-in) and Shutterstock (for a fee).
Distribute the full set of Presentation Preparation Guidelines to your co-presenters.
If you get started on your presentation now, you'll feel more prepared in October, and you'll spend less time holed up in your hotel room working on your session!
Enhancing Evaluation Use: Insights from Internal Evaluation Units
AEA member John Mayne is the author of a new book, Enhancing Evaluation Use: Insights from Internal Evaluation Units, published by SAGE.
From the Publisher's Site:
"This book provides insight from evaluators working inside a range of organizations. They discuss the actual challenges they have faced over the years trying to make evaluation useful and used. Referencing the latest literature, they discuss the strategies they have adopted to address these challenges and enhance the utilization of evaluation in their organizations. Each chapter ends with questions to stimulate thought and discussion about the issues raised."
Key features:
- Presents a wealth of experience-based advice and examples on carrying out and getting the best out of evaluations;
- Connects theory and practice by using real-life case examples to illustrate major processes and issues; and
- Addresses evaluation with an international focus.
From the Author:
"The idea for the book was that much written about evaluation in organizations is written by outsiders such as academics and consultants. But in practice, there are those working 'inside' an organization who play a key role in helping shape, develop, manage, and ultimately make use of the evaluation. The contributions in this book are written by such 'insiders.' The chapters cover a wide range of organizations, from government departments in Scotland, New Zealand, Switzerland, and Canada, to international organizations such as the World Health Organization (WHO) and the International Labour Organization (ILO) to supra-national organizations such as the European Commission.
"The insider perspective and the wide scope of organizations covered make this book unique. Our contributors discuss the different strategies used over a period of time to make evaluation a part of the management of the organization, as well as their successes, failures, and lessons learned. The book highlights the commissioners and managers of evaluations, those who seek evaluations that can be used to improve the strategies and operations of the organization. The aim is to help organizations become more focused on using evaluation to improve policies, strategies, programming, and delivery of public and communal services."
About the Author:
Mayne is an independent adviser on public sector performance and has worked with organizations and jurisdictions around the world on results management, evaluation, and accountability issues. He formerly served in the Office of the Auditor General of Canada as well as the Canadian Treasury Board Secretariat and Office of the Comptroller General. Mayne has authored numerous articles and reports on results management, evaluation, and evaluation methodologies and edited five books in the areas of evaluation, public administration, and performance monitoring. In 1989 and 1995, Mayne was awarded the Canadian Evaluation Society Award for Contribution to Evaluation in Canada. In 2006, he became a Canadian Evaluation Society Fellow.
Visit the publisher's site.
AES 2013 International Conference Program
This year's Australasian Evaluation Society (AES) International Conference Program offers more than 100 presentations comprising a diverse and high-quality program of papers, workshops, oral presentations, roundtables, symposiums, posters, and more. The program covers a broad range of evaluation interests and contexts. Based on their primary focus, the presentations have been categorized into the following strands:
- responsive and responsible practices
- essential skills and understandings
- theory and methodology
- performance measurement systems and strategies
- technology
- evaluation and values
- influence and impact
- building capacity
- large-scale systems and interventions
2015 Is the International Year of Evaluation
We are pleased to inform you that the United Nations Evaluation Group (UNEG) has decided to join EvalPartners in declaring 2015 the International Year of Evaluation (EvalYear).
The decision was taken by UNEG Heads at the 2013 UNEG Annual General Meeting, after Natalia Kosheleva, EvalPartners co-chair and IOCE president, together with the UNEG Task Force on National Evaluation Capacity Development, presented the initiative.
EvalYear will position evaluation in the policy arena, serving as a catalyst for important conversations and thinking, at the international, regional, national, and sub-national levels, about the role of evaluation in good governance for equitable and sustainable human development.
2015 was identified as a strategic year because EvalYear seeks to mainstream evaluation in the development and implementation of the forthcoming Sustainable Development Goals, and other critical, locally contextualized goals, at the international and national levels.
For additional information about EvalYear, please contact Marco Segone.
Best regards,
Marco Segone and Natalia Kosheleva, EvalPartners co-chairs
New Member Referrals & Kudos - You Are the Heart and Soul of AEA!
Last January, AEA began asking, as part of its new member application, how each person heard about the association. It's no surprise that the most frequently offered response is friends or colleagues. You, our wonderful members, are the heart and soul of AEA, and we can't thank you enough for spreading the word.
Thank you to those whose actions encouraged others to join AEA in May. The following people were listed explicitly on new member application forms:
Nancy Kingsbury * Liliana Rodriguez-Campos * Sharon Rallis * Hui-Jeong Woo * Jaclyn Kelly * Erica Morse * Kathleen Haynie * Michelle Mitchell * Tessie Catsambas * Laura Bloomberg * Judy Kelley * Shana Alford * Leah Ersoylu * Carol McPhillips-Tangum * Nick Smith * Michael Newman * Molly Hamm * Megan Tiernan
New Jobs & RFPs from AEA's Career Center
Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.
About Us
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The association's mission is to:
- Improve evaluation practices and methods;
- Increase evaluation use;
- Promote evaluation as a profession; and
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only)