Dec. 2015 | Vol 18, Issue 12
Message from the Executive Director: Locating the Right Space for AEA's Conferences
From Denise Roosendaal, AEA Executive Director, and Sydney Vranna, AEA Events Services Manager
According to the feedback we have received, Evaluation 2015 was a strong success (always with room for improvement, of course). The AEA Management Team enjoys serving AEA's members and the conference attendees. In conversations with members over the past two years, we have noticed a real interest in how the conference sites are chosen and how arrangements with the hotels are negotiated or managed. For this month's column, I have teamed up with AEA's events services manager, Sydney Vranna, to outline the factors that are considered when choosing sites for the AEA conference. In this article, we will address the factors for selecting the space. In a future article, we will address how the space is negotiated and managed to match the specific needs of AEA.
Diversity - Diversity and the International Year of Evaluation: The Year in Review
From Zachary Grays, AEA Headquarters
As we bring the International Year of Evaluation to a close, it goes without saying that 2015 has been an especially exciting year here at AEA and in the evaluation community at large. There has been much cause to celebrate. The Graduate Education Diversity Initiative (GEDI) saw its largest cohort yet, with 11 scholars poised to take the evaluation world by storm. (More to come from this group in 2016 in this column. Stay tuned!) We partnered with the Center for Culturally Responsive Evaluation and Assessment (CREA) to offer a unique strand of professional development workshops during Evaluation 2015 that proved popular among attendees. We welcomed the very first Voluntary Organization for Professional Evaluation (VOPE) to participate in the AEA International Partnership Program. Finally, Evaluation 2015 itself, AEA's annual conference, brought more than 3,500 attendees from more than 77 countries together in Chicago to celebrate the International Year of Evaluation, and its virtual conference component brought session access to over 1,500 registrants from across the globe. By all measures, this has been a banner year in spotlighting the diversity of individuals in the evaluation field, creating inclusive environments for practitioners, and highlighting exemplary, culturally responsive evaluations conducted in diverse, global communities.
There were many highlights this year at AEA; here are just a few.
Potent Presentations Initiative - State of the [Presentation] Art
From Sheila B. Robinson, Potent Presentations Initiative Coordinator
Potent Presentations (p2i) is looking forward to some exciting updates in 2016. In thinking about keeping our content and resources relevant, I often do a quick Google search to find out what people are talking about in the presentations space outside of our world of evaluation.
So, what's going on these days with regard to presentations?
Continue Reading
International Policy Update - Global Evaluation Week a Huge Success
From Mike Hendricks, AEA Representative to the International Organization for Cooperation in Evaluation (IOCE), with contributions from Jim Rugh, EvalPartners Co-Coordinator
Greetings from Kathmandu, Nepal, where the very exciting Global Evaluation Week has just wrapped up. AEA was well-represented here, specifically by John Gargani, AEA president-elect; Mike Hendricks, AEA representative to IOCE and EvalPartners; Bianca Montrosse-Moorhead, AEA representative to (and global co-chair of) EvalYouth; Svetlana Negroustoueva, AEA representative to EvalGender; and Jim Rugh, coordinator of EvalPartners. The week consisted of four separate, but highly related, events.
Book Profile - Credibility, Validity, and Assumptions in Program Evaluation Methodology
Apollo Nkwake is the author of Credibility, Validity, and Assumptions in Program Evaluation Methodology, a new book published by Springer.
From the Publisher's Site:
This book focuses on assumptions underlying methods choice in program evaluation. Credible program evaluation extends beyond the accuracy of research designs to include arguments justifying the appropriateness of methods. An important part of this justification is explaining the assumptions made about the validity of methods. This book provides a framework for understanding methodological assumptions, identifying the decisions made at each stage of the evaluation process, the major forms of validity affected by those decisions, and the preconditions for and assumptions about those validities.
Though the selection of appropriate research methodology is not a new topic within social development research, previous publications discuss only the advantages and disadvantages of various methods and when to use them. This book goes beyond other publications to analyze the assumptions underlying actual methodological choices in evaluation studies and how these ultimately influence evaluation quality. The analysis is supported by a collation of assumptions drawn from a case study of 34 evaluations. Due to its in-depth analysis, strong theoretical basis, and practice examples, "Credibility, Validity, and Assumptions" is a must-have resource for researchers, students, university professors, and practitioners in program evaluation. Importantly, it provides tools for the application of appropriate research methods in program evaluation.
From the Author:
I am an evaluator, and I love what I do. In my career, nothing has fascinated me like assumptions - the assumptions of stakeholders and evaluators about programs and evaluations. Unexamined assumptions can be a huge risk to program success and to helpful evaluations. For a while now, my aspiration has been to encourage and contribute to a conversation on how to work with assumptions in program evaluation. A starting point for this conversation is getting to a common understanding of what the critical assumptions are - what is worth examining and what is not. In this book, I propose a typology of evaluation assumptions structured according to a cycle of decision points in an evaluation process. This is a follow-up to "Working with Assumptions in International Development Program Evaluation" (2013), in which I introduced some typologies of program assumptions and tools for examining them. I hope these resources will encourage and support evaluators in examining our own assumptions about the programs we evaluate and the methods and tools we use.
About the Author:
Apollo M. Nkwake is senior manager for monitoring and evaluation (M&E) at African Women in Agricultural Research and Development (AWARD). He joined AWARD from Tulane University, where he was a research associate professor for M&E. Prior to that, he held senior M&E advisor positions at World Vision United States; University Research Co., LLC; and JSI Research and Training Institute. He has research/M&E field experience with USAID, Bill and Melinda Gates Foundation, World Bank, DFID, UNICEF, and World Vision programs in Africa, Asia, and Latin America. Nkwake earned his Ph.D. in social development from the University of Cape Town and holds the Canadian Evaluation Society's Credentialed Evaluator designation.
New Jobs & RFPs from AEA's Career Center
- VP/Director of Outcomes and Evaluations at Choices Inc. (Indianapolis, Indiana)
- Methods Advisor at Independent Evaluation Group, The World Bank Group (Washington, D.C.)
- Security and Operations Manager at Social Impact (Arlington, Virginia)
- Department of Education Grant Evaluator at Brescia University (Owensboro, Kentucky)
- Monitoring and Evaluation Manager at Water.org (Kansas City, Missouri)
- Bilingual Research Assistant at Harder+Company (San Diego, California)
- Database and Evaluation Specialist at The Family Partnership (Minneapolis, Minnesota)
About Us
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The association's mission is to:
- Improve evaluation practices and methods.
- Increase evaluation use.
- Promote evaluation as a profession.
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
Phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only)
Welcome, New AEA Members!
Click here to view a list of AEA's newest members.
Important Note
To ensure this newsletter reaches you every month, add info@eval.org to your email contacts!