Newsletter: April 2011, Vol. 11, Issue 4


Embracing Environmental Responsibility

Greetings AEA colleagues,


And welcome to springtime, marking the re-awakening of the earth for many of us in the northern hemisphere, while those in the south are entering the quiet times of earth's sleep. May she sleep peacefully.


This month, I would like to continue my musings on the relevance of evaluation to contemporary world events and challenges - and the key role that values play therein - by engaging with several values of widespread importance. These are the values of community, consequence, and civility.  


AEA has long nurtured the development and maintenance of a strong internal community. We take pride in the ways we actively welcome newcomers to the association and first-timers to the conference, for example, through our volunteer ambassadors and conference hospitality suites. Our policies state "We [also] value a global and international evaluation community and understanding of evaluation practices." This month, the AEA Board released a Request for Proposals for an International Listening Project, an initiative aimed at strengthening the international community of evaluators through collaborative, mutually beneficial activities and relationships with multiple international partners. Communities are important because they offer opportunities for shared commitments, alongside safe spaces for dissenting views. There is power in a community's collective presence and thus there is also strength of voice.


Being of consequence is a widely shared value amongst evaluators, more commonly labeled being useful. Just imagine if the intense and divisive political debates that currently dominate the U.S. landscape were actually informed by defensible evaluation studies and data. Just imagine a data-informed discussion of the quality and effectiveness of agricultural subsidies, charter schools, and Medicare. The consequentiality of our work has perhaps rarely been as important, and perhaps rarely as politically challenging.


Now also please imagine if the intense and divisive political debates in the U.S. and elsewhere around the globe were held not only with data, but also with a good measure of civility that included graciousness, consideration, and respect for the right of others to hold a standpoint different from one's own. How can we as evaluators advance the importance of civility to democratic debate in our own work? What opportunities do we have for modeling civil discourse in our own evaluation practices?


Until next month, respectfully,



Jennifer Greene

AEA President, 2011

In This Issue
Policy Watch with George Grob
TechTalk with LaMarcus Bolton
Meet Melanie Hwalek
IOCE Update
Meet Stephanie Evergreen
In the News
Book: Purposeful Program Theory
New Master's Program
Data Den Conference Attendees
New Job Postings
Get Involved
About Us
Quick Links
Policy Watch - Evaluation of International Development
From George Grob, Consultant to the Evaluation Policy Task Force

Earlier this year, the U.S. Agency for International Development (USAID) announced a sweeping new evaluation policy that greatly expands and improves the conduct and use of evaluation as an integral part of USAID's planning, programming, and implementation. Of particular note are:

  • integration of evaluation and program planning
  • definitions and distinctions of various types of evaluations that together cover the life cycle of programs. These include impact evaluations that measure the change in a development outcome that is attributable to a defined intervention; performance evaluations that focus on descriptive and normative questions about what programs have achieved (either at an intermediate point in execution or at the conclusion of an implementation period); and performance monitoring through performance indicators that reveals whether desired results are occurring and whether implementation is on track.
  • requirements for at least one performance evaluation for each major program and untested and innovative intervention; and that major interventions be subject to impact evaluations whenever feasible
  • acknowledgement of the need for both qualitative and quantitative methods
  • a clear statement that no method is superior to others but that methods must be chosen that are appropriate for the program to be evaluated and the evaluation to be performed, and
  • a 3% set aside of major program office funds for conducting evaluations.

This policy is a good example of the influence of AEA's Evaluation Policy Task Force (EPTF). In many ways, the USAID policy mirrors AEA's Evaluation Roadmap for A More Effective Government. This is no accident. We provided a copy of the Roadmap to senior evaluators at USAID and were told that the USAID policy team consulted the Roadmap in developing its policy. Ruth Levine, USAID's Deputy Assistant Administrator, Bureau of Policy, Planning and Learning, who chaired the internal group that prepared the policy, invited AEA's Executive Director Susan Kistler to send an AEA representative to a reception and discussion of the USAID policy hosted by Georgetown University's Mortara Center for International Studies. I was privileged to represent AEA. She also invited EPTF Chair Patrick Grasso to attend. In her opening remarks, Ruth cited AEA's Roadmap as a significant resource for her group's efforts. In turn, what USAID has done here could well serve as a model for other Federal agencies.


This is a necessarily brief summary of a policy that could profoundly impact our country's international development efforts for years to come. I suggest you start with Ruth's blog and then read the policy itself. We will discuss more about evaluation of international policies and programs in future columns. Meanwhile, as always, it will be helpful to hear your comments and concerns.


Go to the EPTF Website and Join the EPTF Discussion List.

TechTalk - AEA's Use of Technology to Go Green
From LaMarcus Bolton, AEA Technology Director


This past week we celebrated EarthWeek with our colleagues in the Environmental Program Evaluation (EPE) Topical Interest Group. On aea365, we learned from EPE colleagues about their efforts at evaluating environmental programs, about tools for environmental evaluation, and about efforts to green their own work and workplace. Beverly Parsons led the April Thought Leader Discussion and asked us all whether environmental sustainability should be a core value for evaluators. Expanding on this theme, I thought that I would share ways in which AEA has used technology to lessen its environmental impact:  

  • Even as the association has grown, AEA uses less total paper than we did in 2007, due to expanded use of online journal access, electronic registration and membership renewal, and the eLibrary for auxiliary materials distribution.
  • We're working to lessen the costs and environmental impact of travel through increasing the availability of essential training when and where you need it. We started in January 2010 with our Coffee Break Webinars. Stay tuned this July for the kickoff of full-length online workshops!
  • We've expanded our staff capacity by drawing on the expertise of distance-based team members from around the country. Through using collaborative technologies and communications tools (some of which will be demonstrated in our upcoming AEA/CDC Summer Institute session), we've been able to diversify the knowledge, expertise, and background of the AEA staff, while limiting commuting.
  • We're exploiting technology to improve AEA communications and give members more ways to stay abreast of what is happening in the field and to connect both with the leadership and with one another. I've already mentioned the Thought Leader Discussion and aea365; you also receive AEA's monthly newsletter and calendar, and perhaps you subscribe to EVALTALK or AEA's LinkedIn Group.

Do you have ideas regarding how to use technology to lessen your environmental impact? Take a moment to share them by adding to the comments on Annelise Carleton-Hug's aea365 post on Greening Your Evaluation Practice. Do you have ideas for what else you would like to see AEA doing to green our evaluation practice? Share them with me at and I'll be sure that they get to the right parties.


Go to AEA's Go Green Page to learn More About Our Greening Efforts 

Face of AEA - Meet Melanie Hwalek
AEA's 6,500 members worldwide represent a range of backgrounds, specialties and interest areas. Join us as we profile a different member each month via our Questions and Answers column. This month's profile spotlights Melanie Hwalek, an independent consultant who has actively served both as TIG chair and board member.



Name, Affiliation: Melanie Hwalek, CEO, SPEC Associates

Degrees: B.S. Biology (Niagara University); M.A./Ph.D. Social Psychology (Wayne State University)

Years in the Evaluation Field: 30

Joined AEA: More than 20 years ago, pre-dating the merger of the Evaluation Research Society (ERS) and the Evaluation Network (EN) to form AEA in 1986. 

AEA Leadership Includes: Board of Directors (2004-2006), Chair, Independent Consulting Topical Interest Group (1999-2000)


Why do you belong to AEA?

"Camaraderie: Being with others in the U.S. who want to use social research skills to make a difference - to help solve real problems in real communities - was the real impetus for my joining AEA. When I first learned about AEA and found the Independent Consulting TIG, I knew I was home.


Professional development: There is no place in the world where so many evaluators from so many backgrounds come together - through the annual meeting, through AEA-sponsored journals, through EVALTALK. AEA has created an amazing resource for continuing professional education in the field of program evaluation.


Business development: Presenting at the annual meeting and being listed on AEA's online "Find an Evaluator" service are both free marketing opportunities. AEA's annual meeting brings together both evaluators and users of evaluation. Being a member in good standing provides a "good housekeeping seal of approval."


Shopping for evaluation partners: SPEC Associates does big work with a small core of full-time salaried staff. I am always looking for other evaluators who have content expertise different from ours. I find experts at AEA's international meeting, through the journals, and via EVALTALK."


Why do you choose to work in the field of evaluation?

"I've always wanted to solve social problems. I joined the Peace Corps naively thinking that I could help to solve poverty in the world. I learned that I was much better "behind the scenes" doing research and writing than in direct service work. Program evaluation presented the perfect match for my skills - providing solid, defensible information to organizations that were working on the ground to solve social problems."


What's the most memorable or meaningful evaluation that you have been a part of - and why?

"It isn't one particular evaluation. It's when evaluation comes alive! When the evaluator and evaluation user find excitement together deciding questions to ask, authentically considering evaluation findings, using evaluation to learn. I have had this experience regardless of evaluation design, in very shoe-string evaluations, and in complex national work."


What advice would you give to those new to the field?

"Love what you do. Seek advice from veterans. Decide how you want to love evaluation: through academic study, teaching, work in the trenches, internal to an organization, or on your own as an independent consultant. If you can't decide, take Gail Barrington's Evaluation Consulting workshop at AEA's annual meeting."

If you know someone who represents The Face of AEA, send recommendations to AEA's Communications Director, Gwen Newman, at
IOCE Promotes More Holistic Impact Evaluation of International Programs

From Jim Rugh, AEA's Representative to the International Organization for Cooperation in Evaluation


In response to a critical paper by the Center for Global Development in 2006 ("When will we ever learn? Improving lives through impact evaluation") and others by the Abdul Latif Jameel Poverty Action Lab (J-PAL) at MIT (dedicated to the "use of Randomized Evaluations to answer questions critical to poverty alleviation"), the major international donor agencies formed a Network of Networks for Impact Evaluation (NONIE). NONIE brought together three networks: OECD/DAC bilateral donors (like USAID), multi-lateral development banks (like the World Bank), and UN agencies (like UNDP and UNICEF). In 2008 a fourth network led by IOCE was invited to join, representing professional evaluators, especially those working in developing countries.


A highlight of the collective work of NONIE was the publication in March 2009 of "Impact Evaluations and Development: NONIE Guidance on Impact Evaluation." Though the authors acknowledge that impact evaluation is just "one tool within the larger toolkit of monitoring and evaluation (including broad program evaluations, process evaluations, ex ante studies, etc.)" the Guidelines' stated purpose was to share methodological approaches most appropriate for impact evaluations. 


The Guidance states: "Overall, for impact evaluations, well-designed quantitative methods are usually preferable for addressing attribution and should be pursued when possible." While the NONIE Guidance does mention that there are nonquantitative techniques and mixed-methods approaches, there is a clear preference for the promotion of randomized controlled trials (RCTs): "NONIE aims to promote the use of this more specific approach by its members within their larger portfolio of evaluations."


The IOCE Board has expressed its concerns about the bias towards RCTs within the international donor community. One such expression can be seen in the recent IOCE newsletter, accessible on the home page of the IOCE site. Another was a plenary presentation I made on behalf of IOCE at the recent NONIE annual conference in Paris.


One of the main points I made is this: to attempt an impact evaluation of a program using only one pre-determined tool is to suffer from methodological myopia. Moreover, to prescribe to donors and senior managers of major agencies a single preferred design and method for conducting all impact evaluations can have, and has had, unfortunate consequences for everyone involved in the design, implementation, and evaluation of international development programs.


Fortunately, IOCE is not alone in promoting more holistic approaches to impact evaluation. We were encouraged by complementary presentations by others at the NONIE conference, including the opening keynote address by François Bourguignon, Director of the Paris School of Economics, and a presentation by Leon Hermans of Delft University of Technology in the Netherlands, among others.

Meet AEA's Director of eLearning Initiatives  

Welcome to AEA's newest staff member, Stephanie Evergreen. In her new role, Stephanie serves as Director of AEA's eLearning Initiatives. You might know her name already - as an independent consultant, as founding chair of AEA's newest topical interest group for data visualization and reporting, or from her work with the Western Michigan Evaluation Center, where she hosts the weekly Evaluation Café presentation series.

Stephanie currently is completing her dissertation, which focuses on the role of graphic design in evaluation reporting. She is active on the Human Subjects Institutional Review Board, serves as a peer reviewer for many journals and federal grant programs and is a frequent blogger, twitterer and presenter.

AEA's popular Coffee Break Demonstration Series debuted last year and quickly became a member favorite. More than 300 participants attended the very first session with representation from around the world. There have been 38 webinars to date with 3-4 planned each month - and with a longer format to be unveiled later this year.

Stephanie will oversee this growing professional development and member engagement initiative, coordinating the sessions and maintaining the online archive. She schedules the events, handles the logistics, and serves as moderator. So if you've called in already, you know her voice and have heard her enthusiasm.

"It's a great way to offer professional development opportunities outside of the conference," says Stephanie. "It's highly accessible and what I really appreciate is that AEA recognizes this is the way of the future. A lot of organizations aren't there yet."

The 20-minute Coffee Break sessions feature tools, websites and software of interest. Each webinar draws an average of 82 participants, with 12 percent from outside the U.S. The longer webinars, meanwhile, will provide opportunities for members to engage with experts and ask questions in a highly interactive environment. 

Last year, AEA was spotlighted in a book that explored the popularity and practicality of new social media platforms. Social Networking for Nonprofits: Increasing Engagement in a Mobile and Web 2.0 World noted that AEA was attentive to the needs of its members as well as responsive to emerging trends.

If interested in past sessions, please visit AEA's online archive. And, if interested in presenting, please email

In the News

AEA member Isaac Castillo is the recipient of a 2011 Lifetime Achievement Award from the Superstar Foundation, which recognizes social services professionals who demonstrate outstanding performance in promoting transformational relationships. Castillo currently serves as Director of Learning and Evaluation at the Latin American Youth Center in Washington, DC, where he helped develop a mentoring program and has overseen numerous evaluations in areas including youth development, youth violence, substance abuse, educational strategies for urban youth, and gang prevention and intervention. Castillo has shared his knowledge and expertise with fellow non-profits, and his work has been recognized in the Chronicle of Philanthropy, Youth Today, and The Wall Street Journal. Congratulations, Isaac!


AEA member Robert Stake will be honored next month with a Special Career Award in Qualitative Inquiry, presented to a scholar whose contributions to qualitative scholarship and practice have spanned many decades and who has maintained a strong influence on the field throughout their career. Stake is a Professor of Education at the University of Illinois, Urbana-Champaign, and serves as Director of the Center for Instructional Research and Curriculum Evaluation, which he helped found. He is author or co-author of eight books and is the 1988 recipient of AEA's Paul F. Lazarsfeld Evaluation Theory Award.

Purposeful Program Theory

AEA members Sue Funnell and Patricia Rogers are authors of a new book published by Jossey-Bass, Purposeful Program Theory: Effective Use of Theories of Change and Logic Models.


From the Publisher's Site:

"Between good intentions and great results lies a program theory - not just a list of tasks but a vision of what needs to happen, and how. Now widely used in government and not-for-profit organizations, program theory provides a coherent picture of how change occurs and how to improve performance. Purposeful Program Theory shows how to develop, represent, and use program theory thoughtfully and strategically to suit your particular situation, drawing on the fifty-year history of program theory and the authors' experiences over more than twenty-five years."


From the Authors:

"There are some good guides to using a particular type of program theory but nothing that showed the range of ways it can be done," says Funnell. "We wanted to be able to point people to a book that sets out different options for developing, representing and using program theory, and also helps people to choose the appropriate approach for their situation. We examine variations on four major ways of representing theory - the 4 or 5 box 'pipeline', the chain of intermediate outcomes, realist matrices and narratives. We discuss different approaches to identifying or developing a program theory, and how to address some common challenges such as stakeholder reluctance to disclose. Throughout the book we have used different practical examples from around the world and in different policy contexts that show how the general principles of program theory have been adapted in different circumstances."


"We've particularly focused on common challenges when using program theory," says Rogers, "such as how to adequately address complicated and complex aspects of programs and policies, and how to use archetypes and building blocks to streamline the process of developing logic models across an organization. We encourage people to distinguish between theories about how change occurs and theories about how a program activates these causal processes - this is essential for using program theory for evidence-based policy."


About the Authors:    

SUE C. FUNNELL has more than 35 years of experience in program design, evaluation, and performance measurement, working at senior levels within government and, since 1992, as a successful consultant. She has played a pivotal role in introducing program theory in Australia since the 1980s. Sue's work has been recognized by the Australasian Evaluation Society with its ET&S Award for Outstanding Contributions to Evaluation.


PATRICIA J. ROGERS is a professor of public sector evaluation and has used program theory for more than 25 years across a range of program types, increasingly for large, complicated and complex interventions. Patricia's work has been recognized by the American Evaluation Association with its Myrdal Award for Evaluation Practice. She currently serves on AEA's Board of Directors.


AEA members receive a 20 percent discount when ordering from Jossey-Bass. Please use the promotional code "AEA10" when ordering online or via phone (1-877-762-2974). If ordering from outside the U.S., please email the AEA office for international ordering instructions. 


Go to the Publisher's Site 

New Master's Program for Quantitative Literacy

The University of Illinois at Urbana-Champaign is launching a new online master's program in Quantitative Literacy designed for professionals required to engage in data-driven decision making.


"The creation of an online master's degree with a specialization in quantitative literacy will allow us to appeal to a new audience of individuals interested in earning a degree in research and evaluation but not inclined to become a researcher," explains Thomas Schwandt, who chairs the university's Department of Educational Psychology.


The degree - designed for the practitioner - targets a wide audience: P-16 educators, educational administrators, and professional staff in educational institutions, social service agencies, and public health agencies. Largely as a result of their job requirements, these professionals must acquire a deeper understanding of applied measurement, statistics, and evaluation methodology in the contemporary climate of evidence-based and data-driven decision making.


"There has been a veritable explosion of interest in evidence-based and data-driven decision making in public school educational settings since the passage of the federal No Child Left Behind Legislation," says Schwandt. "Pressures emanating from accountability movements as well as ideas about developing the knowledge society have encouraged similar interests in all sectors of higher education and the not-for-profit sector. Many of these individuals have never had basic training in quantitative methods or data literacy. These individuals do not generally seek professional training as applied statisticians or psychometricians; rather, they seek understanding of how to make sense of and work with numerical data in the decisions they face; how to draw reasonable and warranted inferences from such data; and, how to assess current arguments for evidence-based and data-driven decision making."


For more information, click here. Applications are due May 1 for July 2011 enrollment.

Data Den - Conference Attendees by State
Welcome to the Data Den. Sit back, relax, and enjoy as we deliver regular servings of data deliciousness. We're hoping to increase transparency, demonstrate ways of displaying data, encourage action based on the data set, and improve access to information for our members and decision-makers.

This month, as we examine options for future conference locations, we hoped to inform the decision-making process by looking at how location has impacted attendance. For instance, we know that holding the conference in a particular state increases regional attendance. The question we were less sure about was whether an increase in attendance from a particular state one year would hold in subsequent years when the conference moved locations. Because we were in lovely San Antonio in 2010, with strong attendance from our Texas colleagues, should we anticipate greater attendance from Texas delegates in 2011 when we move on to Anaheim in California?

The following heatmap of AEA conference attendees by state suggests that location-driven increases in any given year are not maintained in subsequent years. There is little after-effect on attendance from year to year.

State Heatmap 
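For readers curious how a check like this might be run on their own registration data, here is a minimal sketch in Python using pandas. All figures, state picks, and the `bump_persists` helper are invented for illustration; they are not AEA's actual attendance numbers or analysis code.

```python
# Hypothetical sketch of the Data Den question: does a host-state
# attendance bump persist after the conference moves elsewhere?
import pandas as pd

# Invented attendee counts by state and conference year.
attendance = pd.DataFrame(
    {"state": ["TX", "TX", "TX", "CA", "CA", "CA"],
     "year": [2009, 2010, 2011, 2009, 2010, 2011],
     "attendees": [40, 180, 55, 120, 110, 300]}
)

# Pivot into the state-by-year grid that a heatmap would display.
grid = attendance.pivot(index="state", columns="year", values="attendees")

def bump_persists(grid, state, host_year, threshold=1.5):
    """Return True if the host-year spike carries into the following year.

    The year after hosting is compared against the pre-hosting baseline;
    'persists' means it stays above `threshold` times that baseline.
    """
    baseline = grid.loc[state, host_year - 1]
    after = grid.loc[state, host_year + 1]
    return bool(after > threshold * baseline)

# Texas hosted in 2010 in this toy data: 40 -> 180 -> 55 attendees.
print(bump_persists(grid, "TX", 2010))
```

With these invented numbers the Texas spike collapses the following year, matching the pattern the heatmap suggests; on real data the threshold and baseline choice would of course need more care.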


New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? The following positions and Requests for Proposals (RFPs) have been added recently: 
  • Strategic Measurement Consultant at KnowledgeAdvisors (Chicago, IL, USA) 
  • Senior Program Data Analyst at First Things First (Phoenix, AZ, USA) 
  • Sr Health Policy Researcher at Advocates for Human Potential Inc. (Sudbury, MA, USA)  
  • Monitoring, Evaluation and Research Advisor at IntraHealth International (Washington, DC, USA)
  • Consultancy for a Meta Analysis of Education Evaluations (2007-2010) at UNICEF (New York, NY, USA) 
  • Senior Scientist at Advocates for Human Potential Inc. (Albany, NY, USA) 
  • Director of Research and Assessment at Connections Academy (Baltimore, MD, USA)  
  • Measuring, Learning and Evaluation Officer at Bill & Melinda Gates Foundation (Seattle, WA, USA)
  • Project Evaluator at The Carter Center (Kinshasa, DEMOCRATIC REPUBLIC OF THE CONGO)
  • Senior Researcher at Public Policy Associates Inc. (Lansing, MI, USA)

Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. According to Google Analytics, the Career Center received approximately 6,300 unique page views in the past month. It is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. Job hunting? You can also sign up to receive notifications of new position postings via email or RSS feed.


Get Involved
About Us
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The American Evaluation Association's mission is to:
  • Improve evaluation practices and methods
  • Increase evaluation use
  • Promote evaluation as a profession and
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275