In this issue...
  • Global Twinning
  • AJE's Top 10
  • Guttentag Winner
  • Mixed Methods
  • Systems Thinking
  • Summer Institute
  • SEA Conference
  • Get Involved!

  • AEA Newsletter
    January 2008

    AEA Colleagues,

    Greetings and Happy New Year!

    At the top of the list of AEA resolutions for 2008: improve communications. Towards that end, please join me in welcoming Gwen Newman as our new Newsletter Editor. She will see that this missive appears in your email inbox each month. Do you have a great idea for an article? Share it with Gwen at [email protected].

    You may also have had the opportunity to meet Damon Thompson at the annual conference in Baltimore. Damon began work this fall as our Communications Director, a new position for AEA and one identified as a top staffing priority by the Board of Directors. Damon is helping us to think strategically about the content, quality, frequency, and means of communication used across AEA. As the year moves forward, you will see changes throughout the association, and we will be using the newsletter as one way to keep you up to date on happenings within AEA as well as in the broader evaluation community.

    Whatever your professional and personal goals in the year ahead, all of us at AEA wish you the very best! And, if there is a way that we may help you to reach those goals, please do not hesitate to connect!


    Susan Kistler,
    AEA Executive Director

    Global Twinning
    AEA partners with international network

    As part of its international efforts, AEA in 2007 undertook a collaboratively developed twinning project with the International Program Evaluation Network (IPEN). IPEN asked AEA to send a member and evaluation expert to serve as its official representative to IPEN's 7th annual conference, held September 26-28 in Moscow, to lead a workshop and to provide a lecture related to the conference theme of "Reforms and Evaluation of Programs and Policies."

    Under its chair, Donna Podems, the International Committee selected former AEA President Ross Conner to represent the association. Conner was part of the conference's opening session, during which he extended AEA's greetings and explained the IPEN-AEA "twinning" project. He also conducted a workshop on community-based evaluation and gave an expert lecture on evaluation and politics.

    "In general, I believe the goals of the twinning project were accomplished," Conner said. "The IPEN attendees learned more about some evaluation issues from an American perspective and also about AEA, and I learned more about IPEN, its leadership, and current issues of special interest in the Russia-NIS (Newly Independent States) region. At AEA's Baltimore meeting, I shared my experiences and learnings with AEA's leadership and members."

    More than 100 people attended the IPEN conference, with about three-fourths of them from Russia. Among the other four US attendees was Thomas Grayson, the incoming chair of AEA's International Committee, who attended on his own. Grayson made a presentation on the use of logic models for performance management program planning and evaluation.

    IPEN's Website

    AJE's Top 10
    Most-cited articles of the last five years

    The American Journal of Evaluation (AJE) remains a publication of choice for top authors, both for submitting their work and for finding relevant content to cite. According to the Thomson Web of Science® Database, these are the 10 most-cited AJE articles since 2002:

    • Questions about behavior: Cognition, communication, and questionnaire construction, by N. Schwarz and D. Oyserman; Spring-Summer 2001
    • Fidelity criteria: Development, measurement and validation, by C.T. Mowbray, M.C. Holter, G.B. Teague, et al.; Fall 2003
    • A three-step approach to teaching logic models, by R. Renger and A. Titcomb; Winter 2002
    • Integrating a comparison group design into a theory of change evaluation: The case of the urban health initiative, by B.C. Weitzman, D. Silver and K.N. Dillman; Winter 2002
    • Evaluation and organizational learning: Past, present and future, by R.T. Torres and H. Preskill; Fall 2001
    • Evaluation, knowledge management, best practices, and high quality lessons learned, by M.Q. Patton; Fall 2001
    • The metaevaluation imperative, by D.L. Stufflebeam; Spring-Summer 2001
    • Beyond use: Understanding evaluation's influence on attitudes and actions, by G.T. Henry and M.M. Mark; Fall 2003
    • The potential of social capital measures in the evaluation of comprehensive community-based health initiatives, by D.M. Petersen; Winter 2002
    • Is sustainability possible? A review and commentary on empirical studies of program sustainability, by M.A. Scheirer; September 2005

    Kudos to these wonderful authors, and to all who share their knowledge and expertise via the journal. A subscription to AJE is one of the many benefits of AEA membership. AEA members also get free online access to AJE back content - including these articles and others - by signing on to the "members only" section of the AEA website.

    In addition to AJE's more than 6,100 subscribers, 956 libraries also have access to the journal - an increase of more than 33 percent over last year. Why not submit your best work? More information about submitting to AJE is available online.

    To access AJE online, sign in to the AEA website using your AEA username and password.


    Guttentag Winner
    Liliana Rodriguez-Campos honored as young professional

    Liliana Rodriguez-Campos has been named the recipient of AEA's 2007 Marcia Guttentag Award, which recognizes a talented young professional in the field of evaluation. The award is presented to a promising new evaluator, within the first five years after completion of his or her master's or doctoral degree, whose work is consistent with AEA's Guiding Principles for Evaluators. This year it goes to a 2002 graduate of Western Michigan University (WMU) who is already the published author of a critically acclaimed book, a promising teacher and mentor, and a professional very active in her field.

    Rodriguez-Campos graduated with honors from WMU's Evaluation, Measurement and Research Design program and in 2005 wrote Collaborative Evaluations: A Step-by-Step Model for the Evaluator, which reviewers note provides a clear, easy-to-follow road map filled with practical tips and examples from real-life experience. Rodriguez-Campos dedicates one full chapter to the use of specific guidelines when conducting evaluations and another full section to the use of AEA's Guiding Principles for Evaluators.

    Rodriguez-Campos has presented in more than 15 countries, served as a keynote speaker and is unremitting in her emphasis on the use of established AEA principles in any type of evaluation. She is now a faculty member in the Department of Educational Measurement and Research at the University of South Florida.

    "It is an honor to be recognized by as prestigious an organization as the American Evaluation Association for my accomplishments in the field," says Rodriguez-Campos. "I come away from the experience with a heightened sense of purpose and dedication to my colleagues and my students."

    WMU's Michael S. Nokes nominated Rodriguez-Campos for the award, explaining that her passion for sharing knowledge and experience is apparent both in her writing and in her teaching.

    Mixed Methods
    AEA award winner explores integrated methodology

    Author Jennifer C. Greene offers insights into the theory and design of mixed methods studies, as well as practical guidance and detailed examples, in Mixed Methods in Social Inquiry, published in October 2007 by John Wiley & Sons, Inc./Jossey-Bass.

    Greene is a professor in quantitative and evaluative research methodologies at the University of Illinois at Urbana-Champaign and recipient of the 2003 AEA Paul F. Lazarsfeld Award for contributions to evaluation theory. She has offered training in mixed-methods evaluation in a number of venues, including the AEA/CDC Summer Institute, and her work focuses on the intersection of social science and policy. Greene notes that she "seeks to advance the theory and practice of alternative forms of evaluation, including qualitative, democratic, and mixed-method approaches."

    From the book cover:

    "This is an excellent addition to the literature of integrated methodology. The author has skillfully integrated diverse ways of thinking about mixed methods into a comprehensive and meaningful framework. By providing detailed examples, she makes it easy for both the students and the practitioners to understand the intricate details and complexities of doing mixed methods research. On the other hand, by comparing, contrasting, and bridging multiple perspectives about mixed methods, she has made this book very relevant and useful to seasoned scholars of mixed methodology." - Abbas Tashakkori, Florida International University & founding coeditor, Journal of Mixed Methods Research

    "Jennifer Greene's book is an exquisite and indispensable map for those who are ready for the challenge of genuinely mixing methods." - Michael Quinn Patton, author, Utilization-Focused Evaluation.

    Wiley/Jossey-Bass offers AEA members a special savings on its publications when ordered directly from the publisher. To receive your 20% discount, please use the Promotional Code "AEAF8" online or by phone (1-800-225-5945).

    Publisher's Website

    Systems Thinking
    Systems in Evaluation TIG compiling case studies

    The Systems in Evaluation TIG formed as the result of a series of sessions on systems thinking in evaluation offered during the 2002 annual meeting in Washington, DC. The TIG was created to provide a forum for ongoing conversations about systems thinking and its use in evaluation. It has grown rapidly to a current membership of more than 300, and with it has grown interest in applying ideas from the systems field to evaluation. Many speakers at the past two AEA conferences have drawn, directly and indirectly, on such ideas, and the recent AEA monograph Systems Concepts in Evaluation has been widely distributed and discussed.

    The TIG's focus has been to look at systems thinking as a framework for evaluation design, implementation, and data analysis. Examples include:

    • Methodologies for evaluation that utilize systems theory to inform the design and execution of the actual evaluation itself.
    • Discussion about how to ground evaluation methodology in systems thinking and systems theory.
    • Discussion around the intersection of program design, evaluation methodology, and data analysis within the framework of systems thinking.

    Over the next year or so, these themes will expand into the collection and dissemination of case studies that highlight the ways in which systems thinking and systems methodologies can be used in evaluation. The TIG also tends the Systems and Evaluation discussion group, Evalsys, the main means by which TIG members and others exchange ideas and experiences, and has responded to member issues through surveys and meetings that combine team building, show and tell, and group decision making. In the year ahead, the group plans to work more closely with other TIGs to promote cross-disciplinary ideas and activities and to identify additional resources. For more information, contact Bob Williams at [email protected].

    This TIG profile is part of an ongoing effort to spotlight the goals and activities of AEA's more than 40 topical interest groups. AEA members may belong to up to five TIGs as part of their membership benefits and may change their TIG selections at any time by signing on to the AEA website and updating their personal profile.

    AEA's TIG Directory

    Summer Institute
    Save the Dates!

    We are finalizing the lineup for the 2008 AEA/CDC Summer Evaluation Institute. Key coordinates:

    • Dates: Monday, June 23 through Wednesday, June 25
    • Location: Sheraton Atlanta Hotel in Atlanta, Georgia
    • What: More than 50 evaluation-focused training sessions

    Last year's institute brought together more than 500 attendees for quality training and the opportunity to build their professional networks. This year, we'll have tried-and-true offerings such as instrument development and quantitative analysis, more advanced topics including applications of evaluation theory, and keynote addresses to motivate and inspire. Registration - and the full program - will go online in March. We'll keep you updated via this newsletter and will send a free-standing announcement when registration opens.

    SEA Conference
    Annual Meeting in Tallahassee with Stufflebeam

    The Southeast Evaluation Association (SEA), an affiliate of AEA serving the southeastern United States, will hold its 20th annual conference February 28-29 in Tallahassee, Florida. The conference theme, Evaluation and Accountability: A Formula for Success, will cover evaluation across a broad array of program and policy areas at the national, state, and local levels.

    This year's keynote speaker, Daniel Stufflebeam, is known for his work on the evaluation checklists project; his authorship of numerous standardized achievement tests as well as articles and texts, including Evaluation Theory, Models, & Applications (with Anthony Shinkfield, Jossey-Bass, 2007); his leadership of the Evaluation Center at Western Michigan University; and his leadership in developing the Joint Committee standards for evaluations. Stufflebeam originated one of the first models for systematic evaluation, the CIPP (Context, Input, Process, and Product) model, and has been recognized for his outstanding work with awards from the American Evaluation Association, Western Michigan University, and the Center for Research on Educational Accountability and Teacher Education. He will offer two pre-conference workshops on February 27, focusing on evaluation approaches and models and on standards-based metaevaluation, and will deliver the keynote address on February 28, titled The CIPP Model for Evaluation: A Model That Supports Improvement and Accountability.

    The two-day conference will include multiple presentations by colleagues in a range of disciplines. SEA members and nonmembers alike are invited to Tallahassee for this event. Registration is open and more information is available online.

    SEA Website

    Get Involved!
    Get the most out of your membership

    The start of the new year brings more ways than ever to get involved with AEA. Here are just a few. We'll be sending more details about the Calls for Board and Award Nominations, but wanted you to have a list of the many things to do right now to participate in the life of the association.

    Please click through to the appropriate item below.

    We will have more to share over the coming months about how to participate fully in the life of the association.

    AEA Site Links
  • AEA Home
  • TIGs
  • Affiliates
  • Board
  • About Us

    The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.

    The American Evaluation Association's mission is to:

    • Improve evaluation practices and methods
    • Increase evaluation use
    • Promote evaluation as a profession, and
    • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.

    phone: 1-508-748-3326 or 1-888-232-2275