Newsletter: June 2010 - Vol 10, Issue 6


AEA Ballot & Bylaws Revisions - Vote by July 21

Dear Colleagues,

 


You should have just received the ballot for this year's election. I want to thank the members of the Board's Leadership Priority Area Team and the Recruitment Task Force for their excellent work putting together a slate of candidates, which includes the 2012 President (also known as "President-elect"), a Treasurer, and three Board Members-at-Large. With AEA's many talented, energetic, and service-minded members, it must be challenging to select a subset of nominees for inclusion on the ballot. I greatly appreciate their work.

 

This year, the ballot has a second part - a referendum on a revision of the association bylaws. The bylaws are the legal document that sets forth the governance structure and the conditions of membership, including the Board's obligations to communicate with members. The revisions you will vote on are the result of a long process of Board deliberations that began when the Board started to develop governance policies. In some ways, the policies describe how the statements in the bylaws are put into practice. For example, the revised bylaws include a new statement of the Board's responsibility for establishing a system of ongoing monitoring and evaluation, while the governance policies provide the detail. Similarly, instead of identifying specific standing committees, the revised bylaws require that policies explicitly address diversity, ethics, finances, and nominations and elections, and that they make member engagement in those areas a priority; the standing committees themselves, now organized into Priority Area Teams, are defined in the policies.

I support the proposed revisions on two grounds. First, the revision process has been thorough and has included member input. I have witnessed the Board's thoughtful review of each item and the discussions, debates, and votes that have accompanied each change. The Board unanimously agreed on the importance of seeking member input, so an earlier version of the revised bylaws was posted on the website, and announcements of the opportunity to comment were sent in a targeted email and in the newsletter. When the comments came in, each was considered and additional revisions were made. To ensure that our changes did not have unintended legal implications, the revised bylaws were also reviewed by an attorney. In short, the revised bylaws are the result of a lengthy and careful process with multiple layers of review.

I believe the process has resulted in a clearer, firmer statement of what members can expect from the association. While many of the revisions simply bring the bylaws in line with current practice, some are more substantial, with the removal of the committees probably the most significant. If this change were not accompanied by language stating our obligation to the association's core values and functions, and to member engagement in carrying them out, I would be concerned. But as it is, I believe the end result is that we are holding ourselves to higher standards. Our commitment to the values is a legal one, spelled out in the bylaws; how we express that commitment is spelled out in the policies. Because the Board is obligated to critically review all policies on an annual cycle and to use evaluative evidence in the process, we will continually be asking ourselves whether we are doing the right things, and enough of them, to make the kind of difference we want to make within our association and for our association in the world. I urge you to carefully review the revisions and the justifications provided for them. In addition, a summary of comments received from members and the Board's responses can be found by clicking the link on the side-by-side document accessible from the ballot. Please let me (ljcooksy@udel.edu) or Debra Rog (DebraRog@westat.org) know if you have any questions.

Now that the bylaws are in your hands, the Board is preparing for the upcoming July Board meeting. Some of the topics on the agenda are a review of current policies, consideration of policy recommendations from the Monitoring & Evaluation Task Force and the Multicultural Task Force, planning for member input and engagement, updates from the Association Management Company, and others. In addition, we plan to celebrate the success of the Summer Institute - over 550 attendees!

While I plan for the Board meeting, I hope you are taking the time to review the qualifications of the candidates and the revisions to the bylaws - and to vote!

Thank you, 

Leslie
 
Leslie Cooksy
2010 AEA President
In This Issue
  • Policy Watch with George Grob
  • Meet Robert Shumer
  • Client Communications
  • TechTalk with LaMarcus Bolton
  • Graduate Interns
  • AEA eLibrary
  • Book: Developmental Evaluation
  • Member News
  • New Job Postings
  • Get Involved
  • About Us
Policy Watch - Rounding Out Health Care Reform Evaluation Policy
From George Grob, Consultant to the Evaluation Policy Task Force
 
Last month, I wrote to you about evaluation of the new health reform law. As discussed then, the Evaluation Policy Task Force's most immediate concern was ensuring that evaluation would be used to assess the timely and effective implementation of the complex insurance reforms. Much could go wrong, not all of it turning on fidelity of implementation. There was, and still is, legitimate concern about waste and fraud. The institutional responsibilities of the Inspector General - audit, investigation, and evaluation - provide one set of strong safeguards against such challenges.

I was also initially concerned about the evaluation of the many separate bits and pieces of the public health and systems improvements that are scattered throughout the legislation. Reforms related to health services, health professions, preventive health, health care quality, health information technology, and refinements to Medicare and Medicaid programs also need to be effectively implemented and their impact evaluated. Fortunately, this potential problem was addressed as a result of the apparent acculturation of evaluation within the public health community and the corresponding congressional authorization committees. They did a good job of building evaluation into the very fiber of these programs.

 

A much larger concern was how to use evaluation to help ensure the success of the whole enterprise. Would the legislation result in a healthcare system that is accessible and affordable? Would it provide quality healthcare to those who need it? Would we as a nation be healthier five, ten, fifteen or more years out?

 

Fortunately, the new law does, in fact, provide one tool for addressing these overarching questions. It is section 5605, Key National Indicators, of the enacted health reform legislation, the Patient Protection and Affordable Care Act.

 

This section establishes a congressionally appointed Commission on Key National Indicators to oversee the development of such a system and authorizes the National Academy of Sciences to determine how best to establish it. The Academy will convene a multi-sector, multi-disciplinary process to define major scientific and technical issues associated with developing, maintaining, and evolving the indicator system. The system will be subject to annual reports by both the Commission and the Academy, and to financial audits and programmatic reviews by the Government Accountability Office.

 

Of course, a national indicator system is not an evaluation. However, it will hopefully provide a means to help all Americans and policy makers see where things stand and promote the commissioning of evaluations to follow up on both problems and promising results revealed by the indicators.

 

As most insiders remarked upon passage of health care reform legislation, now the real work begins. Perhaps national indicators will help policy makers and evaluators stay on top of our evolving health care system.

 
Meet Robert Shumer - Researcher, Lecturer & Author 
AEA's 5,700 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via our Question and Answer column. This month's profile spotlights Robert Shumer, a research associate at the University of Minnesota who has been active with his local affiliate and in publishing. He is author of Youth-Led Evaluation: A Guidebook, a title released in 2008 that remains one of the most popular books spotlighted in AEA's newsletter, as measured by click-throughs.
 

Name: Robert Shumer

Affiliation: Research Associate/Lecturer, University of Minnesota. 

Degrees: Master's from CSU Northridge, Ph.D. in Education from UCLA

Joined AEA: 1992

Years in the Evaluation Field: 20 years. 

AEA Leadership Includes: My involvement is at the state affiliate level, serving as president of the Minnesota Affiliate. I am currently serving as past president, trying to provide continuity to actions we started last year.

 

 

Why did you join AEA?

"I joined AEA to get better connected to people who knew and understood the evaluation field. When I moved to Minnesota from UCLA, I got to know people who were well connected to AEA....Dick Kreuger, Michael Patton, Jean King....who helped acquaint me with the literature and practice of the field. I have come to understand in my professional life that evaluation is simply an integral part of learning. We often talk about feedback and reflection in educational circles, and for me that has translated simply into a form of systematic evaluation/reflection, always examining what you are doing and learning to continuously reflect on the meaning of experience. I have also found that the various ideas and perspectives shared through AEA and those engaged in evaluation have caused me to expand and enrich my understanding of both learning and the process of participating as an active citizen in society. I see knowledge about evaluation and the ability to actually perform evaluative tasks as a foundational piece of our democratic system."

 

What's the most memorable or meaningful evaluation that you have been a part of - and why?

"My most memorable work has been an evaluation of the AmeriCorps program in Minnesota. We created a course at the University of Minnesota on "evaluating community programs." It was an introduction to naturalistic methods of evaluation, with AmeriCorp as our focus. More than 30 students, graduate and undergraduate, studied the many programs in Minnesota. We conducted observations, interviews, case studies, and even cost-benefit work to understand the impact of the program on the participants, the community, and on the teaching of evaluation. Students completed a doctoral study, two Master's studies, and reports on the program over five years. Perhaps most importantly, we learned to do empowerment work, and realized how complicated it was in real life applications. We used evaluation as a service, itself."

 

What advice would you give to those new to the field?

"New evaluators should get connected to local evaluation groups and try to select a mentor to guide you on your development of knowledge and skills. I believe local affiliates of AEA make the field real and create a community of learners. We all continue to learn about evaluation every time we conduct a new study. Surround yourself with people who can help you to grow and enjoy the field."

 
If you know someone who represents The Face of AEA, send recommendations to AEA's Communications Director Gwen Newman at gwen@eval.org.
Communicating with Clients - How Do You?

How do you communicate with your clients? AEA would like to spotlight samples of great client and stakeholder communications. We begin with an e-newsletter produced monthly by Community Science. To see a full issue of The Change Agents, click here.

The Change Agents Newsletter. The Change Agents newsletter provides a forum where Community Science can share more about its work, its approach, and new projects. The most recent issue focused on cross-cultural competency, a new partnership with The Partnership for a Drug-Free America, and a hands-on community project with Habitat for Humanity. It also provided a venue in which to announce staff hires, appointments, and professional opportunities of interest to readers. The newsletter was launched in March 2009.
 

"The newsletter supports our mission of fueling social change through knowledge," says David Chavis, President/CEO of Community Science. "We provide information on resources that promote this mission to a broad audience and show how our work can be done for different size organizations, working in different contexts, addressing different issues. Evaluation can be a useful tool for improving the effectiveness of grassroots organization as well as small and medium size nonprofits, not just government agencies and large foundations, and the newsletter allows us to get that message out in a effective way." 

Community Science. Community Science is a group of social change professionals committed to building healthy, just, and equitable communities by providing advisory services, capacity-building products and services, initiative management and support, and research and evaluation services. Formed in 1997, the group has offices in Maryland, Italy and Portugal.

To share samples of the ways that you interact with your audiences, email AEA's Communications Director Gwen Newman at gwen@eval.org. We'd love to share the ways you communicate via this column as well as AEA's online eLibrary. Thanks!

TechTalk - Fumbling for Info? Find Files Quickly - with Tags
From LaMarcus Bolton, AEA Technology Director
 

If you are an aea365 subscriber, you may remember a post I submitted about ways to more effectively manage your computer files. One of the suggestions I introduced was the idea of file tagging. If the idea of "tagging" still seems a little nebulous, read on as I attempt to explain in more concrete terms what tagging is, and how AEA is leveraging this technology.

 

Let me open with an example. Imagine you are working on a manuscript or report, and in an effort to organize your files more efficiently, you have grouped all the articles from your literature review in a project-specific folder. Now, what if you work on another project that uses much of the same literature? Do you make duplicate copies of the same files in a new project folder? Do that often enough, and you will soon be in the market for more hard drive space!

 

In a traditional hierarchical filing system, each file is placed in one central location. Tagging, by contrast, is a non-hierarchical system that uses keywords to identify computer files or Internet information more quickly - wherever they may reside - so that items can be described more precisely and found again more easily. If you have ever used an email service such as Gmail™, you have likely already had experience with tagging through Gmail's ability to categorize and sort emails by label (i.e., tag).
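
To make the idea concrete, here is a minimal sketch in Python of how a tag index works - my own illustration, not part of the original aea365 post, and the file names and tags are hypothetical examples:

    from collections import defaultdict

    # A minimal tag index: one file can carry many tags, and one tag can
    # point to many files, so nothing is duplicated across project folders.
    # (File names and tag names here are hypothetical examples.)
    tag_index = defaultdict(set)

    def tag_file(path, *tags):
        """Associate a file with one or more keyword tags."""
        for tag in tags:
            tag_index[tag.lower()].add(path)

    def find_files(tag):
        """Return every file carrying the given tag, wherever it resides."""
        return sorted(tag_index.get(tag.lower(), set()))

    # The same article serves two projects without a duplicate copy.
    tag_file("articles/patton_2010.pdf", "lit-review", "project-a", "project-b")
    tag_file("articles/smith_2008.pdf", "lit-review", "project-a")

    print(find_files("project-b"))   # ['articles/patton_2010.pdf']
    print(find_files("lit-review"))  # both articles, one copy each on disk

Because the index maps keywords to file paths rather than forcing each file into a single folder, the same literature-review article is reachable from both projects, and one tag lookup finds it wherever it lives on disk.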

 

AEA has leveraged tagging technology in multiple ways, but most prominently within aea365 and our eLibrary. In particular, did you know that you can now easily find aea365 and eLibrary posts from your favorite Topical Interest Groups (TIGs) through our tagging system? To find posts by TIGs within aea365, simply visit our blog and scroll rightward to the pane entitled, "Posts Tagged by AEA Topical Interest Groups." From here, you can view posts of topical interest. Likewise, to find eLibrary documents of interest to you, simply visit our eLibrary, click the "Search Library" button within the left pane, and scroll downward to "Search Tags." From here, you can not only select tags based upon TIGs, but by other descriptors as well. And, you can even create your own tags through the "Add Tags" link housed within each eLibrary post!

 

So, go forth and have fun exploring tags of interest! And, as always, if you run into any issues or have any questions, please do not hesitate to contact me at marcus@eval.org.

AEA's Graduate Education Diversity Internship Program Graduates Nine

AEA's Graduate Education Diversity Internship (GEDI) Program saw its sixth cohort of interns graduate during June's AEA/CDC Summer Institute in Atlanta. To get to this point, they have completed a number of requirements, including a nine-month placement at an evaluation agency in their geographical area, an evaluation report as a result of this placement, attendance at two 3-4 day seminars plus the AEA conference in the fall and the Summer Institute in June, and numerous assignments throughout the year designed to maximize their learning experience.

"Rita and I are overjoyed with the progress the interns made over the course of the year," says Michelle Jay, program co-chair (with Rita O'Sullivan). "I think the structure of the program has provided them with wonderful learning opportunities to deepen their understanding of the evaluation field, of the critical importance of culturally responsive evaluation, and, most importantly, of how much the field has to gain from their skills, knowledge, and experience."

The 2009-2010 GEDI Program graduates are:


Karen Anderson, Clark Atlanta University

Karen worked on two projects at ICF Macro, but the bulk of her work was with the Garrett Lee Smith Suicide Prevention Project. She interviewed key stakeholders from colleges across the country and in the territory of Guam, analyzed data, and assisted with writing summative reports.

 

Lisa Aponte-Soto, University of Illinois at Chicago

Working with the TCC Group, Lisa developed a culturally competent logic model for an existing 18-week college and career awareness program for ninth-grade students in a racially diverse community. She also provided strategies for constructing culturally responsive logic models.

 

Johnavae Campbell, University of North Carolina at Chapel Hill

Johnavae worked with Evaluation, Assessment and Policy Connections (EvAP), which partnered with NESCent at Duke University to evaluate the effectiveness of their science podcasts as a new teaching tool to enhance student learning in evolutionary science.

 

Frances Carter, University of Maryland, Baltimore County

Frances helped develop and pilot an evaluation plan for the Annie E. Casey Foundation's (AECF) Race Matters Toolkit.

 

Soria Colomer, University of Georgia

Soria worked with ICF Macro and provided technical assistance for a rapid evaluation of Coordinated School Health in three school districts across the nation.

 

Larry Daffin, New York University

Larry worked as part of a team tasked with creating a virtual version of the systems evaluation protocol developed by the Cornell Office for Research on Evaluation (CORE). 

 

Deborah Ling Grant, University of California, Los Angeles

Deborah worked with the California Comparative Effectiveness and Outcomes Improvement (CEOI) Center to evaluate diabetes care programs for Latinos living in Southern California.

 

Jessica Johnson, Virginia Commonwealth University

Jessica worked with a team at Westat on evaluation projects related to a homelessness systems change initiative and homeless medical respite care programs.

 

Neva Pemberton, University of California, Los Angeles

Working with the HIV Vaccine Trials Network, Neva conducted formative research on the role of mentoring in advancing successful HIV/AIDS prevention research careers for African-American and Latino/a scientists. 

 

Please join us in congratulating this year's graduating class! 

Go to AEA's GEDI Program page to learn more

Great Content in the AEA eLibrary
Your AEA eLibrarians are working hard to promote this tremendous resource, which can be accessed directly from the AEA home page. We encourage all of you to upload materials, in particular "hands-on" items such as course syllabi, contract templates, and sample instruments.
 
If your proposal for the 2010 conference is accepted, please consider uploading your presentation slides and related materials to the eLibrary. This is an environmentally friendly way to share your resources and expertise with a large audience of evaluators while contributing to an important archive for our field.
 
To illustrate the diverse set of resources found in our eLibrary, take a look at some of its most viewed items:
  • An Empirical Review of Theory-Driven Evaluation Practice From 1990-2008; Chris Coryn. This is a slide set from the 2009 conference. 
  • Coffee Break Demonstration - Amy Germuth on the Fantastic Five Checklist to Write Better Survey Questions; Amy A. Germuth. These are the materials from a recent coffee break webinar.
  • Evaluation Contract Template; Melanie Hwalek. This is a sample evaluation contract.
  • Graphic Recording Examples; Jara Dean-Coffey. These are examples of graphic charts developed for a variety of group facilitation processes.
  • Independent Consulting TIG Client Feedback Form (CFF); Kathleen A. Dowell. This form aims to assist evaluators in gathering systematic feedback from clients on performance and satisfaction.
  • Improving Evaluation Questions and Answers: Getting Actionable Answers for Real-World Decision Makers; E. Jane Davidson. This is a slide set from the 2009 conference.
  • Spatial Regression Discontinuity: Estimating Effects of Geographically Implemented Programs and Policies; Christopher Moore. This is a slide set from the 2009 conference.
There are many other useful resources in our eLibrary. Consider finding them and uploading yours today! For questions or help, please contact any of the eLibrarians: Michael Fagen mfagen1@uic.edu, Kimberly Hall kimberly.m.hall@gmail.com, Oksana Jensen oksana.jensen@yahoo.com, or Ayaba Logan ayaba@umich.edu.
 
Developmental Evaluation: Applying Complexity Concepts

AEA member Michael Quinn Patton is author of a new book published by Guilford Press entitled Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. The book explains and illustrates how to use systems thinking and complexity concepts to facilitate systematic data-based reflection and decision-making by creating a dynamic evaluation partnership between social innovators and the developmental evaluator.

From the Publisher's Site:

"Developmental evaluation (DE) offers a powerful approach to monitoring and supporting social innovations by working in partnership with program decision makers. In this book, eminent authority Michael Quinn Patton shows how to conduct evaluations within a DE framework. Patton draws on insights about complex dynamic systems, uncertainty, nonlinearity, and emergence. He illustrates how DE can be used for a range of purposes: ongoing program development, adapting effective principles of practice to local contexts, generating innovations and taking them to scale, and facilitating rapid response in crisis situations. Students and practicing evaluators will appreciate the book's extensive case examples and stories, cartoons, clear writing style, "closer look" sidebars, and summary tables. Provided is essential guidance for making evaluations useful, practical, and credible in support of social change."

 

From the Author:

"The advantage of a book-length treatment of complexity concepts," says Patton, "is that I had the space to tell in-depth stories about people engaged in innovation and how developmental evaluation has contributed to their efforts at bringing about major change on complex issues in dynamic environments. Each chapter features a developmental evaluation story and case example. The principles and methods of developmental evaluation are grounded in these real-world case examples. They are the basis for the practical advice offered."   

 

About the Author:

Michael Quinn Patton is an independent organizational development and program evaluation consultant. A former president of AEA, Michael teaches regularly in the association's professional development workshops, The Evaluators' Institute, and The World Bank's International Program in Development Evaluation Training. He is a recipient of AEA's Lazarsfeld Award for Lifelong Contributions to Evaluation and the Myrdal Award for Outstanding Contributions to Useful and Practical Evaluation Practice from the Evaluation Research Society. He first wrote about developmental evaluation in a special issue of the American Journal of Evaluation in 1994.

All AEA members receive 20% off the retail price of all books and journals ordered directly from Guilford Press as part of AEA's Publishing Partners program. To receive your 20% discount, use the promotional code "AEA" online or call 1-800-365-7006.

Go to the Publisher's Site

Member News
AEA members practice in an amazing array of contexts. We're hoping to highlight how your work is making a difference in a rich variety of places and spaces.
 
AEA member Abraham Wandersman of the University of South Carolina received the 2010 Distinguished Evaluator for Advancing Reflection and Accountability in Evaluation Award from the Health Foundation of Greater Cincinnati. Congratulations Abe!
 
AEA member Eric Abdullateef, with Directed Study Services in Washington, DC, has been appointed to the 2010 Board of Examiners for the Malcolm Baldrige National Quality Award. The award is the highest level of national recognition for performance excellence that a U.S. organization can receive. Congrats Eric!
 
If you have a recently published article or other experience to share highlighting how evaluation is making a difference across the disciplines, contact Gwen Newman, Communications Director, for possible inclusion in a future newsletter. You can reach her at gwen@eval.org.
New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? The following positions and Requests for Proposals (RFPs) have been added recently:

  • Director, Office of Evaluation D2 at United Nations World Food Programme (Rome, Italy) 
  • Research and Evaluation Analyst at Children's Services Council, Palm Beach County (Boynton Beach, FL, USA) 
  • Evaluation Research Service Consultant at Pearson (Los Angeles, CA, USA)  
  • Biostatistician/Evaluation Research Analyst at NOVA Research Company (Bethesda, MD, USA)
  • Monitoring and Evaluation Consultant at National Democratic Institute (Ouagadougou, Burkina Faso) 
  • Monitoring and Evaluation Expert for USAID Project at Hartlands International Ltd (Washington, DC, USA) 
  • Research Assistant Professor at University of Alabama at Birmingham (Birmingham, AL, USA)
  • Senior Program Evaluator for International Programs at University of Florida (FL, USA)
  • Director of Collaborative Learning Center at Savannah College of Art and Design (Savannah, GA, USA)
  • Director of Assessment for Arts and Sciences/Lecturer in Educational Assessment at Tufts University (Medford, MA, USA)
Descriptions for each of these positions, and many others, are available in the AEA Online Career Center. According to Google Analytics, the Career Center received over 4,300 unique visitors in the past month. It is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee.
 
Job hunting? You can also sign up to receive notifications of new position postings via email or RSS feed.
 
Go to the AEA Online Career Center
Get Involved
Get the most from your membership by taking advantage of the many things that you can do right now to participate in the life of the association, share your input, and promote your business.  
About Us
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
 
The American Evaluation Association's mission is to:
  • Improve evaluation practices and methods
  • Increase evaluation use
  • Promote evaluation as a profession, and
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275