Newsletter: November 2011 - Vol 11, Issue 11




Greetings friends and colleagues,

 

On the fourth Thursday of November, many of us celebrated the American holiday of Thanksgiving. This holiday originated with the Mayflower Pilgrims from England, who hosted a feast of gratitude in 1621 for their first successful harvest in the new land, thanks in large part to the teaching and guidance of Native Americans living in what is now Massachusetts. Americans today continue to celebrate Thanksgiving primarily with a feast, the centerpiece of which is typically an oven-roasted turkey stuffed with autumn vegetables and breads. Beyond the feast, Thanksgiving is usually a time to spend with family and close friends, a time to rest from the labors of autumn and renew one's energies for the coming winter, and a time indeed to be grateful. It is also a 'time out' from the busyness of daily life, a time for quiet reflection, and a time to contemplate pathways taken and pathways that lie ahead.

 

In somewhat parallel fashion, evaluation can also offer a 'time out' from the busyness and routine demands of daily life, notably for evaluation stakeholders and especially for program developers, administrators, and staff. Our educative traditions in particular are oriented toward goals of learning, enlightenment, reflection, and redirection. These traditions, which are anchored in the evaluation ideals of Lee Cronbach and Carol Weiss, aspire to provide a data-based window into how well the logic of a program translates to particular experiences in particular contexts, into promising practices evident in some contexts even if they are not part of the program design, and into who is being well served by the program and who remains overlooked. Our educative practices position evaluation as a lens for critical reflection on the quality of a program's design and implementation, for reconsideration of the urgency of the needs the program is intended to address, for contemplation of alternative pathways that could be taken, and thus broadly as a vehicle by which society learns about itself (from Cronbach's 95 theses).

 

I am grateful for these educative traditions in our field, as I value the 'time out' for data-informed review and reflection and as I also believe that education remains the most powerful of all social change alternatives.

 

It was wonderful to see so many AEA members at our annual conference in Anaheim. And for those who were not able to attend, please do check the AEA eLibrary where many presenters have uploaded their presentations.

 

All my best,

Jennifer  

Jennifer Greene

AEA President 2011  


In This Issue
Policy Watch with George Grob
Outstanding Evaluation Awards
AJE Special Collection
Meet Kathryn Newcomer
Meet Allison Titcomb
Book: The Evaluation Society
Book: The Basics of Project Evaluation and Lessons Learned
Save the Dates
EPTF Evaluation
Data Den Eval 11 Registrants
New Job Postings
Get Involved
About Us
Quick Links
Policy Watch - It Was a Very Good Year for Evaluation Policy
From George Grob, Consultant to the Evaluation Policy Task Force 
In his prose poem Desiderata, Max Ehrmann advises, "Enjoy your achievements as well as your plans." We got to do that at AEA's annual international conference earlier this month during a session updating members about the Evaluation Policy Task Force (EPTF). We happily discovered that the past year was a very good one for evaluation policy. The standout achievement was the publication of two national evaluation policies in international affairs.

 

The first was that of USAID, issued in January. It calls for:

  • Integration of evaluation and program planning in international development programs
  • Requirements for evaluations of major programs and untested interventions
  • Acknowledgement of the need for both quantitative and qualitative methods
  • A 3% set-aside of major program funds for evaluation

Equally impressive is the State Department's evaluation policy, published in May, covering both development and diplomacy programs. Among the highlights are:

  • An evaluation framework covering all programs, projects, and activities in bureaus and missions
  • Evaluation requirements for major program areas
  • Requirements for evaluation plans
  • Emphasis on evaluator independence

Both policies follow intensive efforts by the EPTF over a three-year period involving funding authorities for international HIV/AIDS programs, reauthorization language in House and Senate bills, and meetings with State Department officials. The director of the USAID policy acknowledged the influence of AEA's Evaluation Roadmap during a presentation unveiling the policy at Georgetown University.

 

Another key piece of legislation was the GPRA Modernization Act, enacted in January. It enhances requirements for evaluation of federal programs in the context of strategic and annual planning.

 

Other developments included:

  • GAO's use of the AEA Evaluation Roadmap as a criterion for oversight of international feeding programs such as the McGovern-Dole Food for Education Program
  • The hiring of additional evaluators at the HHS Inspector General's Office to strengthen oversight of health care reform 

The year was also a very active one for the EPTF in developing comments on emerging evaluation policy issues, including:

  • The Government Accountability Office's Auditing Standards
  • The Health Resources and Services Administration's criteria for evaluating evidence of the effectiveness of maternal, infant, and early childhood home visiting programs
  • Congressional evaluation legislation under consideration by Senator Mark Udall and Senators Thomas Carper and James M. Inhofe
  • The NIH Evaluation Key Committee on Clinical and Translational Science Awards
  • The Editor of the New York Times regarding protection of evaluator independence in federal policymaking on environmental matters and elsewhere

Finally, at the request of the AEA Board of Directors, the EPTF developed policies to enlarge the role of AEA members and the Board in the development and approval of evaluation policies.

 

Whether all these new evaluation policies will take hold is now in the hands of government planners and program officials -- and, of course, of the evaluators who will be called upon to deliver on these new policies.

 

Go to AEA's Evaluation Policy Task Force website page 
AEA Recognizes Two Outstanding Evaluations

The American Evaluation Association honored four individuals and three groups for outstanding work at its annual awards luncheon on Friday, Nov. 4, held in conjunction with its Evaluation 2011 conference in Anaheim, CA. This year's recipients, honored in six categories, have been involved with cutting-edge evaluation and research initiatives that have affected citizens around the world. Both an individual and a team received AEA's Outstanding Evaluation Award; they are profiled in this issue. The five other awards will be covered in our next two issues.

 

2011 Outstanding Evaluation Award

The Changing At-Risk Behavior Team

 

Michael Coplen, U.S. Department of Transportation's Federal Railroad Administration (FRA), and Joyce Ranney, Michael Zuschlag, and Michael Harnar of the Boston-based Volpe National Transportation Systems Center.
Changing At-Risk Behavior (CAB) is a peer-to-peer safety intervention pilot project sponsored by the U.S. Department of Transportation's Federal Railroad Administration that incorporated 1) peer-to-peer observation and feedback, 2) safety leadership development, and 3) continuous process improvement. The CAB program evaluation team is recognized for evaluating this comprehensive new initiative, which resulted in significant day-to-day safety improvements for Union Pacific Railroad (UPRR) and influenced a broader shift in safety culture in the railroad industry. Long recognized as a leader in innovative safety programs, UPRR debuted the new safety initiative in its San Antonio, TX, service unit, which spanned 800 miles and included more than 1,000 locomotive engineers and conductors. The CAB evaluation showed an 85% reduction in at-risk behaviors, a 72% drop in locomotive engineer decertification rates, and a 69% drop in the rate of human factor-caused derailments.


Since CAB, similar projects have been initiated at other railroads, including Amtrak, Toronto Transit, and Burlington Northern Santa Fe. "The data generated by the evaluation, and the strategically valuable findings, convinced me to try this on a larger scale," says Joe Boardman, former FRA Administrator and current President/CEO of Amtrak. "When I came to Amtrak, I initiated the 'Safe-2-Safer' program, which is modeled on the CAB approach. Safe-2-Safer is a $14 million multi-year effort intended to improve safety and safety culture in every department across the entire company."


2011 Outstanding Evaluation Award

David Jenkins, UK-based Independent Consultant

 

Co-director of PLEY (Proactive Learning from Early Years), a collective of artists, teachers and researchers, Jenkins is being honored for A TALE Unfolded, his thought-provoking evaluation of a European teaching program for mid-career professionals working with youths. Notes nominator Bob Stake, a professor at the University of Illinois, "The final report, A TALE Unfolded, is a substantial document arising out of an eclectic evaluation methodology that in context was far from risk free but was conducted in an exemplary fashion that has led to an analysis of outstanding quality and usefulness to the sponsors...It deploys such literary devices as narrative vignettes, irony, metaphor and wit, seeing humour as a legitimate way of addressing ambivalence...Given the setting, this is a report that does not pull any punches, pointing up serious unresolved tensions and ambiguities both in policy and practice that demand attention, but doing so with considerable grace and elan."

   

"It is an honor to lead an association with the caliber of dedicated professionals like our award-winners," says AEA President Jennifer Greene. "Their work demonstrates the substantial value of evaluation to diverse policy and program arenas in our society and around the globe." 

Envisioning Evaluation's Future - Special AJE Collection

What does the future hold for evaluation? 2011 marks the 25th anniversary of the American Evaluation Association, and as part of that celebration, the last issue of the American Journal of Evaluation (AJE) for this year includes a collection of short commentaries on the possible futures of evaluation. In "Looking Ahead: The Future of Evaluation," eight authors consider three questions: What might happen? What could happen? What should happen?

 

Nick Smith and Paul Brandon set the stage for looking ahead, followed by Susan Kistler's examination of the potential role of technology and social networking in evaluation. Susan Labin offers insights into the possibility of integrative methodological approaches, followed by Veronica Thomas's reflections on the coming centrality of cultural issues in evaluation. Melanie Hwalek speculates on the possibilities of evaluation accreditation and certification, while Jim Rugh surveys the expanding international evaluation scene. Louise Yarnall and Nick Smith conclude the collection by considering how theory and practice might interface 25 years from now. The commentaries' styles and perspectives, as well as the topics and contexts they address, provide a fresh mix of reflections that, in several cases, offer bold and provocative predictions about the future of evaluation theory, methods, practice, and the profession. Don't miss this stimulating collection.

 

Smith, N. L., Brandon, P. R., Hwalek, M., Kistler, S. J., Labin, S. N., Rugh, J., Thomas, V., & Yarnall, L. (2011). Looking ahead: The future of evaluation. American Journal of Evaluation, 32(4), 565-599.

Meet Kathryn Newcomer - Incoming Board Member

In our last issue, we promised a quick introduction of our three incoming Board members as well as the 2013 President. We'll spotlight each individually and thank them for their commitment to service.  

   

Kathryn Newcomer, a Professor at George Washington University where she also serves as Director of the Trachtenberg School of Public Policy and Public Administration, has published extensively on program evaluation and performance measurement and has designed and implemented evaluations for a number of organizations, including the American Speech-Language-Hearing Association, the American Society of Clinical Oncology, the Senior Executives Association, the U.S. Department of State, the U.S. General Services Administration, the U.S. Department of Health and Human Services, the U.S. Department of Transportation, the National Association of Schools of Public Affairs and Administration, the Center for Park Management, the American Association for the Advancement of Science, and the Horticultural Research Institute.

 

In her ballot statement, Kathryn spotlighted her professional and service contributions. She has served as president of the Washington Evaluators and as AEA's representative on the Advisory Council on Auditing Standards for the U.S. General Accounting Office. She served as the founding chair of the Center for Accountability and Performance established by the American Society for Public Administration, assisted in the development of the Environmental Evaluators Network (EEN), and has hosted EEN's annual conference for the last few years.

 

"I feel that my career as both an academic and a practitioner working on the evolving role of evaluation and performance measurement in the U.S. federal government provides me with a rather unique perspective," she says. "Over the last thirty years I have served the evaluation profession through my assumption of leadership roles; training MPA, MPP, and PhD students at GWU who have gone into government, nonprofit and evaluation careers; providing non-degree training to evaluators and aspiring evaluators within the U.S. and in other countries; designing and implementing evaluations for a large number of public and nonprofit organizations in the U.S.; and through my publications. I would be honored to serve the American Evaluation Association through service on the Board."   

Face of AEA - Meet Allison Titcomb, Independent Consultant
AEA's more than 7,000 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short Q&A. This month's profile spotlights Allison Titcomb, an independent consultant long active with her local affiliate and the broader affiliate community.

Name, Affiliation: Allison L. Titcomb; ALTA Consulting, LLC; Tucson, Arizona

Degrees: Ph.D., 1996, Educational Psychology (learning, memory & measurement); BA with Honors, 1985, Ecology & Evolutionary Biology. Also, continuing education in Mediation and Cognitive Coaching.

Years in the Evaluation Field: Over 25 (my undergraduate honors thesis was a three-semester-long course evaluation).

Joined AEA: 1998, I think. As the decades lengthen, the details blur a bit. I know my first annual conference was 1999 in Orlando, FL.

 

AEA Leadership Includes:

Since joining AEA, I have served as co-program chair (EPE TIG) and leadership team member (OL-ECB TIG). I've been most active in what has become known as the Local Affiliate Collaborative. That group has been hosting discussions, creating documents, and generally "holding the space" around support for the local affiliates since the early 2000s. The effort was strengthened by a series of retreats funded by the Kellogg Foundation. See www.lacaea.org for more history and resource documents. I have also been a member of the Arizona Evaluation Network (AZENET) since 1997 and served on its board from 2001 to 2011.
 
Why do you belong to AEA?

"When people ask what I do, I say "I'm an evaluator." AEA represents one of the best ways to foster my own professional development and to learn from and share with others."


Why do you choose to work in the field of evaluation?

"Learning and evaluation are inextricably connected for me. I have endless enthusiasm for helping others think about, learn from, and plan their own work through evaluation.


What's the most memorable or meaningful evaluation that you have been a part of - and why?

"There are many, but I have to say my undergraduate honors project because I was asked by a professor to "make it your business to tell me how to teach my class." I took all the faculty workshops on instructional design-related topics and learned from an expert on course evaluation and experienced first-hand how to be flexible as well as systematic (e.g., the course evaluation statistical norms were changed in the middle of my project). It definitely set the stage for my life-long interest in evaluation and learning. I was also told when I was an undergraduate that I'm a "methods geek." So I guess you could call it fate that I ended up in evaluation as a profession. The most recent "most meaningful evaluation" for me has been an opportunity to blend systems and developmental evaluation. More on that at Evaluation 2012 in Minneapolis!"


What advice would you give to those new to the field?

"Keep learning-both "classic" and new ideas.Follow your interests and strengths - you'll find the best success in honestly following your BLISS and not just the latest fad."

 

If you know someone who represents The Face of AEA, send recommendations to AEA's Communications Director, Gwen Newman, at [email protected].

The Evaluation Society

AEA member Peter Dahler-Larsen is author of The Evaluation Society, a new book published by Stanford University Press.

 

From the Publisher's Site:

"Evaluation-whether called by this name, quality assurance, audit, accreditation, or others-is an important social activity. Any public or private organization that "lives in public" must now evaluate its activities, be evaluated by others, or evaluate others. What are the origins of this wave of evaluation? And, what worthwhile results emerge from it?

 

"The Evaluation Society argues that if we want to understand many of the norms, values, and expectations that we, sometimes unknowingly, bring to evaluation, we should explore how evaluation is demanded, formatted, and shaped by the two great principles of social order: "organization" and "society." With this understanding, we can more conscientiously participate in evaluation processes; better position ourselves to understand many of the mysteries, tensions, and paradoxes in evaluation; and most effectively use evaluation. After exploring the sociology and organization of evaluation in this landmark work, author Peter Dahler-Larsen concludes by discussing issues that are critical for the future of evaluation-as a discipline and a societal norm."

 

From the Author:

"What prompted the book is basically my wondering why we have such a vast evaluation wave in modern society. My interest is in understanding evaluation as an organizational and societal phenomenon. Why does evaluation take so many forms? What is the impact of evaluation on human beings and their collective, democratic life? What has been most rewarding is the discussions that the drafts of the book have opened for me - with colleagues and reviewers. I am looking forward to more of such stimulating discussions now that the book is coming out.  And what sets the book apart from most books about evaluation, I think, is its broad macro-sociological and organizational perspectives. A very different literature base. And, I hope, its unconventional standpoint."

 

About the Author:

Peter Dahler-Larsen is Professor of Evaluation in the Department of Political Science and Public Management at the University of Southern Denmark, where he is Director of the Master's Program in Evaluation. He is a past President of the European Evaluation Society.

 

Go to the Publisher's Site

The Basics of Project Evaluation and Lessons Learned

AEA member Willis H. Thomas is author of a new book, The Basics of Project Evaluation and Lessons Learned, published by Productivity Press, a subsidiary of CRC Press Taylor & Francis Group.

 

From the Publisher's Site:

"How do you determine if your project was a success (beyond being within budget and completed on time)? How do you determine the impact of a project? How do you capture valuable knowledge from a current or past project to enhance future programs? The answer to all three questions is through project lessons learned.

 

"Although lessons learned provide invaluable information for determining the success or failure of projects, a systematic method for conducting lessons learned is critical to the ongoing success of your projects, programs, and portfolios. The Basics of Project Evaluation and Lessons Learned details an easy-to-follow approach for conducting lessons learned on any project, in any organization. Whether your job entails running small projects from a home-based business or managing large projects as a part of an international supply chain, this book will be of great benefit. It outlines a well-indexed strategy to capture, categorize, and control lessons based on best practices. The book:

 

* Outlines a practical 10-step process for conducting effective lessons learned

* Includes a wealth of project job aids, including templates, checklists, forms, and a Project Evaluation Resource Kit (PERK) on the accompanying CD

* Is supported by a comprehensive website at http://www.lessonslearned.info 

 

"Based on more than a decade of research supported by renowned experts in the field of evaluation, this practical guide delivers the necessary resources for active engagement. It introduces innovative concepts, improved models, and highlights important considerations to help you gain a multi-dimensional perspective of project evaluation in the context of lessons learned."

 

From the Author:

"It was a learning experience over the past year working with a large publisher. I took the content of my dissertation and adapted it into a useful 161-page guide for project managers. It utilizes the Project Management Body of Knowledge (PMBOK) as the framework. The PMBOK is the de facto standard for project management with millions of copies sold. This new text represents new knowledge as the connection between the PMBOK and project evaluation was not previously developed at this level. I hired a team of six programmers to help me complete the design on web-based applications and databases. While these applications look simple, they are very complex in coding. I have been receiving many inquiries from around the world as far as Australia."

 

Go to the Publisher's Site

Save the Dates!

Be sure to mark your calendars now for 2012.

  • 2012 AEA/CDC Summer Evaluation Institute, Atlanta, GA, June 3-6, 2012
  • 2012 AEA Annual Conference, Minneapolis, MN, October 22-27, 2012
We'll be sharing more about each event in the interim months. Hope to see you there!
Advance Notice of EPTF Evaluation

In 2007, AEA began an initiative to assist in developing an ongoing capability to influence evaluation policy. Since then, the Evaluation Policy Task Force (EPTF) has engaged in a range of activities, including the development of an influential Evaluation Roadmap, which you can read about at http://www.eval.org/EPTF.asp. As part of our regular review of innovative projects, the Board will be evaluating the EPTF, including seeking member input.

 

Look for details in the December newsletter about how to provide your comments. 

Data Den: Evaluation 2011 International Registration
From Susan Kistler, AEA Executive Director

We're back from a wonderful annual conference in Anaheim, have processed all of the on-site registrants, and can now sit back for a moment and take a look at who actually joined us in California.

With 2,719 registrants for Evaluation 2011, our attendance was up just over 7% from 2010. International registration was up even more: 383 delegates came from around the world, a 37% increase over last year, representing 14% of the total.

There were 2,336 registrants from the United States and 133 from Canada, with another 16 countries having five or more delegates present. It was exciting to meet new colleagues from near and far and to see our international outreach efforts coming to fruition.

[Table: Countries other than the United States and Canada with five or more registrants for Evaluation 2011]
New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? The following positions have been added recently: 
  • Research Associate at Westat (Rockville, MD, USA) 
  • Evaluator Position at Tidwell and Associates Inc. (Tampa, FL, USA)
  • Evaluation and Research Intern at Capital Partners for Education (Washington, DC, USA)   
  • Evaluation Methodologist at National Quality Forum (Washington, DC, USA)
  • Research Associate/Evaluator at Office of Educational Innovation and Evaluation (OEIE) (Manhattan, KS, USA)   
  • Evaluation and Research Coordinator at Hatchuel Tabernik and Associates (Berkeley, CA, USA) 
  • Research Associate at American Institutes for Research (Chicago, IL, USA)
  • Academic Technology and Evaluation Consultant at University of Wisconsin-Madison (Madison, WI, USA)
  • Director of Research & Evaluation at Los Angeles Universal Preschool (Los Angeles, CA, USA) 
  • Transparency and Accountability Impact Research at Transparency and Accountability Initiative (Flexible)

Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. According to Google Analytics, the Career Center received more than 3,100 unique visitors in the past month. It is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. Job hunting? You can also sign up to receive notifications of new position postings via email or RSS feed.

 

Get Involved
About Us
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.

 

The American Evaluation Association's mission is to:
  • Improve evaluation practices and methods
  • Increase evaluation use
  • Promote evaluation as a profession and
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275