Newsletter: December 2013

Vol 13, Issue 12


Message from the President


Dear colleagues,

 

In my final column as 2013 president, I'd like to reflect on the many changes AEA has undergone this year and some of the actions we will be taking in the future. 

 

At this time in 2012, we were working furiously to identify an association management company (AMC) to lead us. Thanks to Mel Mark, Stewart Donaldson, Robin Miller, John Gargani, and Ricardo Millett, members of the Selection and Transition Task Force, for all their work gathering information and making tough choices. As a result, in February 2013, the board voted to pursue a contract with SmithBucklin. We then underwent several months of transition as Susan Kistler acquainted SmithBucklin staff with the ways in which Kistler and Associates had managed AEA. We also undertook a search for an executive director within the AMC and, after exhaustive recruitment and interviews with three excellent candidates, we hired Denise Roosendaal to serve as our executive director in August. 

 

Throughout the year, a policy task force has been reviewing and updating our financial policies to articulate our new relationship with SmithBucklin. Thus, we end 2013 with new, skilled, and experienced managers in place. Based on our experience this year, we look forward to learning from SmithBucklin and moving AEA that "step up" to greater learning and improved management by both the board and our executive director.

 

As 2013 began, another new person was added to the payroll: the Evaluation Policy Task Force (EPTF) hired Cheryl Oros as its part-time consultant to lead its work to influence evaluation policy. Cheryl began her work this year connecting with international evaluation groups and federal agencies. In 2012, the AEA board voted to "institutionalize" the EPTF, establishing it as a permanent task force working for AEA and its members to improve the evaluation policies that affect much of our work and the practice of evaluation as a whole. The EPTF developed a plan to extend its work, which had focused on federal evaluation policy, to include evaluation policies at the state, local, and international levels. 

 

Finally, the EPTF began a rotation of members that will continue in the future. We said good-bye and thank you to Eleanor Chelimsky, Patrick Grasso, and Susan Kistler, who did so much to help the EPTF get started, develop "An Evaluation Roadmap for a More Effective Federal Government," and begin influencing federal evaluation policies. This year, the EPTF added Jonathan Breul, Cynthia Clapp-Wincek, and Rakesh Mohan to the task force. 

 

Although most of the year was consumed with hiring and the transition to new management, we took action near the end of the year to pursue two new initiatives: connecting with other evaluation-related disciplines and considering whether AEA should create a National Academy of Evaluators and, if so, what the nature of the academy might be. A task force was created to address each issue, with Nicole Vicinanza chairing the Connecting with Evaluation-Related Disciplines Task Force. Tom Chapel, Laura Leviton, Dominica McBride, Lance Potter, Hallie Preskill, Andy Rowe, Tom Schwandt, and I also serve on this task force. We have begun our work by discussing our goals in connecting and our methods for learning how other disciplines have pursued such connections. I've also received many helpful suggestions from you! The National Academy Task Force is chaired by George Julnes and Kathy Newcomer and will begin its work in January. Thanks to Len Bickman, Ian Davies, Leslie Goodyear, Stafford Hood, Ernie House, and Donna Podems for serving with that group. 

 

Other initiatives this year included increased engagement with the international evaluation community. Susan Kistler appointed an International Working Group, chaired by Hubert Paulmer, which Denise Roosendaal is now using to guide our international connections. Meanwhile, Tessie Casambas continues to represent AEA in international organizations such as the International Organization for Cooperation in Evaluation (IOCE) and EvalPartners. 2015 has been declared the International Year of Evaluation, and AEA will be working in 2014 to join in these efforts.  

 

2014 will begin with strategic planning. Now that we have new management in hand and have completed the transition to them, it is an appropriate time for us to pause and consider our future directions. Of course, we will be using policy-based governance to operate and, in 2014, will be updating all our policies to reflect the transition to a new management company. We also will move forward with attention to evaluating our work as a board.

 

I could go on and on, but I've exceeded my word limit already! These are just a few highlights of 2013. I've very much enjoyed serving as president in this transitional year and look forward to continuing to work with the board in 2014 as past president.   

 

Happy holidays to you all! 

 

Sincerely,

Jody

Jody Fitzpatrick

AEA 2013 President  

In This Issue
Meet Donna Podems
2013 Award Winners
Walking the Talk
Face of AEA
Policy Watch
Diversity
p2i
New Job Postings
Register
Get Involved
About Us
Meet Donna Podems - Incoming Member at Large

Dr. Donna Podems is a researcher, facilitator, and monitoring and evaluation specialist with more than 20 years of experience in more than 20 countries. She holds a doctorate in interdisciplinary studies focused on program evaluation and organizational development, and a master's degree in public administration. Podems is the founder and director of OtherWISE: Research and Evaluation, and a research fellow with Stellenbosch University. 

  

She has worked with governments, civil society, nongovernmental groups, international donors, and foundations. Podems has conducted implementation, outcome, and impact evaluations and developed M&E frameworks with and for projects in gender, women's empowerment, HIV/AIDS, TB, health systems, youth interventions, education, capacity building, human rights, the environment, and community needs. She is an experienced facilitator and trainer in both strategic planning and M&E. Podems is a past board member of the South African Monitoring and Evaluation Association (SAMEA) and past chairperson of the American Evaluation Association's (AEA) International Committee. She is a member of AEA, IDEAS, and SAMEA. She has published articles on evaluation competencies, evaluation process use, feminist evaluation, and evaluation and public management. She lives in South Africa.

 

In her ballot statement, Podems stated, "As a board member, I would recognize and strengthen AEA's role to keep members informed of relevant debates and support processes that encourage theoretical dialogue and vibrant evaluation discussions." Podems also stated, "As a board member, I would support areas that continue to strengthen practitioners who work in the field."  

 

We welcome Donna Podems and thank all who participated in this year's election process!  

AEA Announces 2013 Award Winners

The American Evaluation Association honored six individuals at its 2013 Awards Luncheon in Washington, D.C. This year's recipients, honored in six categories, are involved with cutting-edge evaluation and research initiatives that have had an impact on citizens around the world. We'll spotlight each award in upcoming issues. Today we extend our congratulations to Daniela Schröter!


Daniela Schröter, Director of Research, The Evaluation Center, Western Michigan University

2013 Marcia Guttentag Promising New Evaluator Award  


As director of research and an associate faculty member of the Interdisciplinary Ph.D. in Evaluation program, Schröter leads evaluation studies, conducts research on evaluation, provides professional development in evaluation, serves on The Evaluation Center's leadership team, and works with a diverse group of doctoral students from the program. She has led and been involved in a wide range of externally funded program evaluations, including educational program and policy evaluations; organizational evaluations; community-based, statewide, and national multisite evaluations; and international development evaluations. Schröter's primary evaluation and research interests are interdisciplinary and center on evaluation theory, methodology, communication, practice, and capacity building. 

 

Schröter has more than 10 years of experience teaching university-level courses, facilitating workshops for professionals, and building capacity in local nonprofit organizations. She has provided training and development opportunities in China, Germany, Norway, Switzerland, Thailand, and the United States. As an associate faculty member of the Interdisciplinary Ph.D. in Evaluation program, she has taught evaluation courses, supervises independent studies and field experiences in evaluation, and serves on program and dissertation committees. She has authored and coauthored more than 30 publications and numerous technical evaluation reports, and regularly presents refereed and invited papers, roundtables, think tanks, and panels at local, national, and international conferences and events. 

 

Schröter completed her Ph.D. in interdisciplinary evaluation at Western Michigan University in 2008 and her M.A. in intercultural business communication, German as a foreign/second language, and American studies at Friedrich Schiller University, Jena, Germany, in 2002. 

 

Visit AEA's awards page
AEA Values - Walking the Talk with Jean A. King

Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.  

 

AEA's Values Statement

The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.

 

i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.

ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.

iii. We value a global and international evaluation community and understanding of evaluation practices.

iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.

v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.

vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.

 

 


 

 

I have always felt a personal connection to the field of evaluation. I was born the year that Ralph Tyler crafted his famous rationale. I graduated from high school the year that Michael Scriven published "The Methodology of Evaluation," naming concepts we still use and framing issues we continue to address. At Cornell I was lucky enough to study with Jay Millman, a member of the seminal May 12 Group, and as a grad student I played a very small role on the initial version of the Program Evaluation Standards. I helped develop AEA's annual meeting culture, beginning when Bob Ingle still ran the conference as only he could and then with John McLaughlin for many years before the advent of personal computers. (Imagine the entire program in Post-it note form on large sheets of paper.) I helped co-found three AEA TIGs. As I routinely remind my students, I am that old.  

 

So what do AEA's values mean in my evaluation practice over time? An obvious problem with values statements is that they run the risk of sounding like motherhood and apple pie. Who would speak against such appealing ideas? But that is exactly the point of making them explicit. To speak about each of the components would require too lengthy a document, so let me reflect on just one aspect.

 

Think for a moment about how times have changed over the course of our field's development. Those of us who have been around through much of its history can see these changes. Three examples from my youth in the 1950s quickly illustrate the progress we have made:

 

  • When I first learned English grammar, Miss Walker taught me to use the masculine pronoun to refer to mixed-gender groups, and never once did I protest. Now I pay close attention to my pronouns and alternate gender as appropriate. 
  • I was a student in elementary school when Brown v. Board of Education of Topeka thankfully changed education in the USA forever. 
  • One of my parents' best friends, who was gay, kept his identity hidden in our conservative community, where the consequences of coming out would have been devastating.

 

You can likely pull examples from your own life to expand this unfortunate list. The USA of my youth was far less attentive to and accepting of diversity, and sadly, for many, the situation in this country and elsewhere today is no better. A veneer of civility in this country often masks a deep-seated lack of acceptance of the other. This is why phrases in our AEA values statement such as "culturally responsive," "inclusion and diversity," and a "global and international evaluation community" are important to my evaluation practice. Our field may have made progress, but we've only just begun to bring our commitment to these inclusive values to life. It matters that practicing evaluators, myself included, routinely hold these ideas before our eyes and act in accordance with them. 

 

Horace Mann, often called the founder of US public education, reportedly kept on despite continuing challenges by telling himself, "Be ashamed to die until you have won some victory for humanity." For me, program evaluation is my approach to seeking that victory for humanity, and the AEA Values Statement highlights what I believe to be important grounding for the process of getting there. 

Face of AEA - Meet Michael Boger

AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights Michael Boger.

 

Name: Michael Boger

Affiliation: Mecklenburg County Office of Management & Budget, Enterprise Management Analyst

Degrees: B.A., Criminology (North Carolina State University); M.A., Public Administration (University of North Carolina at Charlotte)

Years in the Evaluation Field: Less than one

Joined AEA: 2013

  

Why do you belong to AEA?

 

As my number of years in the evaluation field suggests, I am fairly new to the world of evaluation. I joined AEA mainly to bring back best practices from the 2013 AEA Conference to my organization. We are starting an initiative here in Mecklenburg County, N.C., to evaluate our services and programs to ensure we are responsible with taxpayer dollars. The conference was invaluable as it connected me with many people willing to share their knowledge, and provided easy-to-digest information for a new evaluator. Our evaluation team has also been taking advantage of AEA online materials to help prepare us for conducting future evaluations. We meet once a week to watch one of the many available webinars and discuss how we would best apply the new information. 

 

Why do you choose to work in the field of evaluation? 

 

Although Mecklenburg County has a lot of experience in performance measurement, we chose to move to evaluation to give us a better idea of the effectiveness of the services we provide. We don't want to fund services only because funding is available. Evaluation is also unique in that there are a variety of tools to use along the way. This type of flexibility is needed when dealing with county services that range from human services to parks and a sheriff's office.

 

Is there a program you're working on right now that excites you?

 

We have not finalized a list of county services to be evaluated, but we have received many requests for evaluation from executive leadership and department directors. Our goal is to conduct pilot evaluations of two services before the process is fully implemented. This will help us gauge our capacity and determine whether the process works well. 

 

One question we are still working to finalize concerns how effective our efforts have been to centralize a variety of enterprise functions, such as information technology, public information, and web services. This evaluation will likely address how satisfied customers are with the consolidation and how service quality has been affected by the change. Stay tuned for the results of our first evaluation! 

Policy Watch - Evaluation Policy at NIH's CTSA 

From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)

 

An evaluation policy is any rule or principle that an organization uses to guide its decisions and actions regarding evaluation. These policies can address evaluation goals, methods, participation, roles, management, capacity building, dissemination, use, and meta-evaluation. By their very nature, evaluation policies are general rather than detailed. 

 

The National Institutes of Health (NIH) developed guidelines this year for the evaluation of the Clinical and Translational Science Awards (CTSA) program. We are pleased that, in addition to the contribution of the CTSA Consortium's Evaluation Key Function Committee, this new evaluation policy drew on AEA guidance from "An Evaluation Roadmap for a More Effective Federal Government." It is not always possible to trace the influence of an effort such as the Roadmap, but the CTSA explicitly states that its policies were informed by the Roadmap and by input from the EPTF and other AEA leaders. 

 

NIH launched the CTSA program in 2006 to fund, at $500 million per year, a national consortium of 61 medical research institutions. The goals of the CTSA program are to transform the way biomedical research is conducted, speed the translation of discoveries into treatments, engage communities in clinical research, and train a new generation of clinical researchers. Each award is required to have an evaluation core that assesses administrative and scientific accomplishments, conducts self-evaluation activities, and participates in a national evaluation. CTSA written policies note that, among other things, the evaluation function should:

 

  • Be an integral part of program planning and implementation.
  • Establish and continuously improve a formal evaluation planning process.
  • Include plans for how evaluation results will be used.
  • Address the entire range of translational research.
  • Include evaluations that are prospective and retrospective, internal and external, process and outcome, qualitative and quantitative, attentive to local and national levels, and that use traditional and innovative methods.
  • Be open, public, and accessible. 
  • Engage stakeholders in all phases.
  • Encourage participatory and collaborative evaluation at all levels.
  • Follow the highest professional standards. 

 

CTSA noted that the program should: 

 

  • Develop evaluation policies collaboratively for general guidance.
  • Assess the degree to which evaluations are well-conducted and useful in enhancing the CTSA.
  • Encourage ongoing professional development.
  • Bring to bear the proper mix of evaluation skills to accomplish evaluation.
  • Be proactive and strategic regarding how to coordinate and integrate evaluation conducted at different organizational levels.
  • Work collaboratively with the national evaluators to identify and pilot-test a small, rigorous set of standard definitions, metrics, and measurement approaches for adoption by all CTSAs.

 

We hope this example has provided you with ideas about how your organization might develop evaluation policies drawing on AEA's Roadmap.  

This Year in Diversity: The Milestone Year

From Zachary Grays, AEA Headquarters

 

I have had the pleasure of bringing you the latest on diversity and cultural and global competence here at AEA. It has been an absolute joy curating this column and connecting personally with everyone who propels this association and profession forward. 2013 has been an extraordinary year at AEA, offering me the opportunity to celebrate a few milestones and famous firsts along the way. Each of the events below is a success and a first that truly champions diversity within our association:

 

The Graduate Education Diversity Internship (GEDI) program celebrated its 10th birthday this year! One of 14 recommendations made by the AEA Building Diversity Initiative (BDI) subcommittee, the GEDI program prepares scholars from diverse communities to do evaluation work and grants them the opportunity to work among the ranks at professional evaluation sites across the country. Fifty-four scholars have graduated from the program, and this year's talented interns, Shipi Kankane, Bailey Murph, Crystal Coker, and Anael Ngando, are well on their way to graduating at the annual Summer Institute in Atlanta this June. To commemorate the 10-year anniversary, the spring 2014 issue of New Directions for Evaluation will highlight the GEDI program: its robust lineage, its evolution, and, of course, the worldwide impact of its illustrious graduates.

 

For the first time in AEA history, a GEDI graduate, Maurice Samuels, hosted a GEDI site for this year's cohort. Samuels, currently a lead evaluator and researcher at Outlier Research and Evaluation (CEMSE, University of Chicago) and a graduate of the inaugural GEDI cohort, was inspired to become a mentor by the opportunity to introduce an intern to the evaluation work that he and his colleagues do at Outlier and to give an intern the experience of conducting evaluations for different types of organizations. 

 

Our annual conference is a weeklong meeting of the minds where evaluation professionals, practitioners, and enthusiasts from across the globe can attend more than 850 sessions and acquire endless amounts of knowledge from their colleagues. The conference is topped off on Friday evening with the Silent Auction, an evening not to be missed. Items donated by our international attendees are auctioned off to raise money for the AEA International Travel Awards. This year's conference attendees (also a record-setting number, at just under 4,000) helped raise more than $9,000, the most ever raised for the awards. The International Travel Awards are competitive awards that enable presenters from developing and underdeveloped countries to present their work and represent their nations and institutions at our annual conference. There were 40 applicants representing 25 countries vying for this year's awards. Our awardees were Katharine Tjasink of South Africa, Asad Rahman of Bangladesh, Benedictus Stepantoro of Indonesia, and Basan Shreshtha of Nepal. 

 

Diversity has had quite a few shining moments this year, and it goes without saying that none of them would have been possible without AEA's members, dedicated staff, and passionate volunteers and coordinators. My grandest hope is that this column serves as a call to action for everyone to make diversity outreach far more extensive and accessible to all communities. The AEA network of members is solid, connected, and committed to inclusivity: the perfect ingredients for the evaluation profession to broaden cultural, global, and diversity competence. 2014 will be a stellar year and will mark our countdown to the International Year of Evaluation in 2015! 

Potent Presentations Initiative - Best Presentations from AEA 2013
From Stephanie Evergreen, Potent Presentations Initiative Coordinator  

 

Last month, I asked you to send in your nominations for presentations you saw at the conference that were particularly potent in their message, design, and delivery. Thanks for all of your responses! Nominations included:

  

Laura Beals & Jennifer Lowe

  

Their nominator said, "I happen to know that Laura and Jennifer used the p2i materials extensively and gave a tremendous amount of thought to the best way to communicate their material effectively. In my opinion, their final slideshow and handout reflected this work." 

 

  

  

Kat Athanasiades

  

Kat's nominator said, "This one from Kat Athanasiades (with input from Veena Pankaj) [has] screenshots, photos she took herself, high-resolution landscape photos, charts, diagrams, etc. organized by headings (setting the stage, process, key insights) so the audience can follow along." Click through the entire slide deck.

 

 

Taj Carson  

 

Taj was nominated for her use of visuals as support for, not a replacement of, her speaking. Her delivery was also crisp and well-timed. View the recording of Taj's Ignite session on mapping. 


 

Isaac Castillo & Ann Emery

  

Isaac and Ann were nominated for their clear visuals representing somewhat abstract ideas and their ability to use many variables (text position, color, etc.) to communicate the contrast between performance management and evaluation. View their complete slide deck or watch the recording of their presentation. 

 

 

New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? The following positions have been added recently: 

Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.

About Us
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.

 

The association's mission is to:
  • Improve evaluation practices and methods.
  • Increase evaluation use.
  • Promote evaluation as a profession.
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only) 
website: www.eval.org