Newsletter: February 2014 

Vol 14, Issue 2


Message from the President: Conversations and Connections

As February comes to a close, I reflect on the diverse and fruitful conversations I've had this month about the field of evaluation and our 2014 conference theme. Here's a snapshot of several of those conversations, along with thoughts on how we can collectively expand our interactions to contribute to the public good.


Ongoing AEA Board Conversations

 

The February AEA board meeting kicked off the year's review of our overall strategic direction. That work surfaced two topics for discussion at our June meeting: credentialing/accreditation and appropriate growth for AEA. Please contact me or other board members if you have comments for us to consider. We welcome your engagement in these conversations.

 

Making Diverse Connections

 

In early February, I discussed systems thinking and the conference theme — Visionary Evaluation for a Sustainable, Equitable Future — with participants in the Graduate Education Diversity Internship Program (GEDI). 

  

Later that same day, AEA staff members Denise Roosendaal and Zachary Grays and I enjoyed going over preparations for the conference with the Local Arrangements Working Group (LAWG). The leaders of LAWG, Stephanie Fuentes, Antonio Olmos, and Liesel Ritchie, are exploring great options for our conference. Stay tuned to the AEA365 blog for LAWG's week later this year.

 

Soon after, I was the keynote speaker at the University of Michigan School of Social Work. At the invitation of John Seeley, longtime AEA member and associate director of the school's Program Evaluation Group of the Curtis Research Center, I addressed the conference on the 2014 AEA theme. Talking about systems thinking with social workers proved especially rewarding because of their experience working with multiple social systems.  

 

The other week in Seattle, I participated in ORS Impact's Wine and Wisdom series. Lovely Dhillon, VP at ORS Impact (and co-chair of the AEA 2014 conference program), interviewed me about evaluation and the conference theme and then opened up the conversation to the group. It was a stimulating time of networking and reflection on the field of evaluation — past, present, and future — among the attendees from ORS Impact, businesses, foundations, and nonprofit organizations. Consider creating similar conversations in your own organization/area by drawing on local evaluators and evaluation users to enrich our collective vision of evaluation and its future.

 

Inviting Your Participation in the Conference

 

Now is the time to prepare for engaging conversations at the 2014 conference. The deadline for submitting conference proposals is March 17. I've been impressed with the creative thinking, the appreciation for diversity, and the methodological rigor of the proposal ideas that several of you have shared with me. 

 

Consider submitting a joint proposal with someone outside your discipline or geographic area. And, rather than reporting only on what you've done, include in your proposal how you envision your future work increasingly supporting a sustainable, equitable future for all.

 

Whether or not you submit a proposal, you can engage in conversations about the 2014 theme. Consider these two ideas: 

 

  • The conference theme highlights systems thinking. Check out the Thought Leaders Forum, March 5-12, with Glenda Eoyang, an expert in systems thinking. 
  • The theme emphasizes a global perspective: a sustainable, equitable future for all. Take a moment to connect with someone in another country to help you better understand evaluation's contribution to such a future. Think particularly of our colleagues around the world who are living through politically tense and difficult times or suffering from natural disasters. Let an evaluator in such a situation know you are thinking of him or her by sending an encouraging email. Check out the IOCE and EvalPartners websites to learn more about the evaluation community worldwide.


Warm regards for a bountiful spring, 

 

Beverly Parsons

AEA 2014 President

In This Issue
Call for Proposals
Call for Board Nominations
2014 Summer Institute
2013 Award Winner
Walking the Talk
Policy Watch
Diversity
Book Profile
eLearning
p2i
New Job Postings
Register
Get Involved
About Us
Quick Links
Important Note
To ensure this newsletter reaches you every month, add [email protected] to your email contacts!
Join our Mailing List!
Proposal Submissions for Evaluation 2014 Due by March 17

AEA invites all who are involved in the field of evaluation to share their best work in evaluation theory or practice at Evaluation 2014, the annual conference of the American Evaluation Association (AEA), held in Denver on Oct. 15-18, 2014 (professional development workshops are held Oct. 13-14 and 19, 2014).  

 

The conference is divided into topical strands that examine the field from the vantage point of a particular methodology, context, or issue, as well as a strand devoted to this year's theme, Visionary Evaluation for a Sustainable, Equitable Future. Presentations may explore the theme or any aspect of evaluation theory, practice, management, or consulting.

 

Proposal submissions must be received by 11:59 p.m. ET on March 17, 2014.

 

If you have an eval.org account, please log in to submit a proposal.

Call for Nominations for the AEA Board of Directors

Show your commitment to the value of the American Evaluation Association and help shape its future! You may nominate yourself or a committed AEA colleague for the board of directors. This year we will elect three board members-at-large and a president-elect. Nominating candidates for office is a valuable service to the association, and your thoughtful participation in this process is greatly appreciated.

 

Only AEA members may serve on the AEA Board of Directors or as officers. The president-elect serves as president-elect in the first year, becomes president in the second year, and serves as past president and secretary in the third year of his or her term. The president-elect and board members-at-large will all serve three-year terms, beginning Jan. 1, 2015, and will attend three in-person board meetings each year, as well as approximately monthly phone-based meetings.

 

Deadline: Friday, March 28, 2014 

 

Read the full call for nominations.

Save the Date: 2014 Summer Institute, June 1-4 in Atlanta

The 2014 Summer Institute is right around the corner! Join the American Evaluation Association June 1-4, 2014, in Atlanta for this year's Summer Institute. Stay tuned for registration details and a list of professional development workshops to be announced in early March. 

 

Read about the 2013 Summer Institute

AEA Announces 2013 Award Winner Thomas Chapel

The American Evaluation Association honored six individuals at its 2013 Awards Luncheon in Washington, D.C. This year's honorees, representing six categories, are involved with cutting-edge evaluation/research initiatives that have impacted citizens around the world. We've spotlighted two award winners thus far and will continue in upcoming issues. Today we extend our congratulations to Thomas Chapel.

 

Thomas J. Chapel, MA, MBA, Chief Evaluation Officer, Centers for Disease Control and Prevention (CDC), 2013 Alva and Gunnar Myrdal Government Evaluation Award

 

As CDC's first chief evaluation officer, Chapel strengthens program evaluation and expands CDC-wide evaluation capacity through standards, training, tools, and resources. Chapel joined CDC in January 2001 as a health scientist and has been providing evaluation leadership ever since. He is known nationally for his work in evaluation and evaluation capacity building and has been active in leadership with AEA and its affiliates.

 

Before joining CDC, Chapel was a vice president in the Atlanta office of Macro International (now ICF International), where he directed several large task order contracts for CDC and other departments within the U.S. Department of Health and Human Services. Chapel received his bachelor's degree from Johns Hopkins University and his MBA and MPP degrees from the University of Minnesota.  

 

Chapel explains, "This award is an affirmation of almost a decade of work within CDC to integrate evaluation into performance measurement and planning, creating a culture of continuous program improvement. It's also a recognition of the challenges we face in decentralized environments such as CDC where the work of evaluation professionals like me is less about conducting strong evaluations and more about enabling and motivating a network of others to do so."

AEA Values - Walking the Talk with Dominica McBride

Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.  

 

AEA's Values Statement

The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.

 

i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.

ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.

iii. We value a global and international evaluation community and understanding of evaluation practices.

iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.

v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.

vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.

 

AEA's values have touched me deeply, both professionally and personally. I'd like to share two experiences in particular that highlight the importance of these values and what they've meant to me: 

 

Aroti was only 18 months old when I met her, crying and inconsolable; no one else could comfort her. One of the few volunteers who had been trying to soothe her eventually handed her to me. After a couple of minutes of rocking and cooing, she stopped crying. Soon after, it was mealtime. I fed her some porridge, and she suddenly began to tear up again. As tears filled this little girl's eyes, they started to fill mine. When she noticed my tears, she suddenly stopped crying and lifted a finger to my eye as if to comfort me. I cried even more, but this time because I was touched by her inherent and immediate empathy.

 

This is one of the most memorable moments of my life. It took place at an orphanage for children who had lost parents to AIDS, while I was working in Tanzania on an evaluation of an HIV prevention program. Although I did not have the means to adopt her and take her home, I did have a tool (evaluation) that could help prevent others like her from becoming orphans due to HIV/AIDS. It's values like AEA's that help ensure we as evaluators can support children like Aroti. Guided by these values, we have the encouragement to focus on diversity, be culturally responsive, strive toward the greater good, and do our part in manifesting social justice and thriving communities.

 

The other night, I facilitated a community evaluation team meeting of youth, parents, elders, and organizational staff. In this meeting, we discussed our values, what we saw as important in the program and for the community, and how we would approach the evaluation at hand. This meeting took place in a community that had experienced 22 homicides, 202 aggravated assaults, and 400 aggravated battery crimes within one year. The team was developed in partnership between my organization (Become) and a youth center in this community. I founded and run Become with the mission of working with communities facing these challenges and using evaluation as a tool to help them thrive. I've necessarily infused AEA's values into our work and philosophy. Applying these values has also led to youths' voices being heard, parents being mobilized to push for what they want and need, and community elders feeling valued and experiencing a sense of belonging. It's because of AEA values that evaluation can be used in this way - to support community empowerment, see and emphasize the value of culture, and, thus, make this world a better place.

Policy Watch - New Guidelines for STEM Evaluation  

From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)

 

In February, AEA celebrated Science, Technology, Engineering and Mathematics (STEM) Education and Training TIG Week. On a related note, this column is intended to make AEA members aware of new guidelines for evaluating STEM programs. In 2013, in response to recommendations from the Office of Science and Technology Policy (OSTP) and guidance from the Office of Management and Budget (OMB), the U.S. Department of Education (ED) and the U.S. National Science Foundation (NSF) published the cross-agency Common Guidelines for Education Research and Development, which cover program evaluation. Aimed at program officers, prospective grantees, and peer reviewers, this is a "living document" that may be adapted by agencies involved in improving the quality of knowledge development in STEM education. The Guidelines emphasize the importance of studies that build on an evidence base and, in turn, contribute to the accumulation of empirical evidence and the development of theoretical models.

 

NSF's and ED's shared expectations are intended to 1) help guide their decisions about investments in education research; 2) clarify for potential grantees and peer reviewers the justifications for, and evidence expected from, different types of studies; 3) highlight relevant aspects of research design that would contribute to high-quality evidence; and 4) provide advice on building an investigation team.

 

The Common Guidelines generally adhere to the principles identified in Scientific Research in Education (National Research Council, 2002). These call for an investigation that:

 

  • poses significant questions that can be investigated empirically; 
  • links empirical research to relevant theory; 
  • uses research designs and methods that permit direct investigation of the question; 
  • is guided by a coherent and explicit chain of reasoning; 
  • replicates and generalizes across studies; and 
  • attends to contextual factors. 

 

The six types of investigation described in the document generally form a "pipeline" of evidence that begins with basic and exploratory research, moves to the development of interventions or strategies, and, for interventions with initial promise, results in an examination of their effectiveness for improving learning. The Guidelines acknowledge, however, that the reality of building a body of knowledge is considerably more complex than a simple sequence of development.

 

Among the types of evaluation studies are impact studies, which are meant to generate reliable estimates of the ability of a fully developed intervention or strategy to achieve its intended outcomes. For an impact study to be warranted, according to the Guidelines, the theory of action must be well established and the components of the intervention well specified. The Guidelines further identify three types of impact studies that differ with regard to the conditions under which the intervention is implemented: 1) efficacy studies, which test an intervention under "ideal" circumstances, including a higher level of support or developer involvement than would occur under normal circumstances; 2) effectiveness studies, which examine the success of an intervention under circumstances that would typically prevail in the target context; and 3) scale-up studies, which examine success in a wide range of populations, contexts, and circumstances.

 

When the Common Guidelines are referenced in program solicitations, applicants will need to be familiar with and address them in their proposals. Agencies must ensure that expert review panels are well informed of how the guidelines should be applied when evaluating proposals. The Guidelines also note that OMB, OSTP, the Government Accountability Office (GAO), and other federal entities may elect to use them as part of oversight. The Guidelines also may make the public more aware of the agencies' goals for investments in education research and development to achieve immediate and long-term improvement of education and learning. This is another example of federal agencies developing evaluation policies.   

Diversity - Oh, the Places You'll Go: AEA Calls for 2014 International Travel Award Applications

From Zachary Grays, AEA Headquarters

 

It is no secret that there are many exciting things in store for 2014 here at AEA. Summer Institute is a few short months away, and, before we know it, we will be convening in the Mile High City for Evaluation 2014.

 

Beginning Feb. 24, 2014, AEA and the International and Cross-Cultural Evaluation (ICCE) TIG are accepting applications from evaluation professionals in underdeveloped and developing countries to participate in this year's round of International Travel Awards. Leading the review process for the ICCE TIG this year is Jonathan Jones, senior evaluation specialist with EnCompass LLC. Thanks to generous donations from attendees at Evaluation 2013, donations from partner evaluation associations, and your support during the annual Silent Auction, AEA will be able to present five International Travel Awards this year.

 

The International Travel Awards are presented annually to international practitioners to help defray the cost of traveling to attend the annual conference and to give them an opportunity to highlight the evaluation work they conduct in their home countries. To qualify for an AEA International Travel Award, applicants must meet all of the criteria below:

 

  • Have not previously participated in AEA conferences
  • Demonstrate good fluency in English, being able to make a professional presentation and sustain a discussion in English
  • Submit a complete individual application
  • Be a citizen of, and both reside and practice evaluation in, a developing country or country in transition for at least two years (U.S. citizens, or those with dual citizenship between the United States and a second country, are not eligible; for a list of countries considered to be developing or in transition, click here)
  • Propose to present at the Evaluation 2014 conference and have at least one proposal accepted

 

The deadline for submissions is midnight ET March 17, 2014. All completed application material should be sent to Jonathan Jones.  

 

Do you know someone who would make a great addition to the prestigious roster of conference presenters? Don't miss this opportunity to spread the word to your colleagues across the globe about these travel awards. More than anything, engaging the international evaluation community offers practitioners across the evaluation discipline a wealth of knowledge, collaborative opportunities, and access to intellectual resources. It is the great pleasure of the association to present these awards and provide the awardees a place to showcase their remarkable work.

 

Learn more about this year's International Travel Awards

Book Profile - Bridging the Gap Between Asset/Capacity Building and Needs Assessment: Concepts and Practical Applications

James W. Altschuld, a charter member of AEA and, before that, a member of one of its predecessor organizations, is the author of Bridging the Gap between Asset/Capacity Building and Needs Assessment: Concepts and Practical Applications, a new book published by SAGE. 

 

From the Publisher's Site:

 

In this groundbreaking text, the author examines the synthesis of two antithetical ideas: needs assessment and asset/capacity building. At the heart of this approach is a focus on assessing the strengths and assets that communities have and demonstrating how to make those assets stronger. The author explains the foundation of needs assessment and asset/capacity building, discusses their similarities and differences, and offers a new hybrid framework that includes eight steps for how they can be done jointly for better results. The author then applies a checklist for judging the quality of this approach to six cases that represent real-world applications of hybrid principles. The last chapter demonstrates how such efforts might be studied in the future, emphasizing ways findings and results from hybrid ventures can be used effectively. A wide range of examples, tables, and figures appear throughout, with insightful discussion questions at the end of each chapter to facilitate meaningful discourse. 

 

From the Author:

 

For years I knew of criticisms of needs assessment, but not once in my involvement in seven prior needs assessment books did I ever deal with them (avoidance to the utmost). So, I thought, let's bite the bullet and see what is being said. Hence this book. It looks at the history of needs assessment and a rising tide of negative voices about the topic. The counterpoint comes from a positive, assets, or resources stance that is in sharp contrast to the negative, something is missing/wrong premise of needs. There are merits to both positions, but it would be best if a hybrid framework across them could be synthesized, one that melds yet respects the integrity of each. In my biased judgment and that of the reviewers of the text, this book has accomplished that in a readable and straightforward manner. The new eight-step hybrid is explained, six real-world applications are examined, and, lastly, possibilities for future work and research are explored. It should be noted that since the text was written, more examples of applications in regard to educational and social programming are appearing in the literature. 

 

About the Author:

 

James W. Altschuld is a professor emeritus of The Ohio State University, where he developed and taught basic research and program evaluation courses for 28 years. He is a longtime advocate for needs assessment, as evidenced by the many books on the topic he has authored, co-authored, or edited, including volumes in 1995 and 2000 and the five-volume Needs Assessment Kit of 2009/2010. He has won local, state, and national awards (including AEA's Alva and Gunnar Myrdal Award) for contributions to the field.

 

Visit the publisher's site

eLearning Update - Discover Upcoming eStudy Courses and Coffee Break Demonstrations

Our eStudy program is made up of in-depth virtual professional development courses. Below are March's eStudy offerings: 

 

eStudy 040: Digital Qualitative: Leveraging Technology for Deeper Insight - Bob Kahle 

March 8 and March 20

3-4:30 p.m. ET

 

This short course describes the range of new qualitative techniques available and explains how and when to use them to generate deeper insight as part of your evaluation efforts. The eStudy will take place in two 1.5-hour sessions and will include materials sent before, between, and after the sessions.

 

Read more and register.

____________________________________________________________________________________ 

 

Our Coffee Break Webinars are short, 20-minute presentations of tools or tips we think evaluators will find helpful in their work lives. Let's take a look at what's in the pipeline for March:

 

Monday, March 3
2-2:20 p.m. ET

Stan Capela will share a variety of techniques that he has learned in his 30-plus years working in the evaluation field. Stan will speak about his experience as an internal evaluator in a large nonprofit agency, as well as ideas he has picked up as a peer reviewer for the Council on Accreditation in more than 30 states, Canada, Germany, Guam, and Japan. By the end of the session, participants will come away with techniques for communication and engagement, as well as approaches for adapting to changing situations and creating a culture that fosters positive results.

 

You can pre-register for the webinar by clicking the link above. 

Potent Presentations Initiative - Spread the Word
From Stephanie Evergreen, Potent Presentations Initiative Coordinator  

 

Good news! More and more AEA members have heard of the Potent Presentations Initiative and used the tools to improve their conference presentations. Awesome! Even better news? AEA members are using p2i's guidance to crank up their presentation effectiveness outside of the conference, too — in their regular evaluation practice. Sweet!

 

But the not-so-great news is that many AEA members have told us they had never even heard of the Potent Presentations Initiative. So do those folks a favor: Spread the word. It's a win-win proposition. They find ways to become better presenters, and you, dear audience member, will learn more from what they present.


We have blogged about p2i on aea365, built an incredible website, posted about it on LinkedIn, Twitter, and Facebook, and written about it right here in this newsletter every month. While we think we've shouted about p2i from the rooftops, we aren't reaching everyone. Can you add your voice? Here's an easy way to do it. Download our one-page handout highlighting some of our favorite and most helpful presentation tools. Take copies of the handout to your next evaluation brown bag lunch. Email copies to your local affiliate membership. Post a copy in your office hallway. Send it via paper airplane down the hallway. 

 

Help us improve presentations. Together, we are making AEA a leader in supporting member presentation skills.

New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? The following positions have been added recently: 

Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.

Register
Get Involved
About Us
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.

 

The association's mission is to:
  • Improve evaluation practices and methods.
  • Increase evaluation use.
  • Promote evaluation as a profession.
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only) 
website: www.eval.org