Newsletter: October 2013

Vol 13, Issue 10


Message from the President


Dear colleagues,

 

I've just returned from a week at Evaluation 2013, our annual conference. From the feedback I received from you, I think many of you had great personal and learning experiences at the conference. (But, please do complete the evaluation form that was sent to you via email. I'm sure my sample was biased, and we would love your feedback to help us do more next year.) Meanwhile, I will use my monthly column to highlight some elements of the conference. 

 

First, the weather was beautiful! We could not have asked for more sunny, 60-degree, gorgeous fall weather. I hope many of you got out to see a bit of Washington, D.C., as I did on Sunday. And thanks again to our local arrangements co-chairs, David Bernstein and Valarie Caracelli, the Washington Evaluators group, and the scores of volunteers who helped them introduce us to D.C. 

 

As you know, the board recently voted to make D.C. our site every four years. This decision was made in order to build connections with the government officials in the city who make evaluation policies and funding decisions. We also are happy that this will allow us to draw on the abundance of evaluators and speakers who work in D.C. Many other professional associations do such a rotation, and we thought we should take up this good idea. We will be able to build on the work of the local arrangements chairs and their volunteers to make the most of this decision. To demonstrate the possibilities in D.C. at this conference, George Grob arranged a reception for, and a dialogue with, federal evaluators and policymakers. Many of them stayed to attend sessions at their first AEA conference and indicated they hope to return. George Julnes arranged a similar reception and dialogue with foundation leaders, and many key representatives from that stakeholder group attended the conference and learned more about our work. 

 

This conference was our first since our change to new management. Susan Kistler, our former executive director, stayed with us to develop the program and give strategic advice to SmithBucklin. Next year, she'll be with us as a member, and we look forward to her participation in that role. Meanwhile, Denise Roosendaal, our new executive director, and the staff of SmithBucklin, our new association management company (AMC), provided a seamless transition, handling the many logistics of the conference as well as the details: everything from directing us to rooms and notifying us of program changes as the federal government came back to life, to making sure we had coffee, tea, water, and soda to tide us over during our breaks. They accomplished all this in a calm, unobtrusive way that allowed us to go about our main business of attending sessions, making contacts, meeting old friends, and making new ones. 

 

For those of you who were unable to attend the conference — and for attendees who, of course, cannot make every session — we will be posting some papers and PowerPoint slides from the plenary sessions, presidential strand sessions, and a few other highlights on the website soon. I just wish we had a video of Huey Chen's session, "Are Threats to Internal Validity a Curse or a Blessing for Evaluation? Lessons Learned from a Zumba Class"! I heard wonderful things about it. (I do hope we can do some video next year.) We will also post some pictures to give you a sense of the conference and the enthusiasm we all felt. 

 

I would like to again thank my program co-chairs, Kathy Newcomer and Jonny Morell, and my program committee — Tom Schwandt, Leslie Goodyear, Marco Segone, Katherine Dawes, and Mike Hendricks — for their tremendous work in helping put together some absolutely wonderful sessions that made us aware of the breadth and depth of evaluation practice, among us and beyond us, today.  

 

Sincerely,

Jody

Jody Fitzpatrick

AEA 2013 President  

In This Issue
Meet Melvin E. Hall
Walking the Talk
Face of AEA
eLearning
Diversity
p2i
Book Profile
New Job Postings
Register
Get Involved
About Us
Meet Melvin E. Hall - Incoming Member at Large

 

Melvin E. Hall, Ph.D., is professor of educational psychology at Northern Arizona University. Hall completed his B.S. and Ph.D. degrees at the University of Illinois at Urbana-Champaign in social psychology and educational psychology, respectively. He received his M.S. in counseling from Northern Illinois University.

 

Throughout a 36-year professional career in higher education, Hall has served in four successive appointments as an academic dean, at Florida Atlantic University, the University of California-Irvine, the University of Maryland at College Park, and Northern Arizona University (NAU). At NAU, Hall served as dean of the College of Education and, additionally, was the principal investigator on two five-year U.S. Department of Education GEAR UP grants, providing dropout prevention programs and services to thousands of middle and high school students throughout Arizona. 

 

Hall additionally provides public service as an appointed member of the Arizona State Supreme Court Committee on Character and Fitness, which reviews all candidates for admission to the practice of law in Arizona, and of the 2013 Visioning Scottsdale effort to create the state-mandated update of the city's General Plan. His primary teaching areas include educational research, program evaluation, and two courses he developed, "Developmental Perspectives on Human Diversity" and "Foundations of Inquiry and Practice in Human Relations," which are offered as core courses in a master's program in human relations.  

 

In his ballot statement, Hall stated: "What I would bring to the AEA Board is diligence in positioning the association as an unwavering advocate for the public fiduciary responsibility inherent in the field. When work done in the name of program evaluation fails to prioritize a concern for the public good, demonstrate a willingness to be open to public scrutiny and transparency of methods, methodology, and purpose, in my view it is no longer program evaluation." 

 

We welcome Melvin E. Hall and thank all who participated in this year's election process! 

AEA's Values - Walking the Talk with Leah Goldstein Moses

Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.  

 

AEA's Values Statement

The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.

 

i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.

ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.

iii. We value a global and international evaluation community and understanding of evaluation practices.

iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.

v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.

vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.

Hello evaluators! I am Leah Goldstein Moses, founder and CEO of the Improve Group. I'm excited to reflect on how the AEA values are infused in my work as an evaluator and in how my company presents itself globally. The task is very timely for me. I recently watched two videos: Simon Sinek's TED talk on inspiring action, and Brené Brown's TED talk on the power of vulnerability. Both of these videos inspired me to think about why I do evaluation. In turn, the "why" influences how I practice and what I focus on. In fact, after returning from #Eval13, my staff took some of our time to better articulate our "why" across the whole firm. 

 

So, why evaluate? We evaluate because there are people in need in our communities, and we want the programs that serve them to actually make a difference. As one of my colleagues, Jill Lipski Cain, put it: "Evaluation has the power to make services better. By figuring out what is working and what isn't, we can make sure that time and resources are really going to help people." As described in the AEA values, we evaluate to make organizations more effective and humane, ultimately enhancing the public good, and to help leaders fully consider the impact of their decisions.   

 

At the Improve Group, we fulfill this "why" in our practice by:    

 

  • Remembering that evaluation is about people. People are the ones who share their knowledge and ideas with us, and they are also the ones who will ultimately benefit from any changes made because of our findings. We strive to be respectful, considerate, and engaging in all of our work. For example, if we know we are going to work with aging adults in an evaluation, we collect data in places where they already gather, show respect for their wisdom, and ask representatives to help us interpret our findings. 
  • Linking to broader systems. None of the programs, policies, or systems we evaluate operate in isolation. Each is affected by evolving needs and the constraints and opportunities in other systems. We bring forward ideas from multiple systems when working with our client organizations. For example, when we are evaluating a program focused on workforce readiness, we look at how higher education, the job market, and availability of public services affect the ability of participants to prepare for work.
  • Being clear in our messages and communication. Ultimately, our findings need to be understood and relevant in order to be used to improve services. And we need to understand different audiences and how they prefer to hear information when making decisions. Our communication strategy is often multilayered and nuanced. We might write a detailed, technical report for a program manager and an accompanying short infographic for a legislator. Combined, these two pieces allow the decision makers to have informed, productive discussions about needed changes. Another example is a very simple chart - with only a few data points - showing program staff the difference between kids' perceptions of healthy weight and their actual BMI results. This graphic helped spur an in-depth discussion about programs and messaging for kids of all ages.   


I've been so lucky to be part of the AEA community. Through its listservs, journals, conferences, and webinars, I'm constantly exposed to new ideas about the "why," "how," and "what" of evaluation. I'm looking forward to an evolving, changing practice in the years ahead that continues to focus on the values of quality, inclusiveness, cultural responsiveness, and the greater public good. 

Face of AEA - Meet Robin Lin Miller

AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights Robin Lin Miller.

 

Name: Robin Lin Miller

Affiliation: Professor of Ecological-Community Psychology, Michigan State University

Degrees: B.A., Liberal Arts (Sarah Lawrence College, Bronxville, N.Y.); M.A. and Ph.D., Community Psychology (New York University, New York, N.Y.)

Years in the Evaluation Field: Approximately 27

Joined AEA: 1992

 

 

Why do you belong to AEA?

 

Because it's awesome! My first full-time evaluation job was as an internal evaluator at the Gay Men's Health Crisis (GMHC) in New York City. I started there during a key historical period in the domestic AIDS epidemic, just after the formation of ACT UP and prior to our having much to offer in the way of biomedical treatment for the disease. We were in uncharted territory in so many ways, with organizations like GMHC, the San Francisco AIDS Foundation, and AIDS Project Los Angeles leading the way in prevention, patient advocacy, and many other areas. It was exciting, yet often overwhelming as a junior evaluator gaining her sea legs and trying to provide useful information in the face of such a complex and urgent social issue. Thankfully for me, I met Ross Connor through our mutual connection to the American Foundation for AIDS Research and, through him, was introduced to AEA. AEA provided me with the professional support, role models, and learning opportunities to improve my work and sharpen my thinking. AEA continues to be the place that most stimulates my imagination and fosters my professional growth. From day one, it was my primary professional home.  

 

Why do you choose to work in the field of evaluation? 

 

In addition to the fact that I love the intellectual challenge of figuring out how to get useful and credible information in very messy real-world situations, I see evaluation as an essential tool to address issues such as HIV. Evaluation can ensure that the experiences and perspectives of the intended beneficiaries of programs and policies are closely examined. Without evidence produced through evaluation, many populations in need will be left to the mercy of others' benevolence. I began to lose lifelong friends and acquaintances to AIDS in 1981. During the earliest years of my volunteer and professional involvement in AIDS, the community I lived in on Fire Island held 30 or more memorial services a month for men who had recently died. Among the men memorialized in these services were men whom I had known from my earliest childhood. 

 

Between 1986 and 1995, I lost two of my dearest friends, several close GMHC colleagues, and numerous acquaintances. During those years, I would often leave work to spend my evenings helping to take care of friends who were ill or dying. With the advent of improved treatment, three of my closest friends are living healthily with HIV today, two of whom became infected more than 20 years ago. That is a miracle I wouldn't ever have imagined back then. Thinking of these friends reinforces how important it is to me to contribute something to efforts to prevent other communities from ever having to experience the devastation HIV wrought in the lives of my friends. I think of evaluation as a critical tool through which this can be accomplished. 

 

I frequently work with small AIDS organizations in devastated areas such as Detroit. Every time I meet an 18- or 19-year-old young black gay man who has just learned that he is HIV-infected, or someone living with HIV who is ill and not in medical care, it reinforces for me how critical it is that evaluators are out there trying to help communities navigate from where we are now to a place where young people are at minimal risk of exposure and people living with HIV can do so in health and with dignity. 

 

What's the most memorable or meaningful evaluation that you have been a part of?

 

Hard question! I'm torn between two recent evaluations. The first was a statewide assessment of the HIV prevention needs of black gay and bisexual men ages 12-24 for the Michigan Department of Community Health. That was an especially meaningful project because I hired six young black gay men, all younger than 24 and living in different hard-hit areas of the state, as co-investigators. They were part of every decision and aspect of the study, from what to ask about and how, to hiring and training interviewers, to disseminating the findings to Michigan's statewide HIV prevention and care planning body. When these young men stood up and reported findings such as that about a third of the 180 young men we interviewed from across the state had been sexually assaulted, on average before they had reached the age of 12, you could have heard a pin drop. One of the young men organized a town hall meeting on the findings in Detroit and, five years later, young men in the community still refer to it as their study. 

 

The second looked at the long-term health outcomes of HIV-infected people who had used a prison re-entry service designed to link them to medical care. That was memorable because we had a tiny budget, a very tight timeline of seven months, and 190 people to track down who had been released from prison as long as eight years earlier and could now be anywhere. It took a lot of ingenuity and effort to answer the questions under the constraints we had to work with, but we did it. We found the vast majority of the 190 and learned something about how most had fared since release. 

  

What advice would you give to those new to the field?

 

Get involved with a TIG or local affiliate; don't just show up to their sessions and events. Volunteer! Take advantage of the resources that AEA has to offer, such as AEA 365, the Coffee Break Webinars, the thought leader forums, and the professional development sessions associated with the conference. Not only do these provide great no- to low-cost learning opportunities, they are essential means to build a diverse professional community of colleagues to support your future work. Join other evaluation societies. And, network, network, network! 

eLearning Update - Discover Upcoming Coffee Break Demonstrations and eStudy Courses

From Alexa Schlosser, AEA Headquarters 

 

Our Coffee Break Webinars are short, 20-minute presentations of tools or tips we think evaluators will find helpful in their work lives. As October's webinars come to a close, let's take a look at what's in the pipeline for November:

 

CBD163: InQuiry: Q Methodology for Participatory Evaluation - Matthew Militello & Chris Janson
Thursday, Nov. 14, 2013

2-2:20 p.m. ET

 

Q methodology uses distinct psychometric principles and operational procedures in order to provide researchers with the means to systematically and rigorously identify, describe, and examine human subjectivity (see www.qmethod.org). In doing so, Q methodology uses correlation and factor analysis to reveal the subjective structures of attitudes, opinions, or perspectives that are shared by people around virtually any topic. Participants in this demonstration will learn about Q methodology as a research tool and as a process for community-engaged evaluation. Attendees will be provided with links to resources and videos that demonstrate how Q methodology has been used in evaluation.   
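For readers curious about the mechanics, here is a minimal, hypothetical sketch, not material from the webinar, of the by-person correlation and factor extraction at the heart of Q methodology; all data and names are invented for illustration.

```python
# Minimal sketch of the core Q-methodology computation, with invented
# data: participants rank-order statements (a "Q-sort"), then the
# *people* (not the items) are correlated and factor-analyzed so that
# clusters of shared viewpoints emerge.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Q-sorts: 20 statements ranked by 8 participants;
# each column is one participant's ranking of all statements.
n_statements, n_participants = 20, 8
q_sorts = np.array(
    [rng.permutation(n_statements) for _ in range(n_participants)]
).T  # shape: (statements, participants)

# By-person correlation matrix: how similar are participants' sorts?
person_corr = np.corrcoef(q_sorts, rowvar=False)

# Principal-components extraction of the person correlations;
# eigenvectors scaled by sqrt(eigenvalues) give factor loadings.
eigvals, eigvecs = np.linalg.eigh(person_corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
loadings = eigvecs[:, :2] * np.sqrt(eigvals[:2])  # keep two factors

# Participants loading strongly on the same factor share a viewpoint.
print(np.round(loadings, 2))
```

In a real Q study, the Q-sorts would come from participants sorting actual statements, and the factors would be rotated and interpreted; the sketch only shows why correlation and factor analysis can surface shared perspectives.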

 

CBD164: Using Slicers to View Data in Excel - Lisa Holliday

Thursday, Nov. 21, 2013

2-2:20 p.m. ET

 

Lisa R. Holliday, an evaluation associate at The Evaluation Group, will show how to use slicers to link and filter data across multiple pivot tables. Slicers are a built-in Excel feature that lets you organize and find information quickly. 
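Slicers themselves are a point-and-click Excel feature, so there is no webinar code to show; as a loose analogue for readers who script their analyses, this hypothetical pandas sketch shows one shared filter value driving several pivot tables built from the same data. All column names and figures are invented.

```python
# Loose pandas analogue of an Excel slicer: one shared filter value
# drives several pivot tables built from the same source table, the
# way a single slicer filters linked pivot tables. Invented data.
import pandas as pd

df = pd.DataFrame({
    "region": ["East", "East", "West", "West", "East", "West"],
    "program": ["A", "B", "A", "B", "A", "B"],
    "year": [2012, 2012, 2012, 2013, 2013, 2013],
    "participants": [40, 25, 30, 45, 50, 35],
})

slicer = "East"  # the shared selection, like clicking a slicer button
sliced = df[df["region"] == slicer]

# Two pivot tables that both respond to the same selection.
by_program = sliced.pivot_table(values="participants", index="program", aggfunc="sum")
by_year = sliced.pivot_table(values="participants", index="year", aggfunc="sum")

print(by_program, by_year, sep="\n\n")
```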

 

You can pre-register for these webinars by clicking the links above! 

____________________________________________________________________________________

 

Our eStudy program is made up of longer, more in-depth virtual professional development courses. Below are November's eStudy offerings:

 

035: Focus Group Research: Understanding, Designing and Implementing - Michelle Revels 

Nov. 15 and Nov. 22, 2013

1-2:30 p.m. ET

 

As a qualitative research method, focus groups are an important tool to help researchers understand the motivators and determinants of a given behavior. This course, based on the seminal work of Richard Krueger and David Morgan, provides a practical introduction to focus group research.

 

036: Practical Evaluation: An Overview and Introduction for Those New to Evaluation - Thomas Chapel 

Nov. 19, Dec. 3, and Dec. 10, 2013

noon to 2 p.m. ET (Nov. 19 and Dec. 3); 1-3 p.m. ET (Dec. 10)

 

This course is an introduction to the major steps in designing and conducting a program evaluation. Using CDC's Framework for Program Evaluation, the course will emphasize the central importance of setting the right evaluation focus and asking the right evaluation questions, and will show how engaging stakeholders and developing a strong program description help ensure that the focus and questions are the best ones. The last session will show how those early decisions make it easier to identify the most appropriate data collection methods and sources, choose the right ways to analyze data, and ensure that findings are reported in ways that maximize their use for program improvement. Students will work on a variety of case examples to help reinforce teaching points. 

 

Read more and register.  

Diversity - Halfway Around the World: Evaluation 2013 Silent Auction Sets Fundraising Record for International Travel Awards

From Zachary Grays, AEA Headquarters

 

As stewards of inclusivity, and advocates of broadening the community of professionals working in and influencing evaluation, AEA awards several competitive travel stipends to offset costs for evaluation professionals who might not otherwise attend our annual meeting. With an attendance of more than 3,000 aspiring, seasoned, and legendary evaluators from every corner of the Earth, global perspective and insight played a key role in the experience of this year's attendees. Those who joined us in Washington, D.C., for our annual conference had the chance to experience many exciting highlights. The one that shone brightest, however, was this year's Silent Auction. For the first time in AEA history, you helped raise more than $9,200 to benefit the AEA International Travel Awards.

 

The annual Silent Auction is one of the many high points of the AEA conference. Held on the Friday evening of each year's meeting, it gives attendees the opportunity to peruse international delights donated by their fellow colleagues. Items this year ranged from Dominican rum to handcrafted jewelry from Brazil to one-on-one consultations with Michael Scriven. The highest bidders took home these rarities, and all of the proceeds benefited the AEA International Travel Awards. Each year, AEA offers travel stipends of $2,000 to offset costs for evaluators from underdeveloped and developing countries. Each awardee must first submit an abstract, have it accepted by the program chairs, and present their work during the annual meeting. This year, 40 applicants representing 25 countries vied for the awards. Our awardees were Katharine Tjasink of South Africa, Asad Rahman of Bangladesh, Benedictus Stepantoro of Indonesia, and Basan Shreshtha of Nepal. These awardees presented their respective works during several sessions at the conference.

 

A staple of the annual meeting, the International Travel Awards have made it possible for evaluators from every corner of the world to represent their respective countries and present their work to the "who's who" of evaluation. "It was an incredible experience that I will never forget," said Tjasink, of Farmer Voice Radio, led by Khulisa Management Services in South Africa. "The knowledge that I've gained at the conference will be put into good use."

 

As the International Year of Evaluation peeks over the horizon, now more than ever is the time to fully extend the hand of fellowship to all global platforms. Inclusivity and diversity greatly broaden the spectrum of insights in the discipline of evaluation. As the discipline continues to grow and the faces of evaluation multiply, cultural competence and global consciousness are required of all participants. AEA congratulates and celebrates this year's International Travel Award winners for their hard work and extraordinary presentations. I look forward to seeing you all next year in Denver, where I hope we will not only set another record but also usher in even more international evaluators at the annual meeting.  

Potent Presentations Initiative - You Did It! Now What?
From Stephanie Evergreen, Potent Presentations Initiative Coordinator  

 

Whew! You did it! You crafted a solid message, designed clear slides, and landed your presentation delivery. Time to relax until the next call for papers comes out, right? Almost! Quick, while it's still fresh: 

 

  • Tweak your content based on the feedback you heard. Doing this now, while the session is fresh in your mind, will reduce your burden when you present this information again.  
  • Contact or reply to those who heard, or heard of, your presentation. If you were presenting a paper, you should be prepared to email the completed paper. You may want to ask about their work to see how it might mesh with your own in ways that could be advantageous to you both. 
  • If you didn't do so beforehand, upload your slides and/or handout to AEA's eLibrary. Tag your entry with your TIG and other keywords so people can find it easily.
  • Consider whether your topic would be appropriate for a Coffee Break webinar, keeping in mind that it would have to be condensed to 10 minutes and focused on demonstrating some tool or technique of use to evaluators. Is that you? If so, contact Lenae at lboykin@eval.org.

 

For more inspiration year round, keep an eye on aea365, where we'll be posting the Fab Five Reboots for Tom Chapel and Michael Quinn Patton. We revamped five slides each from their workshop slide decks, articulated the design thinking, and shared the before and after pictures. Here is one of Michael Quinn Patton's original slides:

[original slide image]

Want to see what the rebooted version looks like? Watch the blog on Nov. 9. Catch up on all of our Fab Five Reboots at p2i.eval.org. Navigate to Slides, Fab Five Reboot, then choose a Fab Five name!

Working with Assumptions in International Development Program Evaluation

  

AEA member Apollo M Nkwake is the author of Working with Assumptions in International Development Program Evaluation, published by Springer.

 

From the Publisher's Site:


Regardless of geography or goal, development programs and policies are fueled by a complex network of implicit ideas. Stakeholders may hold assumptions about purposes, outcomes, methodology, and the value of project evaluation and evaluators — which may or may not be shared by the evaluators. Even when all participants share goals, failure to recognize and articulate assumptions can impede clarity and derail progress.  

 

Working with Assumptions in International Development Program Evaluation probes the crucial role of assumptions in planning, and their contribution to driving, global projects involving long-term change. Drawing on his extensive experience in the field, the author offers elegant logic and instructive examples to relate assumptions to the complexities of program design and implementation, particularly in weighing their outcomes. The book emphasizes clarity of purpose, respect among collaborators, and collaboration among team members who might rarely or never meet otherwise. 

From the Author:

 

A major barrier to viable program evaluations is that development programs are based on assumptions that often are not well articulated. In designing these programs, stakeholders often lack clear outlines for how implemented interventions will bring about desired changes. This lack of clarity masks critical risks to program success and makes such programs challenging to evaluate. Resources and methods that have attempted to address this dilemma have been popularized as theory of change or, sometimes, theory-based approaches. These approaches emphasize the sequence of changes and mini-steps that lead to the longer-term program goal, and the connections between program activities and the outcomes that occur at the various stages. Often, however, they do not sufficiently clarify how program managers or evaluators should consider the assumptions inherent in those connections. Assumptions — the glue that holds all the pieces together — remain abstract and too often far from applicable. This book focuses on this important theme by defining, categorizing, and suggesting tools for explicating program assumptions.
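As a loose illustration of that theme, and not an example from the book, the hypothetical Python sketch below attaches explicit assumptions to each causal link in a small theory-of-change chain so they can be reviewed and tested rather than left implicit; all steps and assumptions are invented.

```python
# Hypothetical sketch (not from the book): make the assumptions in a
# theory-of-change chain explicit by attaching them to each causal link.
from dataclasses import dataclass, field

@dataclass
class Link:
    from_step: str
    to_step: str
    assumptions: list[str] = field(default_factory=list)

theory_of_change = [
    Link("Train community health workers",
         "Workers deliver accurate counseling",
         ["Training transfers to day-to-day practice",
          "Trained workers remain in post"]),
    Link("Workers deliver accurate counseling",
         "Caregivers adopt recommended practices",
         ["Caregivers trust the workers",
          "Households can afford the recommended practices"]),
]

# An evaluator can now ask of each link: which assumptions are risky,
# which are testable, and what data would check them?
for link in theory_of_change:
    print(f"{link.from_step} -> {link.to_step}")
    for a in link.assumptions:
        print(f"  assumes: {a}")
```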

 

About the Author:

 

Dr. Apollo M Nkwake is a research associate professor for monitoring and evaluation (M&E) at Tulane University's Disaster Resilience and Leadership Academy. He previously held senior M&E adviser positions at World Vision United States, University Research Co. LLC, and JSI Research and Training Institute. He has field experience with USAID, World Bank, UNICEF, and World Vision programs in Africa, Asia, and Latin America. He holds a Ph.D. in social development from the University of Cape Town, South Africa. 

 

Visit the publisher's site.

New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? Descriptions of recently added positions, and many others, are available in AEA's Online Career Center. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.

Register
Get Involved
About Us
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.

 

The association's mission is to:
  • Improve evaluation practices and methods
  • Increase evaluation use
  • Promote evaluation as a profession and
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only) 
website: www.eval.org