Newsletter: November 2013

Vol 13, Issue 11


Message from the President


Dear colleagues,

 

One of the elements of my presidential theme concerned communicating with others who are working on the same problem area as we are. As an example, I cited the disciplines involved in the successful work over the last few decades to reduce deaths and accidents due to drunken driving. This success came through the joint work of evaluators, substance abuse specialists, public health advocates, transportation experts, citizen advocates such as Mothers Against Drunk Driving (MADD), the judicial system, and psychologists working to change public attitudes and behaviors. I argue that when we're conducting an evaluation, we should communicate more with others working on our same "fish scale," or problem area, to have a greater impact on the problem.

 

In this column, I'd like to bring you up to date on recent activities in this area. First, at AEA, I have begun a task force concerned with crossing disciplinary boundaries. Chaired by Nicole Vicinanza, this task force of nine leaders in evaluation will explore successful precedents for working across disciplinary areas, identify other disciplines we might work with, and suggest means for doing so. Members of this task force, in addition to Nicole, include Tom Chapel, Laura Leviton, Dominica McBride, Lance Potter, Hallie Preskill, Andy Rowe, Tom Schwandt, and myself. 

 

We are just beginning our work and encourage you to contact any of us with your ideas for disciplines or associations we should work with, as well as methods for encouraging work across evaluation-related disciplines. Our goals include linking individuals in evaluation, policy analysis, public administration, organizational development, and like disciplines who are working on a common problem so they can share knowledge, methodologies, and strategies. I think we can teach others much about working with program stakeholders, achieving use among them, and developing logic models and evaluation questions; in turn, we can learn from others about the content area itself, consultative skills, and new methodologies.  

 

Meanwhile, I would like to tell you a bit about our work with other associations. Since our conference, I have participated on panels with the presidents of the Association of Public Policy Analysis and Management (APPAM), the Association for Research on Nonprofit Organizations and Voluntary Associations (ARNOVA), the National Association of Schools of Public Administration and Affairs (NASPAA), and the American Political Science Association (APSA) at the APPAM and ARNOVA conferences. 

 

Our focus has been how our associations, all concerned with improving public policy through our findings, are working to encourage policymakers to use our information. I've found that we are one step ahead of many others in that we work closely with stakeholders to learn their information needs, their views, and their values on credible evidence, all of which helps us facilitate use. However, we are most successful at the program level, with the stakeholders with whom we interact directly. How are we affecting policy in the content areas in which we work?  

 

I am considering using our Thought Leaders Forum to inform AEA members of the work of some in other fields. Another option we may pursue is linking those working on the same "fish scale" through listservs or other Internet mechanisms. But I'd like to solicit your ideas. Are you interested in linking with others exploring different dimensions of the content area you evaluate? With administrators and policymakers working on this area? If so, what might be an effective way of linking you with them? How can we as an association facilitate this cross-disciplinary communication to enhance all of our learning? I welcome your ideas.  

 

Sincerely,

Jody

Jody Fitzpatrick

AEA 2013 President  

In This Issue
Meet Robin L. Miller
2013 Award Winners
Walking the Talk
Face of AEA
Policy Watch
Diversity
eLearning
p2i
Book Profile
EvalPartners
New Job Postings
Register
Get Involved
About Us
Meet Robin L. Miller - Incoming Member at Large

Robin Lin Miller received her Ph.D. in community psychology from New York University, with a minor in quantitative methods. She is currently professor of ecological-community psychology at Michigan State University and vice chair of its social science and education institutional review board. Miller has 25 years of experience evaluating HIV prevention and care programs in community-based and clinical environments. Prior to assuming an academic career, she directed program evaluation services at the Gay Men's Health Crisis in New York City. During that 7.5-year period, she designed and conducted evaluations of diverse prevention programs for gay and bisexual men and of care programs for persons living with HIV.  

 

Since moving to an academic position, she has continued to evaluate HIV prevention and care programs, especially those targeting black gay and bisexual youth. She also studies the long-term use of evidence-based principles and practices in AIDS-related service settings. In pursuing both areas, she has maintained an overarching interest in research on evaluation theory, methods, and practice, and, in particular, how evaluation theories are used. Her most recent evaluations include a prospective meta-evaluation for the U.S. PEPFAR Caribbean Regional Program and an evaluation of the long-term health consequences of ex-offender re-entry assistance services for persons living with HIV. With colleagues from the Adolescent Trials Network, she is evaluating several multisite programs for adolescents, including a project to link HIV-infected youth to care and one to mobilize communities to create structural change. 

 

Miller is an elected fellow of the Society for Community Research and Action and the American Psychological Association. She is a past recipient of the Marcia Guttentag Promising New Evaluator Award and the 2011 recipient of the Robert Ingle Award honoring service to the profession of evaluation. 

 

In her ballot statement, Miller stated: "Board stewardship will be essential to navigating the long, multiphase transition effectively. I want to become a member of the board to help ensure we build a solid relationship with our new AMC, one that facilitates AEA's long-term success and capitalizes on the resources and expertise our new AMC offers. The past leadership roles I have been privileged to assume for AEA provide me with intimate knowledge of the association's most-prized membership services and the diverse practice and scholarship of its members. I believe that my in-depth understanding of AEA's operations, its membership, and the profession, coupled with my leadership experiences from AEA and elsewhere, can aid our journeying to a new era. I am eager to work with other colleagues on the board, the membership, and our new management company to move us forward in enriching new directions."

 

We welcome Robin Miller and thank all who participated in this year's election process!  

AEA Announces 2013 Award Winners

The American Evaluation Association honored six individuals at its 2013 Awards Luncheon in Washington, D.C. This year's recipients, honored in six categories, are involved with cutting-edge evaluation and research initiatives that have impacted citizens around the world. We'll spotlight each award in upcoming issues. Today we extend our congratulations to Dominica McBride!

 

Dominica McBride, Ph.D., Founder/CEO and Evaluation Specialist, Become Inc.   

2013 Marcia Guttentag Promising New Evaluator Award  

McBride's personal mission is to help unify and strengthen communities and be a part of individuals' and families' self-actualization so that communities and people can live and function optimally. As the daughter of a first-generation Haitian mother and artist, and influenced by African-American, Polish-American, and French-Canadian cultures, she has come to respect, appreciate, and learn from multiculturalism. McBride has designed and implemented workshops nationally on topics including cultural competence, wellness, social and emotional intelligence, program evaluation, and logic modeling for audiences including Goodwill Industries International Inc., prevention specialists, lawyers, mental health professionals, government employees, and community members. 

 

McBride is founder and CEO of Become Inc., a nonprofit organization dedicated to manifesting thriving communities and social justice. The organization uses program evaluation, coalition building, and training and education as tools to achieve its mission. McBride has conducted domestic and international program development and evaluation projects with marginalized communities, including rural communities in Tanzania and East Africa, as well as African American, Hispanic and urban Native American communities. Topic areas in these projects have focused on various health issues, including HIV/AIDS, substance abuse, cardiovascular disease, stress/resilience, domestic violence, and general mental and physical health. 

 

McBride has published articles and chapters on culturally responsive evaluation, cultural competence, prevention of risky behaviors in youth, prevention and human rights, HIV prevention in youth, cultural considerations in suicide-homicide, and cultural representations of Africa. She also has provided clinical psychotherapy services to individuals, couples, families, and groups focused on psychological well-being, including — but not limited to — life skills, parenting skills, and recovery. She has her Ph.D. in counseling psychology with a specialization in consultation from Arizona State University.

 

"I come from a multicultural family with a multiethnic background and a focus on global consciousness," McBride said. "It is from this soil that my professional life sprouted and has grown into a love of culture, discovery and responsiveness, and receiving this award is a manifestation of AEA's growing appreciation and focus on culture and its impact on both evaluation and life as a whole."

 

Visit AEA's awards page
AEA Values Rewind - Walking the Talk with Beverly Parsons

Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.  

 

AEA's Values Statement

The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.

 

i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.

ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.

iii. We value a global and international evaluation community and understanding of evaluation practices.

iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.

v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.

vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.

 

 


 

Editor's Note: In September 2012, AEA interviewed Beverly Parsons for this section of the newsletter. Parsons is AEA's president for 2014; as such, we are re-running her feature. 

 

I'm Beverly Parsons, executive director of InSites, a nonprofit research, evaluation, and planning organization. Thanks to many of you, I've had the privilege of serving on the AEA Board of Directors from 2009-2011 and was recently elected to serve as 2014 president. 

 

Recently, as I walked into a hotel ballroom, I saw the familiar round banquet tables and an unfamiliar object — at each place setting, a kaleidoscope stood at attention like a toy soldier. Guests aimed their kaleidoscopes at the chandelier. Oohing and aahing at the changing colors and designs, they noticed key words about Strengthening Families tumbling among the colored crystals.

 

What words from the AEA Values Statement would I put in a kaleidoscope? Enhancement of public good would show up on each turn. Inclusiveness and diversity would be there, too. Viewers would also see high quality, ethical, culturally responsive, global, international, efficient, effective, transparent, and socially responsible. As the AEA values connected and reconnected with the crystals in the kaleidoscope, concepts that sometimes seem disjointed and scattered would begin to form patterns. I would be searching for values linked to three themes: environmental sustainability, social justice, and economic well-being.

 

In my evaluation work, these themes are becoming the definers of enhancement of public good. In the business world and elsewhere, they are known as the Triple Bottom Line (TBL). I began to focus on the Triple Bottom Line through my participation in the Bainbridge Graduate Institute certification program in Sustainable Business. This past year, the Triple Bottom Line has inspired my evaluation work and my role on the AEA Board.

 

The Triple Bottom Line encourages organizations to be fully responsible for their actions by establishing measures of their financial, social, and environmental performance. Through rethinking their work, they can account for the systemic impact of their work on the economic, social, and environmental well-being of the communities and populations they serve. I apply systems-oriented evaluation practices and AEA values to help organizations see their opportunities to promote the public good. As O.W. Holmes said, "Every now and then a [person's] mind is stretched by a new idea or sensation, and never shrinks back to its former dimensions." Linking our values, TBL, and evaluation has been such a stretch for me.

 

This past year, I chaired an exploratory AEA Board task force (with board members Thomas, Yates, and Cooksy) about our attention to environmental sustainability. To learn more, see the slides I posted in the elibrary from our session at AEA2011. 

Face of AEA - Meet Ginger Fitzhugh

AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights Ginger Fitzhugh.

 

Name: Ginger Fitzhugh

Affiliation: Evaluation & Research Associates

Degrees: B.A., Psychology; M.M. (Master of Management)

Years in the Evaluation Field: 10

Joined AEA: 2004

  

Why do you belong to AEA?

 

I belong to AEA because it is a vibrant, welcoming community committed to members' learning and growth. AEA offers many high-quality professional development resources, most of which are free or included within the cost of membership (including its journals, online resource library, Coffee Breaks, AEA365 blog, and Topical Interest Groups). I also appreciate that AEA generally "walks the talk," and uses the principles it espouses. I am looking forward to my new role as co-chair of the Systems in Evaluation TIG, where I expect to develop connections with more amazing colleagues, continue to learn, and give back to the organization I consider to be my professional home.  

 

Why do you choose to work in the field of evaluation? 

 

I stumbled on evaluation after working in the fields of research, social work, administration, and organizational development. For me, evaluation is a perfect blend of these disciplines. Evaluation is also a way to live the mission of my alma mater (The Heller School for Social Policy and Management at Brandeis University): "knowledge advancing social justice." I strongly believe in the importance of asking questions and helping people use what they learned to improve the world. 

 

What's the most memorable or meaningful evaluation that you have been a part of?

 

The very first evaluation that I took part in was a capstone project for my master's degree. Three classmates and I evaluated the beginnings of a new partnership between a large social service agency and a large school system aimed at increasing family engagement in children's education from birth through grade 12. It was a bit of a trial by fire, but a wonderful experience. I learned the importance (and challenge) of understanding the evaluation's and the project's (sometimes shifting) goals, the value of collaboration, the role of public policy in education reform, and the challenges associated with changing complex systems. I also learned how to relate evaluation theory to practice, as well as the practical limits of theory. We had a wise and wonderful mentor (Susan Lanspery), who later became a colleague. I became hooked on evaluation, and was fortunate to get my first job doing evaluation after I graduated. Since then, I've been involved in dozens of other evaluations in a wide variety of fields. 

  

What advice would you give to those new to the field?

 

Take advantage of the many formal and informal professional development opportunities that AEA offers. Develop your skills in some of the emerging trends in evaluation, such as data visualization. Find at least one mentor. Conduct informational interviews with other evaluators, asking them about their career path(s), their current work, and what their philosophy of evaluation is. I find that evaluators are very generous people.  

Policy Watch - Evaluation Policy Implementation Progress

From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)

 

Recent reports offer lessons about evaluation policy development and implementation in different arenas. At the federal level, the Government Performance and Results Modernization Act of 2010 (GPRAMA) updated how federal agencies assess their performance, requiring agencies to designate chief operating officers (COOs) and performance improvement officers (PIOs). The Partnership for Public Service, in conjunction with Grant Thornton, examines the subsequent changes in "Taking Measure: Moving From Process to Practice in Performance Management" (September 2013).  

 

Based on feedback from agency leaders, the report indicates that enthusiasm for performance assessment has risen and quarterly discussions are happening, but data are not being used at all levels for decision making, leaving budgets and performance disconnected. Progress toward a performance culture varies widely across agencies, with relatively little emphasis on evaluation, which can provide a more in-depth understanding of program performance than performance metrics alone. Outreach and communication with Congress on performance progress are nearly nonexistent. OPM has identified the skills needed for performance evaluation and measurement, which most agencies lack, particularly among program staff. The report recommends better coordination between those who lead evaluations and those who set goals and are responsible for performance measurement, as COOs and PIOs are not well versed in evaluation. 

 

At the state level, the Government Finance Officers Association recommended that states develop and use performance measures and measure program outcomes, efficiency, and effectiveness. These should be used for continued improvement as an important component of long-term strategic planning and decision making, which, in turn, should be linked to governmental budgeting ("Eye on the Prize: States Looking at Goals, Outcomes for Budget Decisions," Council of State Governments). 

 

The key concept is that programs that can prove their effectiveness with data will be more likely to obtain the funds they seek. Most states claim to use outcome-based budget decision making, and a number of cities have also developed such systems. Every state measures outcomes in at least some agencies, but fewer have a statewide strategic plan to measure outcomes across all agencies. Even fewer relate those outcomes to the budget. For example, both Maryland and Baltimore have developed such systems ("Baltimore's Outcome Budgeting Approach", John Kamensky). Although Baltimore considers evidence when selecting programs for funding, it does not yet have an evaluation system in place to assess the performance of those programs once in operation. 

 

Internationally, the International Organisation for Cooperation in Evaluation, et al., recently released a report, Voluntary Organizations for Professional Evaluation (VOPEs): Learning from Africa, Americas, Asia, Australasia, Europe, and Middle East (Volume 2 of "Evaluation and Civil Society: Stakeholder Perspectives on National Evaluation Capacity Development"). The report provides information on the approximately 100 national evaluation associations worldwide, with a total membership of about 34,000 evaluators, of whom AEA members comprise 23 percent. Of particular interest is information on the role VOPEs have played in national evaluation capacity development. Many examples show VOPEs having significant influence on their governments during the formulation of high-level evaluation-related policies and as national and provincial systems were established and implemented.  

 

Please share observations about evaluation policy setting in your federal agency, state, or city (or country) at the EPTF Discussion List or [email protected].

Diversity - MSI, It Gives You Wings: A Spotlight on Dr. JoAnn Yuen

From Zachary Grays, AEA Headquarters

 

The Minority Serving Institution Fellowship represents one of the many pillars of diversity here at AEA. This year, AEA was proud to grant fellowship to Edilberto Raynes, Denise Gaither-Hardy, Tamara Bertrand, Ana Pinilla, and Andrea Guajardo. Our newest fellows joined us for their first meetings of the cohort during Evaluation 2013. As our participants buckle in for their academic year at their respective institutions, they are also setting out on a year-long journey of professional development and academic enlightenment with many other aspirations in tow. I took a moment to chat with Dr. JoAnn Yuen, a former MSI fellow, to talk about her journey as a fellow, what inspired her to participate, the opportunities the fellowship presented, and why diversity is important to her and should be to the members of AEA. 

 

Dr. Yuen, a native Hawaiian, greeted me with a warm "aloha" upon my initial contact with her. An associate professor at the University of Hawaii and associate director for the Center for Disability Studies (CDS), Dr. Yuen was encouraged by her mentor and colleague, Morris Lai, to become a part of AEA and the MSI cohort. Dr. Yuen participated in the second MSI cohort during the 2006-2007 academic year. She wanted to join an organization with strong local affiliations in Hawaii while also extending her professional reach and visibility. "As a qualitative researcher, many of the practitioners that shaped my theoretical understanding of the methodology were established and active members of AEA. It was exciting to rub elbows and listen to these individuals in person," she said. 

 

Though Dr. Yuen (pictured, right) had many opportunities and experiences to share about being a MSI fellow, she highlighted the opportunity to work with Quality Education for Minorities (QEM) during her cohort. "QEM executive director, Dr. Shirley McBay, was a wonderful mentor and has engaged me over the years to collaborate with QEM on grants," she said. "QEM invited me to a training with the NSF program officer. He was in charge of an RFP involving individuals with disabilities and STEM in postsecondary education. Amazing opportunity, though the grant was not funded by NSF. What we produced formed the catalyst for two successful grant submissions."

 

Dr. Yuen painted an absolutely inspiring portrait of what the MSI Fellowship has done for her and, of course, why the program is important to her, AEA, and why it should be to you as a fellow evaluator. "In the nation, at my institution, and in the field of disability studies, there are concerted efforts to be more diverse, and being native Hawaiian, I have a responsibility to promote this effort," she said. "I have a responsibility to model to other native Hawaiians what it means to be a leader, researcher, a professional who can thrive in academe, and be considered a valued community member. I have a responsibility to mentor, as I have been, to students and faculty." 

 

Her unbridled passion roars in this empowered statement on diversity that drives home the purpose of this fellowship, providing evaluation resources to those who don't have direct access to the materials at their home institutions. Not surprisingly, participants take this opportunity to provide a voice to the unheard masses in a cross-cultural platform and through the research they conduct. "I feel lucky and grateful to have been included and supported. There continue to be pressures and major efforts (revisions to Sections 501 and 503 of the Rehabilitation Act) throughout the federal government to create diversity at all levels of institutions (University of Hawaii's efforts to increase the employment of Native Hawaiians) and organizations. This is a timeline of my participation. AEA has been ahead of the curve and should continue its commitment to and support of diversity," she said.

 

Highly regarded and successful prior to participation, Dr. Yuen utilized this opportunity not only to develop herself professionally, but also to provide diverse representation in the evaluation community on her professional projects and products. Today, Dr. Yuen is an advocate for center-wide issues impacting the Center for Disability Studies and CoE and has recently completed a college-wide evaluation of the impact of the Hawaii Department of Education's (HDoE) Data Governance Policies (DGO) on student and faculty research. 

 

"A change in policies within the DGO has created major delays, on average 10 months, in research approvals," she said. "Students across the university are changing research topics and adding one year to their education. Research faculty are in jeopardy of losing extramural dollars because they can't access the data generated through classroom research. Schools will lose thousands of dollars provided by research-related contracts." 

 

Dr. Yuen will present this report to the dean and share it with the Hawaii DoE in hopes of improving the process. Busier than ever, Dr. Yuen has clearly not lost the pluck she showed before and during her cohort. It is truly a pleasure to call Dr. Yuen an MSI graduate. After all, as she told me upon outreach, "[MSI] gave me wings." 

 

Learn more about the MSI Fellowship

eLearning Update - Discover Upcoming Coffee Break Demonstrations and eStudy Courses

From Alexa Schlosser, AEA Headquarters 

 

Our Coffee Break Webinars are short, 20-minute presentations of tools or tips we think evaluators will find helpful in their work lives. As November's webinars come to a close, let's take a look at what's in the pipeline for December:

 

CBD165: Practical Strategies for Managing Counterproductive Behavior in Small Groups - Robert Kahle
Thursday, Dec. 5, 2013

2-2:20 p.m. ET

 

Bob Kahle, author of "Dominators, Cynics and Wallflowers: Practical Strategies for Moderating Meaningful Focus Groups" (Paramount, 2007), will describe methods of managing dominant and cynical behavior in focus groups, planning sessions, and meetings of all types. He will also present tips on how to quickly recognize and prevent counterproductive behavior, along with a conceptual tool, the Continuum of Correction, for gently guiding small groups back to productivity and inclusiveness. 

 

CBD166: Pocket Chart Voting - Engaging vulnerable voices in program evaluation - Kerry Zaleski

Thursday, Dec. 12, 2013

2-2:20 p.m. ET

 

Many people gathering data for program design or evaluation find that their toolkit is insufficient for collecting accurate, reliable, and relevant data that give voice to persons living in situations of vulnerability, powerlessness, or marginalization. This Coffee Break session will introduce a participatory method known as "pocket chart voting," which can be used for a variety of purposes, including needs assessment, program planning, monitoring, and impact evaluation. The demonstration will show an effective process for confidentially engaging vulnerable voices and affirming self-empowerment throughout a decision-making process. The tool is adaptable to specific contexts and is a good example of a simple, locally resourced data collection method.

 

CBD167: Gender-Based Violence in Evaluation - Tessie Catsambas 
Thursday, Dec. 19, 2013

2-2:20 p.m. ET 

 

Recent attention to gender-based violence, a gender issue and a human rights violation, has shown it to be so prevalent that, in some countries, an evaluator may come upon it more than 25 percent of the time. Unprepared, the unsuspecting evaluator may misread the symptoms, draw irrelevant conclusions, and, in fact, contribute to more violence even while trying to help. This session will highlight methods and tools that an evaluator can use to identify gender violence when it is present, manage an evaluation touched by gender violence, and behave in a responsible way that protects victims. 

 

You can pre-register for these webinars by clicking the links above! 

____________________________________________________________________________________

 

Our eStudy program is made up of longer, more in-depth virtual professional development courses. Below are December's eStudy offerings:

 

eStudy 037: Essentials of Utilization-Focused Evaluation - Michael Quinn Patton

Dec. 9, Dec. 11, Dec. 13, and Dec. 16

2-3:30 p.m. ET

 

The evaluation standards call for evaluations to be useful, practical, accurate, ethical, and accountable. Utilization-Focused Evaluation (UFE) is a process that meets these criteria by promoting evaluation use from beginning to end with a focus on intended uses by intended users, and encouraging situational responsiveness and adaptability. In a series of four sessions, Michael Quinn Patton will present the basics, controversies, and cutting edge issues in conducting Utilization-Focused Evaluations. This eStudy workshop will be based on his latest book, "Essentials of Utilization-Focused Evaluation." 

 

Read more and register

Potent Presentations Initiative - Nominate the Best AEA 2013 Slideshows
From Stephanie Evergreen, Potent Presentations Initiative Coordinator  
Potent Presentations

 

Each year, it seems the presentation quality at the conference gets better and better. We've heard that some of you had practiced with our Presentation Assessment Rubric. Awesome! Some of you told me you designed your slides according to our Slide Design Guidelines. It shows! 

 

What well-designed slideshows did you notice at the conference? Who did a great job presenting with their slides as a supporting technology? Nominate your favorite by sending me a note at [email protected]. In next month's column, I'll share the exemplars so they can be our sources of inspiration.   

In a similar vein, the AEA Poster Competition produced two winners this year! On the left is Christina Oelnik's winning entry; on the right is Ann Martin's winning entry.

 

        

 

Congrats to both winners! Head to the p2i site to read a competition judge's short discussion of these two posters and what made them stand out from the crowd. If these have you itching to start work on your own poster, review our revised Poster Guidelines.

Presenting Data Effectively: Communicating Your Findings for Maximum Impact

AEA member Stephanie Evergreen is the author of Presenting Data Effectively: Communicating Your Findings for Maximum Impact, published by SAGE Publications.

 

From the Publisher's Site:


This is a step-by-step guide to making the research results presented in reports, slideshows, posters, and data visualizations more interesting. Written in an easy, accessible manner, "Presenting Data Effectively" provides guiding principles for designing data presentations so that they are more likely to be heard, remembered, and used. The guidance in the book stems from the author's extensive study of research reporting, a solid review of the literature in graphic design and related fields, and the input of a panel of graphic design experts. Those concepts are then translated into language relevant to students, researchers, evaluators, and nonprofit workers - anyone who must report on data to an outside audience. The book guides the reader through design choices related to four primary areas: graphics, type, color, and arrangement. As a result, readers can present data more effectively, with the clarity and professionalism that best represent their work.  

 

From the Author:

 

I wrote this book for you. You, who know the data very well. You, who must communicate your findings about those data to an audience that wants the bottom line. You, who know evaluation well, but not graphic design. I wrote this book because evaluation is awesome, and too much of what we do doesn't get the attention it deserves simply because of the way we have packaged it. So I used the opportunity of a dissertation to research best practices in graphic design and how they can apply to evaluation communication. My aim was to make this book a manual of easily implementable steps that you can take, without new software or programming skills, to really boost how well people can read, interpret, and remember your evaluation work, whether you are making a report, a slideshow, a poster, or a standalone data visualization.

 

About the Author:

 

Stephanie Evergreen is an evaluator who also writes, trains, and consults on presenting data effectively. She recently coedited two volumes of "New Directions for Evaluation" on data visualization (Fall and Winter 2013). Evergreen is the founder of AEA's Data Visualization and Reporting TIG. She writes a popular blog at StephanieEvergreen.com/blog.

 

Visit the publisher's site.

EvalPartners Promotes Inclusive Definition of "National Evaluation Capacity," Launches EvalYear

The EvalPartners Initiative continues to make progress toward its purpose of building equity-focused and gender-responsive evaluation capacity in civil society globally. Under EvalPartners, AEA participated in the third National Evaluation Conference (NEC2013), held Sept. 30 through Oct. 2 in São Paulo, Brazil, and sponsored by the UNDP Evaluation Office and SAGI, the Secretariat of Evaluation and Information Management of Brazil's Ministry of Social Development and Fight Against Hunger. The theme this year was Solutions to Challenges Related to Independence, Credibility and Use of Evaluation. From AEA, the following people participated: President Jody Fitzpatrick, President-Elect Beverly Parsons, Stephanie Shipman (who presented in a well-attended panel on evaluation capacity in the U.S. government), and Tessie Catsambas, AEA representative to the IOCE Board. 

 

The first and second NEC conferences, which focused on building national evaluation systems (organized by UNDP), involved mostly executive-branch government participants. This third conference marked an important gain for EvalPartners. For the first time, the concept of "national evaluation systems" expanded to include parliamentarians (the legislative branch) and civil society, in the form of professional evaluation associations and societies. This is a major step forward in promoting demand for evaluation by governments and parliaments. At the end of NEC 2013, UNDP Evaluation Office Director Indran Naidoo committed to including EvalPartners in all future NEC conferences.  

 

A highlight of the conference came when Asela Kalugampitiya of the Sri Lanka Evaluation Association and Sri Lankan Member of Parliament Kabir Hashim asked participants to stand and declare 2015 EvalYear, the International Year of Evaluation! Read more

 

During the 2013 AEA Conference, in four sessions dedicated to different aspects of EvalPartners and EvalYear, participants contributed great ideas on how AEA and AEA affiliates might get involved in EvalYear. These ideas are being submitted to AEA's International Working Group, chaired by Hubert Paulmer. Stay tuned for ways to get involved! 

New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? The following positions have been added recently: 

Descriptions of each of these positions, and many others, are available in AEA's Online Career Center. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new postings via email or RSS feed.

Register
Get Involved
About Us
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.

 

The association's mission is to:
  • Improve evaluation practices and methods.
  • Increase evaluation use.
  • Promote evaluation as a profession.
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only) 
website: www.eval.org