|Newsletter: April 2012||Vol 12, Issue 4|
International Input & Post-PAT Teams
I had the opportunity recently to attend the American Educational Research Association (AERA) conference in Vancouver, Canada. Not only was the conference intellectually stimulating (and I attended a session or two that I believe would be appropriate for us at Evaluation 2012 in Minneapolis, 22-28 October 2012), but the international setting also offered possibilities for thinking about how we welcome international colleagues at our evaluation conference. For instance, AERA featured sessions co-sponsored by other educational research associations throughout the world and facilitated opportunities (networking sessions, closed meetings, and receptions) for colleagues from other countries to plan and deliberate on ways to work together for mutual benefit. How can we better welcome our colleagues who come from international settings? And more generally, how might we continue to listen to our international colleagues in ways that support mutually beneficial agendas? This year, we will be featuring the work of our International Listening Project in Presidential Strand sessions intended to advance this work of the Association.
Priority Area Teams (PATs) dissolution and next steps. It is with great joy that I share with you the post-PATs structure, following a unanimous decision earlier this month to accept the concept. Below is a description of the Review and Working Groups that will be phased into the internal organization of AEA, effective immediately:
- Finance Advisors to the Board. Serves as advisory to the Board on financial matters.
- Awards Working Group. Reviews nominations and selects winners for Association awards.
- Elections Working Group. Enacts the elections work of the Association.
- Diversity Working Group. Serves to help steer, as well as report on, the diversity work of the Association.
- Ethics Working Group. Serves to help steer, as well as report on, the Association's attention to ethics issues.
- Independent Values Review Group. Conducts review of the work of the Board for adherence to and fulfillment of the Association's core values (to be constituted at a minimum of every five years beginning in 2013).
While these groups may look similar, we have distinguished the Working Groups from the Advisors and Review Groups. Working Groups will report to the Executive Director, who will report annually on the work of each group to the Board.
In carrying out this transitional structure, we are aware that some efforts (i.e., international, public affairs/engagement, and knowledge and professional support) are no longer represented. However, additional discussions are underway, or will follow, to determine the best structures moving forward. For instance, the International Listening Project Synthesis Task Force seeks to learn how AEA can engage in ways that are mutually beneficial to the association and entities internationally. The short-term goal of the ILP Synthesis Task Force is to identify a set of activities that AEA can undertake with partners in the next few years in pursuit of AEA's commitment to providing international service and programs. Additionally, a Professional Development Work Group under the auspices of the Executive Director seeks to identify priority next steps regarding professional development for evaluators and how AEA might contribute to them. These and other efforts will help shape the direction of the Association in the ensuing months and years.
Let me add that this collective work could not have been done without the able support, wisdom, and dedication of our PATs leadership, the PATs Synthesis Task Force, and previous Board members and leadership of the Association over the last two years. My sincere thanks to all who contributed!
Thank you for the opportunity to serve. Enjoy your month ahead!
AEA President 2012
|Policy Watch - Federal Policy & Implications for Non-Federal Organizations|
From Patrick Grasso, Evaluation Policy Task Force Chair
Over the past few years, AEA's Evaluation Policy Task Force has been working with congressional committees, the White House, and federal agencies to promote sound evaluation policies in the federal government. These activities have ranged from informing the Office of Management and Budget's guidance on evaluation for all federal programs to helping draft language for the reauthorization of the President's Emergency Plan for AIDS Relief (PEPFAR) program in Congress. As Chair of the EPTF, I would like to share a few observations derived from both task force work and other examples.
While EPTF activities and their results are important for evaluation on their own merits, one often-overlooked aspect of this work is that it helps to leverage stronger evaluation policies beyond the federal government itself. This leveraging effect means that the reach of AEA's efforts to promote sound evaluation policy extends much further than the immediate issues on which it has been engaged.
There are at least three ways in which this happens. First, federal evaluation policies often are translated into mandates for entities receiving federal funding, including state and local government programs and non-governmental service providers. For example, local and regional mass transit agencies that get funds under the Federal Transit Administration's New Starts program are required to conduct before-and-after evaluations covering changes in service levels, ridership patterns, and project costs. To allow aggregation for reporting to Congress, these reports must follow specific federal guidelines.
Second, federal evaluation policies frequently encourage changes in state and local policies, even when they are not mandates. For example, the Department of Education's Race to the Top competitive grant program included a requirement for evaluation of the performance of individual teachers and principals. While this requirement did not affect funding under the major federal aid-to-education programs, it encouraged some states to adopt evaluation systems tailored to this incentive, often going well beyond the specific requirements of that program.
Third, federal evaluation policies also reach outside the United States and affect international programs. For example, EPTF's contribution helped to set the language on evaluation policy for PEPFAR. That language, in turn, informs not only evaluations of the PEPFAR programs in the various recipient countries, but also efforts to build evaluation capacity at local, regional and national levels, achieving a longer-term effect that is not limited to the program itself.
So, federal evaluation policies have a wide influence across the development landscape. And to the extent AEA helps to shape those policies to reflect current thinking and practice, our profession also has influence on how evaluation gets done in many non-federal contexts.
Go to the EPTF Webpage
|AEA's Values - Walking the Talk with Maurice Samuels|
Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.
AEA's Values Statement
The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.
i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.
ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.
iii. We value a global and international evaluation community and understanding of evaluation practices.
iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.
v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.
vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.
See AEA's Mission, Vision, Values
My name is Maurice Samuels and I conduct Science, Technology, Engineering, and Mathematics (STEM) educational evaluations as a research and evaluation associate at the Center for Elementary Mathematics and Science Education at the University of Chicago. My introduction to AEA was as an inaugural member of the Graduate Education Diversity Internship (GEDI) Program in 2004. Since then, I have served as co-chair of the mentoring committee in the Multi-ethnic Issues in Evaluation (MIE) Topical Interest Group and have attended AEA retreats. Currently, I serve as co-chair of the MIE Topical Interest Group.
My service and commitment to AEA are a result of the organization's commitment to diversifying the evaluation profession. In particular, AEA has dedicated the human and financial resources needed to make and sustain meaningful change within the evaluation field. The organization's commitment to diversity has helped to infuse the field with new insights, ideas, and knowledge about how programs are designed, implemented, and experienced by underrepresented individuals and groups. For me, it is AEA's commitment to inclusiveness and to diversifying the field of evaluation that serves as the continuing catalyst for my service to the organization.
When I reflect on AEA's values statement as it applies to my practice, I consider how I can include the voices of underrepresented groups in the evaluation process. For me, the inclusion of these voices is important because my evaluation practice takes place within a large urban K-12 school district where the voices of certain stakeholder groups often go unheard, particularly in decision-making processes, program improvements, and policy formulation. The statement also makes me aware of my own cultural values and how they converge with or diverge from those of stakeholders, which, in turn, shapes my relationships with stakeholders.
As an evaluator, I recognize that programs are located in institutions characterized by a particular culture. As such, there are different cultural practices at the individual, group, and institutional levels that can create tensions and present barriers to understanding and communication among stakeholders. Given this reality, the values statement encourages me to continuously seek out ways to share findings with stakeholders in an effective yet humane manner, so that barriers can be reduced while an ethically defensible evaluation is conducted.
Finally, as an evaluator from an underrepresented group, when I reflect on AEA's values statement, it encourages me to rethink ways in which I can assist AEA in its continuing efforts to attract and build the evaluation capacity of evaluation professionals from underrepresented groups.
I am truly thankful to be a member of AEA, and find it a great honor to serve the organization.
|Face of AEA - Meet Marco Segone, UNICEF |
|AEA's more than 7,000 members worldwide represent a range of backgrounds, specialties and interest areas. Join us as we profile a different member each month via a short Question and Answer exchange. This month's profile spotlights Marco Segone, who works within the UNICEF Evaluation Office and is a published author. |
Name, Affiliation: Marco Segone, Senior Specialist, Systemic Strengthening, UNICEF Evaluation Office
Degrees: Master's in Political Science (Milan University for International Political Science)
Years in the Evaluation Field: 16 years
Joined AEA: 1997
AEA Leadership Includes: International and Cross-Cultural Evaluation (ICCE) TIG
Why do you belong to AEA?
"I have always been impressed by the diversity of AEA members, the openness of AEA values and principles, and the keen interest to contribute to the evaluation profession. Frankly speaking, my engagement with AEA significantly contributed to my professional development."
Why do you choose to work in the field of evaluation?
"I always saw evaluation as an important element of good governance. Evaluation potentially has a tremendous power to influence the likelihood of public policies being designed and implemented based on evidence. Evaluation can be a strategic tool for Governments - and Civil Society - to learn what works and what does not work in different contexts. Evaluation can maximize the results of interventions. Through a strategic and relevant evaluation, you can enormously contribute to improve the lives of children and women, by influencing public budget allocation and expenditures, and improving the quality of public services, especially for disadvantaged and marginalized populations."
What's the most memorable or meaningful evaluation that you have been a part of - and why?
"Actually, it's not an evaluation, but the engagement in contributing to develop and strengthen evaluation associations in Niger (ReNSE), Brazil (Rebrama), Latin America and the Caribbean (ReLAC), Eastern Europe and former Soviet Union (IPEN). The most promising initiative I have been involved in so far is Evalpartners, an international partnership led by IOCE and UNICEF to strengthen the capacity of evaluation associations all over the world to contribute to the professionalization of evaluation and to strengthen the evaluation culture. Several evaluation associations (including the African Evaluation Association - AfrEA; the European Evaluation Society - EES; the Latin America M&E Network - ReLAC; the Middle East Evaluation Network - EvalMENA; The Canadian Evaluation Society; The Australasian Evaluation Society), UN agencies and other relevant stakeholders already endorsed it. I hope AEA will also join in the near future."
What advice would you give to those new to the field?
"Be always eager to learn from different perspectives and to keep abreast of the latest developments. To do so, you can access, free of charge, selected books published by UNICEF and partners, and authored by strong evaluators from all over the world. You can also engage live with those authors in the series of live webinars organized by MyM&E. But, most importantly, practice evaluation with respect and awareness of the potential impact evaluation can have - directly and indirectly - in the life of people. Especially to the most disadvantaged and marginalized. That's why equity-focused evaluations are so important."
The opinions expressed are the personal thinking of the author and do not necessarily reflect the policies or views of UNICEF.
If you know someone who represents The Face of AEA, send recommendations to AEA's Communications Director, Gwen Newman, at email@example.com.
|Evaluation 2012 Update - Potent Presentations|
At this year's annual conference, we're betting you'll be more excited than usual. Why? We're launching a new initiative - Potent Presentations - to help our members improve presentation quality. Keep an eye out, because we'll be holding free professional development trainings before and during the conference on how to prepare, develop, and deliver awesome presentations that will better engage your audience and make your content stick. What do you think makes a presentation potent? Leave your comments over at our aea365 post on the topic: http://aea365.org/blog/?p=6116
|eLearning Update - Coffee Break Webinars At Your Convenience|
From Stephanie Evergreen, AEA's eLearning Initiatives Director
As an AEA member, you probably know that you can attend our free, 20-minute Coffee Break Demonstration webinars. But did you know that the Coffee Breaks are also recorded and easily accessible to members, whenever you'd like and at your convenience? Check out the Webinars eLibrary. Most archived webinars are accompanied by downloadable slides or handouts. These five webinars have been viewed most often so far:
DoView Software 2.0 presented by Paul Duignan in January 2010
In this demonstration, Paul gave a tour of the DoView logic model building software program.
Using the Fantastic Five Checklist to Write Better Survey Questions and Improve Survey Reliability presented by Amy Germuth in January 2010
Watch this recording for Amy's quick primer on increasing reliability and validity.
Moving Beyond Bullets: Making Presentation Slides Compelling presented by John Nash in March 2010
During this webinar, John detailed how to break free from standard slide layouts.
Tools and Tips for Teaching Evaluation Concepts to Non-evaluators presented by Ellen Taylor-Powell in June 2010
Looking for ways to talk about our work to others? You have to review this webinar from Ellen!
Easy as Prezi: A Powerpoint Alternative presented by Lyn Paleo in August 2011
If you've been curious about this hot alternative to PowerPoint, spend just 20 minutes getting a tour from Lyn.
Look at the schedule ahead for exciting webinar opportunities, including a series cosponsored with Christian Relief Services and the American Red Cross.
For longer online professional development, check out our eStudy lineup. In May, Tina Christie will present on Evaluation Theories and Michael Quinn Patton will talk about Developmental Evaluation for beginners. In June, Gail Barrington will guide us through Getting Started as a Consultant and Michael Quinn Patton will return to discuss Developmental Evaluation for intermediates.
|Diversity - What's Your Cultural Competence Story?|
I'm Karen Anderson, AEA's new Diversity Coordinator Intern, and in this role I support AEA's diversity programs, TIGs, and the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group.
Power. Humility. Society. Voice. These terms echo throughout conversations with members of AEA's Public Statement on Cultural Competence in Evaluation Dissemination Working Group. Following six years of work by the Cultural Competence in Evaluation Task Force, members in April 2011 voted to approve the AEA Public Statement on Cultural Competence in Evaluation. Now, it is a priority to translate the Statement from paper to practice.
Part of my work involves supporting the Working Group. Towards that end, I am speaking briefly with each group member to learn about the variety of ways in which cultural competence has been introduced, integrated, and nurtured in evaluation practice. I wanted to share with you some of what I've learned.
A graduate education in areas such as clinical or community psychology is noted as a foundation by some Working Group members. Cindy Crusto, chair of the Working Group, cites mentors such as Hazel Symonette, as well as the numerous program evaluations she has conducted in different contexts, as key components of her cultural competence development. She is now imparting her knowledge and expertise in culturally competent evaluation to pre-doctoral psychology and doctoral fellows at the Yale School of Medicine through community-based evaluation projects.
Working Group member Dominica McBride's academic advisor, Stafford Hood, is another pioneer in the study of culture and context in evaluation. Now, according to McBride, her "pie in the sky" dream for the Statement on Cultural Competence is for evaluators and stakeholders of evaluation "to incorporate the Statement to be reflective, respective, and proactive," noting the Statement's applicability to "affect how we interact with and treat people, being aware of biases and starting there to get to a broader goal, looking at social inequity and how evaluators can proactively contribute to social equality." Dominica is currently putting her cultural competence tools to work through her nonprofit organization, The Help Institute.
Academic program training, advisors, and mentors are components of the development of cultural competence in evaluation for some of the Working Group members. Though there is no instruction manual for developing cultural competence in evaluation, AEA does have a great resource that serves as a starting point. This month marks the one-year anniversary of member approval of the AEA Public Statement on Cultural Competence in Evaluation. If you haven't already, I invite you to take a moment to read through the Statement and reflect upon your own practice and development of cultural competence.
If you would like to share your story or comments, please email me at Karen@eval.org.
|Marc Braverman Named AJE Associate Editor for Book Review Section|
Please join us in welcoming Marc Braverman, a professor at Oregon State University and a long time member of AEA, as the new associate editor for the Book Review Section of the American Journal of Evaluation (AJE).
"I am excited and honored to be joining the superb team at the nation's leading evaluation journal," says Marc, who can be reached directly at Marc.Braverman@oregonstate.edu. "AJE is a focal point for the evaluation community that promotes dialogue about all aspects of our field: how we advance knowledge and theory, how we practice our discipline, how we expand our skills, and how we promote change in communities."
Marc, currently a professor in the School of Social and Behavioral Health Sciences and an Extension Specialist in the Family and Community Health program at OSU, also served as an Extension Specialist at the University of California, Davis (1983-2005) and as an evaluator at the Northwest Regional Educational Laboratory (now Education Northwest) in Portland, Oregon. At UC Davis, he was the founding director of the 4-H Center for Youth Development (established in 1994) and the Tobacco Control Evaluation Center (established in 2004 with funding from the California Department of Health Services).
"Marc brings not only an extensive network of contacts that will prove helpful in securing reviews but his experience and research in adolescent health, health promotion theory and health interventions, program evaluation design and analysis, and design and delivery of community programs will be invaluable to our editorial team," said Tom Schwandt, editor of AJE.
Marc co-edited the book Foundations and evaluation: Contexts and practices for effective philanthropy (Jossey-Bass, 2004), and has edited or co-edited three volumes of New Directions for Evaluation. He has a PhD in educational psychology from the University of Wisconsin-Madison, has taught graduate classes on program evaluation, adolescent development, adolescent health behaviors, and research methods; and has published in the areas of evaluation theory, applied research methods, adolescent health and development, and tobacco control policy, among other topics.
"I have several goals for the book review section," Marc notes. "First, I will aim to continue the invaluable function of providing readers with lively reviews, assessments, and perspectives that can help them to sort through the many new books and other publications that we see each year. As the profile of evaluation continues to expand in the worlds of government, nonprofits, management, universities and other sectors, there is a great deal written about our field and the availability of incisive, thoughtful reviews becomes ever more useful. In addition, I will work to expand the scope of reviews so that AJE's book review section can be a forum for exploring broader ideas about evaluation and related topics, with books at the center of the discussion. As the readership of AJE grows and diversifies, the reviews should reflect that diversification as well, in terms of topics, products, purposes, and reviewers."
Go to the American Journal of Evaluation website
|Communicating with Clients - How Do You?|
How do you communicate with your clients? AEA likes to spotlight samples of great client and stakeholder communications. Here we connect with Susan Parker of Clear Thinking Communications, who shares how she connects with clients using a newsletter as well as social media.
"I started the Clear Thinking Communications newsletter because I saw my clients, who are often evaluators or evaluation officers, struggling to get their message across. They had great ideas and important findings that were buried in almost impenetrable jargon-laden writing. I saw evaluators wrestle with the minute details of explaining their methodology and spend less time thinking about how best to communicate their findings to key audiences. Few people were going to read those dense reports, no matter how much good information they contained.
My newsletter focuses on giving readers practical tips that they can put into practice immediately. I want to help evaluators and researchers make sure that the important work they've labored over is read by, and more importantly, used by, the people who could most benefit from their work.
The gap I fill is in helping people remember the basics of good communication, which can be lost in the midst of all of today's hype and confusion about using social media. By basics I mean good listening to clients or colleagues, clear thinking about the purpose of an evaluation, and simple (but not simplistic) writing. I also emphasize the human element of communication. That is, we have to remember that we are trying to reach real, live people, not "stakeholders." The human connection is just as important as clear writing or crisp executive summaries.
I draw my monthly article ideas from common experiences with my clients as well as extensive reading. Article topics have included three ways to make your research useful, how to use the "ladder of abstraction" to write great reports, and why kindness is central when you communicate.
Lately, I'm also thinking about how social media can help us reach people who learn in a variety of ways. What excites me about social media is its ability to connect us to people who want to learn and interact in forms other than simply reading a report. For example, YouTube and Vimeo let us post executive summary videos about our work. Data visualization and infographics can reach people who are visually oriented. For me, the promise of social media is that we can tailor our messages in ways that greatly expand our reach.
As someone who also does evaluations such as case studies, I also think often about how we can make sure that the information we collect is used. It's one thing to read about or watch a video about our findings. It's a whole other level to make sure that what we learned is put into practice. I'm focusing much of my reading time this year on learning what works best in translating information to use. I will share what I'm learning in the coming issues of my newsletter."
To share samples of the ways that you interact with your audiences, email AEA's Communications Director, Gwen Newman, at firstname.lastname@example.org. We'd love to share the ways you communicate via this column as well as AEA's online eLibrary. Thanks!
|New - Youth Focused Evaluation Topical Interest Group|
The Youth Focused Evaluation Topical Interest Group (YFE) is AEA's newest TIG. Its purpose is to create an inclusive and participatory space for all evaluators (both youth and adult) that focuses attention on practices and outcomes of positive youth development and participation in a wide array of informal and formal contexts. YFE also carves out a unique place for youth and adults to share their work and network with professionals and one another, thereby promoting a leadership pipeline within the field.
The YFE-TIG supports youth and adult researchers and evaluators in building best practices and methods related to:
- Enhanced program quality
- Youth/adult professional development, participation, and voice
- Improved measurement
- Research ethics education
- Youth participation in the AEA and YFE TIG, and
- Co-creation of a knowledge exchange (online) community
Co-chairs are Kim Sabo Flores and David J. White. Kim, associate director at the Thrive Foundation for Youth in Menlo Park, California, can be reached at Kim@thrivefoundation.org. David, an associate professor at Oregon State University, can be reached at email@example.com. For more information, contact them directly or visit the YFE website.
"In 2002, a small group of folks interested in youth participatory evaluation met during AEA to discuss the development of a New Directions in Evaluation edition on Youth Participatory Evaluation: A Field in the Making (Volume 98). Over the past decade," notes Kim, "we have been meeting at the annual AEA conference to discuss our work and how to evolve the field. As the group has grown, we began to make two realizations: 1, we were all presenting in different TIGs, which meant that we were not able to attend each other's sessions; and 2, we were missing out on other sessions more broadly focused on youth development that could greatly benefit our conversations. Last year, we had a meeting with a small group of supporters who wanted to begin this TIG. Today, we are already more than 50 strong."
Adds David, "The YFE-TIG has had one of the strongest showings ever for a brand new TIG with over 35 proposals in the pool. I think this is a clear indication that the YFE TIG is meeting the needs of AEA members. And, if youth participatory evaluation is still a "Field in the Making," then the YFE TIG will provide a place for evaluators and youth development practitioners to build evaluation capacity within organizations engaging youth in research and evaluation, improve program quality of youth serving and participatory programs, shape measurement of youth development and participation, assure the ethical involvement of youth in research and evaluation, routinely engage youth in conferences and publications, and promote intergenerational partnerships in evaluation."
Most recently, contributions from the YFE-TIG were featured on aea365. You can read more here.
Go to the Youth Focused Evaluation TIG Website
|Program Evaluation Theory and Practice: A Comprehensive Guide|
AEA member Donna Mertens is co-author of Program Evaluation Theory and Practice: A Comprehensive Guide, published by Guilford Press.
From the Publisher's Site:
This engaging text takes an evenhanded approach to major theoretical paradigms in evaluation and builds a bridge from them to evaluation practice. Featuring helpful checklists, procedural steps, provocative questions that invite readers to explore their own theoretical assumptions, and practical exercises, the book provides concrete guidance for conducting large- and small-scale evaluations. Numerous sample studies - many with reflective commentary from the evaluators - reveal the process through which an evaluator incorporates a paradigm into an actual research project. The book shows how theory informs methodological choices (the specifics of planning, implementing, and using evaluations). It offers balanced coverage of quantitative, qualitative, and mixed methods approaches.
Useful pedagogical features include:
- Examples of large- and small-scale evaluations from multiple disciplines.
- Beginning-of-chapter reflection questions that set the stage for the material covered.
- "Extending your thinking" questions and practical activities that help readers apply particular theoretical paradigms in their own evaluation projects.
- Relevant Web links, including pathways to more details about sampling, data collection, and analysis.
- Boxes offering a closer look at key evaluation concepts and additional studies.
- Checklists for readers to determine if they have followed recommended practice.
- A companion website with resources for further learning.
From the Author:
"There has been so much development in theories about evaluation that I felt there was a need to bring the theoretical developments together in an organized framework that would provide guidance for evaluators in their thinking and practice," says Mertens. "I also wanted to provide a way for evaluators from different theoretical perspectives to see the overlapping territory and provide opportunities for conversations across theoretical lenses as a way to develop better thinking in evaluation. The book ties together the theoretical perspectives that have emanated and draws on examples from a variety of disciplines to illustrate what evaluators from each branch actually do.
"For some of the studies I asked the authors to reflect on their practices and to share what they thought new evaluators might like to know. I am pleased with the thoughts that were shared and the ideas that were given to inform newer evaluators about our profession."
About the Author:
Donna M. Mertens, co-author along with Amy T. Wilson, is Professor in the Department of Educational Foundations and Research at Gallaudet University. The primary focus of her work is transformative mixed methods inquiry in diverse communities, with priority given to the ethical implications of research in pursuit of social justice. A past president of AEA, Mertens provided leadership in the development of the International Organization for Cooperation in Evaluation and the establishment of AEA's Diversity Internship Program with Duquesne University. She has received AEA's highest honors for service to the organization and the field, as well as for her contributions to evaluation theory.
Go to the Publisher's Site
|Data Den: Member Attendance at AEA's Annual Evaluation Conference|
Welcome to the Data Den. We're sharing in order to increase transparency, demonstrate ways of displaying data, encourage action based on the data set, and improve access to information for our members and decision-makers. This month, we're looking at annual conference attendance by AEA members.
The first stacked bar chart, on the top, shows the absolute number of AEA members attending the AEA annual conference over the past 10 years. It is worth noting here that this does not represent the full number of conference attendees, as there is a cadre who attend without joining the association. In absolute numbers, member attendance has remained relatively steady since 2007, varying little from 2007-2010 and with a small jump in 2011. Over the same period, however, the total number of members rose significantly, from 5,281 members at the end of 2007 to 6,924 members at the end of 2011, a 31% increase in four years that spanned one of the worst economic downturns in U.S. history.
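For readers who like to check the math behind the Data Den, the 31% growth figure is a straightforward percentage change between the two year-end membership counts cited above; a quick calculation reproduces it:

```python
# Check the membership-growth figure cited in the Data Den.
members_2007 = 5281  # AEA members at end of 2007 (from the text)
members_2011 = 6924  # AEA members at end of 2011 (from the text)

# Percentage change = (new - old) / old * 100
pct_change = (members_2011 - members_2007) / members_2007 * 100
print(f"Membership growth 2007-2011: {pct_change:.1f}%")  # ~31.1%
```

The same formula underlies the attendance percentages in the second chart, where each year's member attendance is divided by that year's total membership.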
The second stacked bar chart, on the bottom, represents the same data, but as a percentage of the total membership each year. One caveat here is that the 2005 data is incomplete in that it represents only those who registered through AEA for our joint conference with the Canadian Evaluation Society. AEA members who registered through CES are not represented in that year. To return to the chart, in 2002 some 48% of AEA's members attended the conference while in 2011 that number dropped to 35%, with just over 1 in 3 members attending the annual meeting.
Why are we looking at these numbers? They help to guide decisions around the types of services AEA needs to provide in order to serve the whole membership, for instance by ensuring that those who do not attend the conference have access to professional development as well as to the thought leaders in the field. We're also digging deeper to understand better who attends - and whether member attendance varies by such factors as race/ethnicity, gender, and longevity of membership. These segmentations will serve as fodder for a later Data Den.
|New Member Referrals & Kudos - You Are the Heart and Soul of AEA!|
As of January 1, 2012, we began asking, as part of the AEA new member application, how each person heard about the association. It's no surprise that the most frequently offered response is from friends or colleagues. You, our wonderful members, are the heart and soul of AEA and we can't thank you enough for spreading the word.
Thank you to those whose actions encouraged others to join AEA in March. The following people were listed explicitly on new member application forms:
Aasha Abdill * Chelsea BaileyShea * Bridget Blount * Ayesha Boyce * Nathan Brown * Deborah Cohen * Drew Cameron * Sue Dawson * Anne Dozier * Frances Burden * Kirsten Evans * Leslie Gabay-Swanston * Kimberly Green * Deborah Grodzicki * Deborah Hardwick * Dennis Hocevar * Richard Krueger * Frances Lawrenz * Kate LaVelle * Lisa Leroy * Laura Linnan * Trace MacKay * Amy McGuire * Robert Medina * Candace Miller * Rakesh Mohan * Mary Murray * Xiaoxia Newton * Kathleen Norris & Clair Null * Zenda Ofir * Nancy Pellowski Wiger * Katye Perry * Sharon Rallis * Peter Redvers-Lee * Jane Reisman * Laurie Ringaert * Liliana Rodriguez-Campos * Laura Roper * Brian Rush * Merle Schwartz * Michael Schlesinger * Jana Sharp * Bill Shennum * Rob Sheppard * Kathleen Sullivan * Mary Anne Sydlik * David Turner * Amita Vyas * LaTanya Washington-Walker * Joseph Willey * Donald Yarbrough
|Volunteer Opportunity - Qualitative Data Analysts|
Survey Qualitative Analysis Team: We're looking for a few good qualitative data analysts to dive into the qualitative responses from our 2012 member survey. You should be experienced at working with qualitative data, from coding to extracting meaning to reporting, and amenable to working with a small team to cross-validate findings. What do you get in return? First look at a meaty data set and possible directions for association programs, as well as the opportunity to hone your collaboration and analysis skills in a team environment while building your professional network. The work will be done between late May and July 2012 and will involve collaborative development of a coding system, coding and interpretation of findings, and development of a report. Communication will be via an estimated 3-4 one-hour conference calls as well as ongoing email exchange. If you are interested, please send an email to firstname.lastname@example.org by Tuesday, May 8, 2012, noting your interest, your background in qualitative analysis, and any examples of reports you can provide.
New Jobs & RFPs from AEA's Career Center
What's new this month in the AEA Online Career Center? The following positions have been added recently:
- Monitoring and Quality Assurance Specialist at United States Conference of Catholic Bishops (Washington, DC, USA)
- Assistant VP for Academic Enhancement at Texas A&M Health Science Center (Bryan, TX, USA)
- Evaluator at MD Department of Health & Mental Hygiene (UMBC) (Baltimore, MD, USA)
- Extension Evaluation Specialist at Oregon State University (Corvallis, OR, USA)
- Research Associate at James Bell Associates (Arlington, VA, USA)
- Evaluation Specialist at ACET (Minneapolis, MN, USA)
- Evaluation Officer at The Duke Endowment (Charlotte, NC, USA)
- Senior Evaluation Specialist at Asian Development Bank (Manila, PHILIPPINES)
- Evaluation and Research Consultants at Detroit Based Firm (Detroit, MI, USA)
- Evaluation and Learning Advisor at ACDI/VOCA (Washington, DC, USA)
Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. According to Google Analytics, the Career Center received approximately 2,900 unique visitors over the last 30 days. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.
|The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The American Evaluation Association's mission is to:
- Improve evaluation practices and methods
- Increase evaluation use
- Promote evaluation as a profession, and
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275