|Newsletter: January 2011||Vol 11, Issue 1|
|A Year of "Valuing Evaluation"|
Greetings fellow AEA members and happy new year to all! I hope your winter holidays were restful, renewing, and enjoyable.
It is with excitement, and humility, that I assume the leadership of our association for 2011. AEA is currently a robust and vibrant organization. Our membership continues to grow, our financial health is excellent, and a wealth of internal activities provide our evaluation community with an expansive array of information and development opportunities - webinars, the eLibrary, aea365, EvalTalk, Thought Leaders, and more.
Further, evaluation itself also continues to experience explosive growth all around the world. And with its new policy governance model in place, AEA is now positioned to be more strategically active on the national and world stages. We now have the structures, commitments, and resources to actively advance the importance of evaluation in defensibly understanding the quality of policies and practices in the U.S. and elsewhere. Specifically, among our goals are ambitions to influence the character and role of evaluation through activities directed to evaluation users and the general public (beyond activities for members). The three-year-old Evaluation Policy Task Force is one such initiative, designed to influence evaluation policy at the national level. The Board recently launched an initiative designed to collaborate with international evaluation partners in defining an appropriate and consequential international presence for AEA. And a campaign to educate the general public about the value of evaluation is currently under development by a working group of association volunteers.
And speaking of values, the theme for 2011 is "Values and Valuing in Evaluation," a theme I hope will appear in multiple activities and conversations throughout the year. Acknowledging our own AEA values is a worthy starting point for this conversation. AEA values "excellence in evaluation practice - specifically, high-quality, ethically-defensible, and culturally-responsive evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community."
Good wishes to all,
Jennifer C. Greene
AEA President, 2011
|Meet AEA's 2011 President, Jennifer Greene|
Jennifer C. Greene, a Professor in the Department of Educational Psychology at the University of Illinois at Urbana-Champaign who brings more than three decades of experience both as an evaluator and educator, will serve as the 2011 President of the American Evaluation Association (AEA).
Author of Mixed Methods in Social Inquiry, published by Jossey-Bass in 2007, Greene is a recognized thought leader in the field and has authored hundreds of journal articles and book chapters. She currently serves as an Associate Editor on the Journal of Mixed Methods Research and is a member of the Advisory Board for the International Journal of Mixed Methods for Applied Business and Policy Research, an online journal in Australia. She serves on the Editorial Board for the American Journal of Evaluation and for the Journal of Education and is a senior editor for the Journal of Research Methodology, the latter two published in Thailand.
Greene has been a member of AEA since 1985 and was honored with the association's 2003 Paul F. Lazarsfeld Award, presented to individuals whose written work on evaluation theory has led to fruitful debates on the assumptions, goals and methods of evaluation. She also is a member of the American Educational Research Association and the European Evaluation Society.
Greene was honored with a Distinguished Teaching Award from Cornell University in 1997, a Distinguished Senior Scholar Award from the University of Illinois at Urbana-Champaign in 2003, and is the 2005-2006 recipient of the R. Stewart Jones Award for Outstanding Teacher in Educational Psychology. She is a 1971 graduate of Wellesley College and earned her Master's and Ph.D. at Stanford University.
"I am honored to assume the leadership of the American Evaluation Association, especially at this time when - around the globe - evaluation has become a highly valued contributor to policy and program decision making," states Greene. "Evaluation is thus valued in significant part because it brings the reason of empirical data and the legitimate plurality of diverse perspectives to the central responsibility of societies to care well for their citizens."
|Policy Watch - Whatever Happened to GPRA and PART?|
From George Grob, Consultant to the Evaluation Policy Task Force
On January 4, the President signed HR 2142, the GPRA Modernization Act of 2010. A summary, prepared by the Congressional Research Service (CRS), is also available. The bill amends the Government Performance and Results Act (GPRA) and other statutory provisions related to performance reports, and incorporates some broad principles underlying the Program Assessment Rating Tool (PART) of President Bush's administration and many of President Obama's policies related to a high-performing government. The act provides a three-tiered approach to performance management that includes four-year strategic plans, annual performance plans, and high-priority goals.
Of special interest to evaluators are the following provisions:
1. Evaluation policies carried over from the original GPRA legislation, including a definition of "program evaluation" as "an assessment, through objective measurement and systematic analysis, of the manner and extent to which Federal programs achieve intended objectives;" and requirements to describe program evaluations used in establishing or revising general goals and objectives in agencies' strategic plans and provide a schedule for future program evaluations, evaluate agency performance plans against performance goals, and include in annual performance reports a summary of relevant program evaluation findings.
2. New evaluation requirements: A requirement for the "Director of the Office of Personnel Management, in consultation with the Performance Improvement Council, . . . [to] identify the key skills and competencies needed by Federal Government personnel for developing goals, evaluating programs, and analyzing and using performance information . . ."
3. New roles for the Office of Management and Budget (OMB): The responsibility to assess program performance and to inform the agency, the Congress, and the Government Accountability Office of unmet goals. The head of the agency may need to prepare plans to correct performance deficiencies.
4. Transparency: The establishment of a Federal website to publish performance goals and assessments.
There is much good news here in the bill's retaining (and thus emphasizing) evaluation as a central aspect of performance management. The new requirement to identify key skills and competencies for evaluating programs may also have a strong and enduring impact on Federal evaluation functions, depending on how it is implemented. A key concern of federal staff may well be the magnitude of the administrative tasks and the feasibility of carrying out all the requirements of the law. There is also some uncertainty about how active OMB will be in its independent assessment of performance.
This is necessarily a very brief summary of a law that will profoundly affect the management and assessment of Federal programs for years to come. We will discuss this more in the future. Meanwhile, as always, we welcome your comments and concerns.
Go to the EPTF Website and Join the EPTF Discussion List
|TechTalk - Where Will Technology Take Evaluation?|
From LaMarcus Bolton, AEA Technology Director
I think we can all agree that technology is constantly changing the way we conduct evaluations. I spoke with several 2010 AEA/CDC Summer Institute attendees to discuss how they felt technology would impact the field of evaluation in the future. This may come as no surprise, but the general consensus was that technology will improve the overall efficiency and effectiveness of evaluations.
Many individuals feel that technology will continue to impact the way data are collected and analyzed. Zundra Bateaste-Sutton, State Asthma Evaluator at the Mississippi State Department of Health, believes that technology will, "...make creating evaluation tools, collecting, and analyzing data more simple." David Kim, Heart Disease & Stroke Prevention Project Manager at the Virginia Department of Health, agreed: "Technology will definitely make it easier for evaluators to collect data and obtain quality data sources, such as crowd-sourcing and focus groups."
Several of the Institute attendees also believe that technology will promote more sharing and openness with regard to data. Speaking of her own field, Bateaste-Sutton stated that in the long term, technology would, "...provide information across the healthcare continuum so that everyone can see it." Stacy Carruth, Community Health Specialist at the Regional Center for Healthy Communities, noted that technology would, "...make data more accessible via online tools." Dantrell Simmons, Data Collection and Retention Specialist at AID Atlanta, Inc., feels that technology will promote integrity within the field of evaluation: "Having increased availability of data would promote transparency among researchers." Felix Blumhardt, Lead Evaluator of the Evaluation Group, believes that the spiraling prevalence of data-sharing will ensure that the field of evaluation continues to expand.
Another common theme in my conversations with the Institute attendees was that technology is actively changing the ways in which results and research findings are presented to stakeholders. Attendees such as Steve Fleming, Systems Analyst at the National Center for Educational Achievement, believe that in the future, technology will promote even better visualization of data than what is currently available. Charlotte Kabore, CDC Public Health Advisor at the Oklahoma State Department of Health, echoed: "Technology will allow us to take data we are given and put it into specific formats--maps, visual aids, whatever it may be." Rebecca Buzzard, Program Coordinator and Evaluator at Craven Smart Start, Inc., explained the importance of visualization in evaluation research: "Visualization tools, such as motion charts, for instance, really help people understand complex data."
This article is one part of a short series on technology use in evaluation. If you have any predictions as to how technology may impact evaluation in the future, please consider sharing within our technology forum! If you have any other questions, I can be reached at email@example.com.
|AEA Honors Author and Editor Jonathan Morell|
Join us as we congratulate the recipients of AEA's 2010 Awards. Recognized were four awardees who've helped heighten international evaluation efforts, spearheaded a groundbreaking new journal, influenced a health initiative that impacted the lives of children and families in five urban communities, and influenced a new generation of evaluators. We will spotlight one recipient below and will highlight others in upcoming issues of AEA's monthly newsletter. Thank you all for your generous contributions to AEA, to the field, and to our greater communities.
Jonathan A. Morell was honored as the recipient of AEA's Paul F. Lazarsfeld Evaluation Theory Award. Morell, an influential author and editor, co-founded the international journal Evaluation and Program Planning (EPP) in 1978 and continues to serve as its editor three decades later. His newest book, Evaluation in the Face of Uncertainty: Anticipating Surprise and Responding to the Inevitable once again contributes to the cutting edge of evaluation theory. AEA's Lazarsfeld Award is presented to an individual whose written work on evaluation theory has led to fruitful debates on the assumptions, goals, and practices of evaluation.
"Among the many factors considered in his selection was his sustained influence on evaluation theory," states Tarek Azzam, chair of AEA's Awards Committee. "Through his writings, Morell consistently integrated other disciplines to help inform evaluation practice and has addressed many issues in evaluation theory that focus on ethics, the role of methodology, and systems theory. He has had a sustained influence on evaluation theory and has helped shape the debates surrounding its development."
Morell, an Ann Arbor, MI resident and graduate of McGill University and Northwestern University, is an organizational psychologist who has spent his professional life trying to integrate hands-on evaluation work and theoretical interests in evaluation methodology. As a practitioner he evaluates organizational change, research and development, and safety programs. His theoretical interests include the nature and use of logic models, the role of Lean Six Sigma methodologies in evaluation, complex system behavior, and the nature of practical action. Morell is a long-standing member of AEA, where he was instrumental in founding two of its topical interest groups -- Systems, and Business and Industry -- and is the 1995 recipient of AEA's Robert Ingle Service Award. Morell's present focus is on trying to answer two questions: How can the integrity and power of evaluation be maintained in the face of unexpected behavior in programs and their evaluations? What would ensue from a tight integration of agent-based modeling and traditional evaluation methods?
"Two great pleasures in my life are thinking about what it means to evaluate a program, and doing hands-on practical evaluation work," says Morell. "I have always been intrigued, and struggled, with how these two interests could be made to support each other. I am very pleased that others have seen some merit in the results of my efforts at resolution. I hope I can contribute to a community of interest that will grow our collective ability to do better evaluation for our clients and stakeholders."
We invite you to think about AEA's 2011 Awards and to consider nominating colleagues for this distinction. You'll find an overview at the link below, as well as specific nominating instructions.
Go to AEA's Awards Page
|Journal Highlights to Watch for in 2011|
As we enter the New Year, we touched base with the editors of New Directions for Evaluation and the American Journal of Evaluation. Read on to learn more about the special issues ahead.
Highlights for AJE
Volume 32 coincides with the 25th anniversary of AEA. To help mark this noteworthy occasion, AJE will feature several invited papers in each of the four issues of 2011.
In Issue 1 (March 2011):
- Michael Morris, founder and long-time editor of the Ethical Challenges section of the journal, offers a retrospective examination of 25 years of writing about ethics in evaluation. This paper will also be accompanied by a podcast, details to be shared at a later date.
- Barbara Means, Director of the Center for Technology at SRI International, and William Penuel, Director of Evaluation Research for the Center, will present a thoughtful look at the ways in which evaluations can make use of large-scale databases that have been developed in recent years.
Subsequent issues of Volume 32 will feature other invited papers. Stay tuned for more details or visit the AJE website.
Upcoming in NDE
Look for NDE issues that address some long-standing issues in evaluation, but from an updated perspective. There will be issues on how evaluation use is affected by participation and involvement in the evaluation process; that take a contemporary look at the Campbellian notions of validity; and that revisit the nature and scope of internal evaluation. NDE will also publish a special commemorative issue to celebrate AEA's 25th anniversary. Instead of reminiscing about the past, this special issue will anticipate the future through the eyes of young and new evaluators. Casting an eye to a sunrise rather than a sunset, NDE will feature some twenty young evaluators who will share what matters to them, theoretically, conceptually, and practically, as they begin their professional lives as evaluators. We received over 150 proposals, an indication that the future of evaluation looks bright.
A few other things to look for in 2011:
- NDE is currently being reviewed for inclusion in SCOPUS, the largest abstract and citation database, and we hope to be accepted in the upcoming year.
- The best-selling issue of NDE, Evaluation Models, is likely to be translated into Korean.
- The search for the next editor-in-chief will begin.
Stay tuned for more details or visit the NDE website.
|AEA Welcomes Data Visualization and Reporting Topical Interest Group|
Media, web design, and marketing have all created an environment where our stakeholders - clients, program participants, funders - expect high-quality graphics and reporting that effectively convey the valuable insights from our evaluation work. Growing global interest in improving communications has begun to take root in the evaluation field as well. A new Data Visualization and Reporting Topical Interest Group (TIG) has been formed with the following purposes in mind, to:
· Define data visualization as it applies to evaluation
· Be a voice and resource for high quality visual displays of data
· Develop and promote high standards for data visualization and reporting in evaluation
· Keep AEA and evaluators current with the exponential growth of data visualization
· Facilitate exploration of data visualization as an emerging need among stakeholders
· Illustrate the use of data visualization for evaluation planning and analysis
· Lead by example
· Translate complicated data into understandable visual media
· Explore alternative reporting options, including social media
· Make evaluation more beautiful and useful
"We believe the proposed purposes align closely with AEA's mission to improve evaluation practices and use, promote the profession, and support the field's contribution to theory and knowledge," says TIG co-chair Stephanie Evergreen. "We feel like this new TIG is a home for many evaluators who want to improve communication of their work." The TIG, she adds, offers additional benefits to its members and to the broader AEA community, including:
· Improving graphic design, layout, and technology skills
· Creating guidelines for slideshows and written reports
· Compiling and disseminating a repository of outstanding examples
· Providing peer-critique of reports and data visualizations
· Establishing partnerships and bringing in speakers from outside of evaluation
· Sponsoring workshops or "boot camps"
· Identifying, promoting, and demonstrating resources for improved data visualization and reporting
If you'd like to join the Data Visualization and Reporting TIG or learn more, you are welcome to contact the TIG co-chairs at firstname.lastname@example.org or AmyGermuth@EvalWorks.com.
|AEA Volunteer Opportunities|
|Make a difference in your association for 2011!|
Data Confidentiality Working Group: AEA seeks to be more transparent, sharing monitoring data and aggregated demographic information so that our members, the public, and volunteer leaders may better understand the association. In order to do so, we seek to establish policies around data handling and confidentiality with respect to our internal datasets. Are you knowledgeable about data reporting standards? Would you like to help establish policy that will help AEA to increase its transparency and accountability? Please send a note of interest to AEA Executive Director Susan Kistler at email@example.com by Friday, February 11, indicating your interests and expertise in this area. The working group will meet via phone over a period of approximately three months and prepare a short report for the Board's June meeting.
New Jobs & RFPs from AEA's Career Center
What's new this month in the AEA Online Career Center? The following positions and Requests for Proposals (RFPs) have been added recently:
- Director, Center for Research and Evaluation on Education and Human Services at Montclair State University (Montclair, NJ, USA)
- Applied Research/Criminal Justice Research Analyst at SANDAG - San Diego Association of Governments (San Diego, CA, USA)
- Senior Policy Analyst at The Hilltop Institute, University of Maryland Baltimore County (Baltimore, MD, USA)
- Research Analyst at University of Massachusetts Donahue Institute (Hadley, MA, USA)
- Evaluation Specialist at Management Systems International (Washington, DC, USA)
- Research Analyst at Independent Project Analysis Inc. (Ashburn, VA, USA)
- Project Director at Human Services Research Institute (Cambridge, MA, USA)
- Capacity Analysis of African Agricultural Institutions, West Africa at Management Systems International (Washington, DC, USA)
- RFP: Cross-national Evaluation of Girls' Education Program at Room to Read (San Francisco, CA, USA and International)
- Senior Research Associate at Public/Private Ventures (Philadelphia, PA, USA)
Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. According to Google Analytics, the Career Center received approximately 5,000 unique page views in the past month. It is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee.
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The American Evaluation Association's mission is to:
- Improve evaluation practices and methods
- Increase evaluation use
- Promote evaluation as a profession and
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275