|Newsletter: April 2010
|Vol 10, Issue 4|
|Increasing Member Involvement - and Input |
Dear AEA Colleagues,
I am proud to be president of an association that actively engages its members and provides opportunities for interaction with such a wide range of professionals. Since I became a member over two decades ago, AEA membership has tripled (at least!), reflecting the heightened importance of evaluation nationally and internationally. Throughout that time, AEA has been defined in great part by a "big tent" philosophy demonstrated by efforts to keep membership and conference costs down while increasing membership benefits.
When AEA's Coffee Break Series was introduced in January, for example, we immediately had more than 300 attendees participate. A quick scan showed participation from Australia, Barbados, Brazil, Canada, Denmark, Egypt, Israel, Italy, the Netherlands, New Zealand, Peru, Philippines, South Africa, Switzerland, Tunisia and the United States! The session was attended by both new and senior members, and the feedback has been quite positive. If you haven't already, I invite you to check out AEA's Coffee Break webinar schedule, which you receive by email and which is also available online. And, if you can't attend, you can download recordings from the archive at your convenience.
While AEA's emphasis has been and will continue to be on ways to serve you, under our new governance structure the board is also committed to (a) increasing ways for members to participate in and lead community activities, like the Coffee Break Demonstrations, and (b) getting more input from members on strategic directions and priorities for the future. Evidence of the former is the Membership Involvement Initiative and one of its strategies, our online volunteer opportunities page. To begin to make progress on the latter, a board task force on member engagement was created in February. Its first step is to develop policy that codifies the expectation that this and future boards will reach out to members for guidance. I hope that it will follow up with specific plans for carrying out that commitment.
Of course, the old and new services, the Member Involvement Initiative, and the new board governance structure and policies need to be evaluated. We have been so busy over the years making sure that we were providing services, that we were not always as focused on evaluation as you might expect from an association devoted to the practice! A systematic approach to evaluating our work through an ongoing review and assessment of policies is another hallmark of the new governance structure. Stay tuned to learn more about the board's progress on that front.
In the meantime, thank you all for your active participation. It's one of the defining features of AEA and one that can make us all proud!
|Policy Watch - An Invitation to Comment on the Evaluation Roadmap |
From George Grob, Consultant to the Evaluation Policy Task Force
Earlier this week AEA Evaluation Policy Task Force (EPTF) Chair Patrick Grasso sent a message to AEA members inviting all of you to comment on the Evaluation Roadmap. I encourage you to take advantage of this opportunity if you have not already done so.
The Evaluation Roadmap is the name now commonly used for a document entitled "An Evaluation Roadmap for a More Effective Government," which three AEA Presidents (Leslie Cooksy, Debra Rog, and William Trochim) sent to Peter Orszag, Director, Office of Management and Budget, in February 2009, shortly after he took office. The paper explains the importance of evaluation as an essential ingredient of good government, and it describes concrete steps to make this happen.
This document is important. It has been widely circulated among government policy makers and has been recognized as a useful reference source and practical guide for promoting effective use of evaluation in government. However, it can be made even better with your input.
The EPTF prepared the original version of the Roadmap to reach incoming officials of United States President Obama's administration as they were formulating their management agenda. We are now interested in framing the Roadmap as a foundational document of AEA to guide evaluation policy development work well into the future. The EPTF has redrafted the Roadmap to remove references to the Obama and Bush administrations, both to avoid any partisan interpretation and to emphasize the importance of evaluation in the context of abiding, fundamental principles of governance.
The AEA Board, its Presidents (the three mentioned above plus President-Elect Jennifer Greene), and the EPTF invite your comments. On or before Friday, May 21, please log in at the link below to provide your feedback.
|Meet Tarek Azzam - Assistant Research Professor |
AEA's 5,700 members worldwide represent a range of backgrounds, specialties and interest areas. Join us as we profile a different member each month via our Question and Answer column. This month's profile spotlights a southern California evaluator and assistant research professor who also leads AEA's Awards nominations process. If you haven't yet, consider submitting nominations for those you feel are worthy of recognition. The nominations deadline is Friday, June 4.
Name, Affiliation: Tarek Azzam, Claremont Graduate University
Degrees: Ph.D. in Social Research Methodology and B.A. in Psychology and Economics
Years in the Evaluation Field: 8 Years
Joined AEA: 2002
AEA Leadership Includes:
Chair of the Awards Committee and Member of AEA's Values Priority Area Team, Co-founder and Program Chair of the Research on Evaluation TIG, Former Program Chair of Theories of Evaluation TIG
Why do you belong to AEA?
I chose to belong to AEA because it represents the only professional organization in the U.S. that focuses on evaluators and offers a forum for practitioners, researchers and theorists to share new ideas and debate the issues that face our profession and its ultimate impact on society.
Why do you choose to work in the field of evaluation?
Early in my career I was involved in basic research studies that were rarely applied to real-world problems. I wanted to do work that had a direct positive impact on people's lives. As a graduate student I enrolled in an evaluation theory course that completely changed my thinking about the role of evaluation and how it can be used for program improvement. In many ways, I had found my calling as an evaluator. I have had the opportunity to work on challenging projects, develop and implement interesting methods, and potentially improve programs and policies. These elements perfectly fit my professional and personal interests.
What's the most memorable or meaningful evaluation that you have been a part of - and why?
I conducted an evaluation of a university-level academic support program that targeted entering college freshmen and provided them with peer mentors, supplemental instruction, and academic skills training. The evaluation utilized a randomized controlled trial design in which students were randomly assigned to either a control or treatment condition. It was memorable because it clearly illustrated the political and logistical difficulty of implementing such a design in a real-world setting. During the evaluation, issues of unfairness, attrition, and treatment-group contamination emerged that quickly turned a simple, clean design into a more complicated endeavor. The experience taught me a lasting lesson about our lack of control and the importance of recognizing the contextual factors that affect our evaluations.
What advice would you give to those new to the field?
New evaluators should be aware of the various evaluation theories that have shaped the thinking behind the profession. These theories can provide relevant guidance on how to resolve difficult issues that arise during the conduct of an evaluation project. I would also suggest that they approach any evaluation with a willingness to listen and respect stakeholder needs and interests and to allow the evaluation questions to drive the evaluation design and methods.
|Evaluation 2010 Hotel Room Block Now Open|
Are you joining us at Evaluation 2010 in San Antonio this November? Be sure to book your hotel reservations at your earliest convenience to be assured a room at your preferred hotel. AEA's room blocks regularly sell out well in advance of the event and, while we do usually secure additional overflow housing, it is likely to be at a greater distance and/or a higher price than the two blocks described below.
The Grand Hyatt San Antonio at 600 East Market Street is the headquarters hotel for the annual conference. All sessions and workshops will be held at the Grand Hyatt, and its four-star reputation and location on the quiet end of the River Walk make it a perfect home base for the conference. Our room block there is offered at a great rate of $179 for single or double occupancy.
The La Quinta Inn and Suites at 303 Blum is perfect for those on a tighter budget. A short block and a half away, it is the nearest mid-price hotel to the Hyatt and reservations include free Internet and a limited breakfast each morning. At $117 single or double, the discounted conference rate is within the government per diem for San Antonio.
Warning! There is one other Hyatt and two other La Quintas in San Antonio, each of which is significantly farther away from the conference. Be sure to book within the conference block to get the discounted rates and ensure you are at the correct hotel.
Go to the Evaluation 2010 hotels page to learn more and reserve at the discounted rates
|TechTalk - Getting a Handle on Headlines |
From LaMarcus Bolton, AEA Technology Director
If you are a subscriber to our popular EVALTALK listserv, you may have noticed recent weekly emails entitled "AEA Headlines and Resources." Many of you may be left with several questions, perhaps the biggest being, "What exactly is it?"
AEA's Headlines and Resources list is an aggregated source of evaluation and methodology news, resources, events, and volunteer opportunities. The headlines are posted from our Twitter account and are simultaneously sent to a variety of sources, including our LinkedIn Community. The goal of the Headlines and Resources list is to present information that is useful to the evaluation community at large, in a timely manner, and with a link to where to learn more. For example, in addition to providing links to upcoming AEA webinars and the past week's AEA365 posts, we feature things such as new eLibrary submissions, updates on member blogs, links to evaluation articles in the news, and notices of new major evaluation reports and handbooks.
Curious how to subscribe? Well, I'm happy to say that getting Headlines and Resources is easy! As noted earlier, if you already subscribe to EVALTALK, the headlines and resources are posted each Sunday. You can also follow @aeaweb on Twitter, or join our LinkedIn community and check out the news section.
However, perhaps the easiest option is to sign up to receive the Headlines and Resources list via a weekly email each Sunday, delivered right to your inbox. Use the link at the bottom of this article to sign up via our signups page. It is the third item down, but check out the other options for staying connected and informed while you are there.
Have a suggestion for headlines content? By all means, send us an email at email@example.com. And, if you have a question or suggestion regarding the technology and/or process, please do not hesitate to contact me at firstname.lastname@example.org.
Go to the Alerts Signup Page to Subscribe to Headlines and Resources
|Handbook of Program Evaluation for Social Work and Health Professionals|
AEA member Michael J. Smith is the author of a new book published by Oxford University Press. Handbook of Program Evaluation for Social Work and Health Professionals is a reference guide with concrete examples that addresses both qualitative and quantitative data in evaluation.
From the Publisher's Site:
"Evaluation is crucial for determining the effectiveness of social programs and interventions. In this nuts and bolts handbook, social work and health care professionals are shown how evaluations should be done, taking the intimidation and guesswork out of this essential task. Current perspectives in social work and health practice, such as the strengths perspective, consumer empowerment, empowerment evaluation, and evidence-based practice, are linked to evaluation concepts throughout the book to emphasize their importance.
This book makes evaluation come alive with comprehensive examples ... Equal emphasis is given to both quantitative and qualitative data analysis with real examples that make statistics and concepts in qualitative analysis un-intimidating.
By integrating both evaluation and research methods and assuming no previous knowledge of research, this book makes an excellent reference for professionals working in social work and health settings who are now being called upon to conduct or supervise program evaluation and may need a refresher on research methods. With a pragmatic approach that includes survey design, data collection methods, sampling, analysis, and report writing, it is also an excellent text or classroom resource for students new to the field of program evaluation."
From the Author:
"I wanted to write a comprehensive text of basic evaluation concepts for evaluation courses and staff trainings. There is a whole chapter which emphasizes pure program description to balance what I feel is the extreme use of logic models in the field. There are exhibit/examples of every type of evaluation study in the appropriate chapters. To balance the quantitative analysis, examples of the process of analyzing qualitative data are presented."
About the Author:
Michael J. Smith is a Professor at the Hunter College School of Social Work and the Ph.D. Program of the Graduate Center of the City University of New York. For 15 years he taught a program evaluation course in the Masters program in Health Advocacy at Sarah Lawrence College in Bronxville, NY. He has conducted evaluations in the fields of child welfare and youth programs, aging, family support programs for developmentally disabled children and employee assistance programs.
AEA members can get a 20% discount by entering promo code 28444 and the book's ISBN, 9780195158434, on the Oxford University Press site.
Go to the Publisher's Site
|OPEG Schedules Spring Exchange for Friday, May 14|
The Ohio Program Evaluators' Group (OPEG) will hold its 2010 Spring Evaluators' Exchange on Friday, May 14. The conference theme is Mixed Methods in Evaluation with keynote speaker Donna M. Mertens, recipient of AEA's 2009 Paul F. Lazarsfeld Award for Evaluation Theory.
Mertens is a professor at Gallaudet University in Washington, DC, author of Transformative Research and Evaluation, editor of the Journal of Mixed Methods Research, and is one of the co-chairs of AEA's new Mixed Methods in Evaluation topical interest group.
OPEG is a statewide organization of professionals that was founded in 1980 and today represents a cross section of agencies, institutions and fields. OPEG:
- Is a non-profit network for evaluators in Ohio
- Is a regional affiliate of the American Evaluation Association
- Has more than 100 active members and a network of more than 700 evaluators
- Seeks to promote quality evaluation and research as critical components of service delivery programs in Ohio
- Supports those who conduct evaluations, whether part- or full-time
- Is open to anyone with an interest in evaluation, students as well as professionals
The Evaluators' Exchange provides a forum for OPEG members to present their research and evaluation work to colleagues statewide. It also provides a valuable opportunity to learn from expert keynote speakers and network with evaluators and other professionals working in various fields and disciplines. The event will be held at the Quest Business Center in Columbus. For more information or to register, visit http://www.opeg.org
|Many Ways to Volunteer - As Manager, Curator, Speaker|
Looking for ways to get involved in the life of the association? AEA's Member Involvement Initiative (MII) has the following updates related to volunteer opportunities:
AEA365 Blog Managers and Curators: AEA's intern, John LaVelle, has gotten the aea365 tip-a-day blog up and on its feet, with readership growing daily. John will complete his time with us at the beginning of the summer, and we are looking for a few good members to serve on a working group to take over nurturing this resource through the coming year. Blog managers and curators help to solicit content, reaching out to the range of members across the breadth of the field. They perform a quick edit on new contributions, upload them to the website, and identify appropriate tags for cross-referencing. You'll build your volunteer management skills and learn how to develop and moderate a blog. Those interested in serving should have strong communication skills, but need not have written for a blog before. If you are interested in serving on this working group, please send an email to email@example.com. (Deadline: April 15, 2010)
Washington Local Affiliate Call for Speakers: The Washington DC Evaluators (WE) Local Affiliate is planning a year's worth of brown bag sessions around the theme "Phases of Evaluation," from pre-evaluation planning to use of evaluation results. If you will be in the DC area at some point between March and December of 2010 and are working on an interesting project with lessons learned, or are an author of a book (completed or in progress) that ties to their theme, this may be the perfect opportunity to engage with a community of professionals. The brown bag meetings are two hours in length, with approximately 90 minutes of that time devoted to the presentation and discussion. Presentations are free to attendees and speakers are not paid. Please contact Brian Yoder at firstname.lastname@example.org to discuss presenting your work in this context.
|New Jobs and RFPs from the AEA Career Center|
What's new this month in the AEA Online Career Center? The following positions and Requests for Proposals (RFPs) have been added recently:
- Research Associate II at University of Southern Maine (Augusta, ME, USA)
- Senior Evaluations Specialist at ICF Macro (Calverton, MD, USA)
- Energy Efficiency Evaluation of 2010-2012 IOU Portfolios at California Public Utilities Commission (San Francisco, CA, USA)
- Workforce Development Evaluation Consultant at ZERO TO THREE (Los Angeles, CA, USA)
- Evaluator at David Heil & Associates Inc. (Portland, OR, USA)
- Capacity Building Assistance, Evaluation Specialist at AIDS Project Los Angeles (Los Angeles, USA)
- Evaluator at Western Michigan University Evaluation Center (Kalamazoo, MI, USA)
- Social Network Analyst at Academy for Educational Development (Washington, DC, USA)
- Research Associate I at HighScope Educational Research Foundation (Ypsilanti, MI, USA)
- Evaluation Consultant (Contractor) at TCC Group (New York, NY, USA)
Descriptions for each of these positions, and many others, are available in the AEA Online Career Center. According to Google Analytics, the Career Center received over 4,000 unique visitors last month. It is an outstanding resource for posting your resume or position, or for finding your next employer, contractor or employee.
Job hunting? You can also sign up to receive notifications of new position postings via email or RSS feed.
Go to the AEA Online Career Center
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The American Evaluation Association's mission is to:
- Improve evaluation practices and methods
- Increase evaluation use
- Promote evaluation as a profession and
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275