Newsletter: September 2013
|Vol 13, Issue 9|
|We're Almost There!|
It's fall, and the annual AEA conference is upon us! In two weeks, we'll be meeting in Washington, D.C., and, ideally, enjoying some wonderful fall weather and colorful leaves while being educated and inspired by speakers and panels.
The local arrangements chairs, David Bernstein and Valarie Caracelli, have bent over backward to welcome us to D.C. For your leisure time, see their "DC Restaurants and Arrangements" on the AEA website under the conference tab. They have suggestions for how to get around D.C., where to eat, and what to do. See also their "Evaluators Visit Capitol Hill Initiative." It provides much useful information on what's involved in visiting your representatives and how AEA can have an impact. Almost 70 AEA members have signed up to participate in these visits this year. Although the deadline for signing up this year has passed, we encourage you to visit your representative or senators in the future when you're in D.C. and educate them about evaluation. The AEA Board recently voted for AEA conferences to meet in D.C. every fourth year, so we will be building on David and Val's work in future years.
Enough about what's going on in D.C.! What about what's going on at the conference? For those of you attending the conference for the first time, or others still overwhelmed by the number of sessions, let me give you an overview and make some suggestions. The conference begins on Wednesday afternoon and continues all day until late Saturday afternoon. For networking and for fun, the Wednesday and Friday sessions end with a giant reception for all attendees. During these receptions, you will have opportunities to "meet the authors" of evaluation books and to view posters and chat with their presenters (Wednesday), and to participate in our fabulous silent auction, which features beautiful items from other countries as well as evaluation books (Friday). On Thursday evening, TIGs and other groups will be sponsoring many different receptions. Attend your TIG's business meeting to learn more. AEA members are a very friendly group, so take the opportunity to make some contacts who do the kind of work you do.
Now, what about the sessions themselves? How do you choose? First, during at least one time period each day, there will be only one session: the plenary for that day. Plenaries begin and end the conference. I'll introduce the conference and talk about the practice of evaluation today in the first plenary on Wednesday at 3:10 p.m. On the last day, Saturday at 4:30 p.m., a panel of AEA members, led by Jonny Morell, will critique the conference and talk about what they've learned. The plenaries on Thursday and Friday at 9:50 a.m. bring two esteemed speakers to AEA. On Thursday, Arthur Lupia, professor of political science at the University of Michigan, will speak on his interdisciplinary research and practice in communicating complex information to decision makers. On Friday, John Easton, the director of the Institute of Education Sciences (IES), will talk about future directions in evaluation and research in education.
So, plenaries are an easy choice — big speakers, big topics, everyone is there! The harder choices concern the times when there are many competing sessions. If you're intrigued by the theme, look for the presidential strand session during each time period. (See the list on the conference website.) These are sessions developed by the program committee to reflect something about the broad range of evaluation practice today and the disciplinary and contextual factors that influence that practice. We have some wonderful panels and speakers covering anthropological approaches to evaluation, the practice of policy analysis, foundations' concerns, issues in environmental evaluation and in disaster and emergency management today, data visualization in the 21st century, and more.
Another strategy: If you've found one or more TIGs that match your interests, look for their sessions in each time slot or find them in our searchable program. Finally, if you know you want to hear a particular speaker, you can search for that person's name in the program. There are so many options. I'm always torn between three or four sessions each period.
I know you'll have a wonderful time and great learning experiences at AEA. I've come to AEA conferences for more than 25 years and, through them, have met co-authors, colleagues, and lifelong friends, and learned so much about evaluation. My best wishes to you for a wonderful experience at AEA 2013!
AEA 2013 President
Meet Stewart Donaldson - Incoming President-Elect
At Claremont Graduate University, Stewart Donaldson develops and leads one of the most extensive and rigorous graduate programs specializing in evaluation, teaching numerous university courses and professional development workshops. He has mentored and coached more than 100 graduate students and working professionals during the past two decades. In addition, he directs the Claremont Evaluation Center and has provided evaluation services to more than 100 different organizations and has been principal investigator for more than 30 extramural research and evaluation grants and contracts.
Donaldson is no stranger to AEA. An active member since 1987, he has served on numerous boards and task forces for the advancement and betterment of evaluation theory and practice. Donaldson is currently the proud director of the AEA Graduate Education Diversity Internship (GEDI) Program and recently served a three-year term on the AEA Board. He leads the Certificate for the Advanced Study of Evaluation Program at Claremont. He is a fellow of the Western Psychological Association; serves on the boards of EvalPartners and the International Positive Psychology Association; and sits on the editorial boards of the American Journal of Evaluation, New Directions for Evaluation, Evaluation and Program Planning, and the Journal of Multidisciplinary Evaluation. He is the recipient of AEA's 2013 Paul F. Lazarsfeld Evaluation Theory Award for his written contributions to the advancement of evaluation theory and was recently elected president of AEA.
In his ballot statement, Donaldson stated:
"I believe my extensive leadership and management experience, participation in the development of AEA over the past 25 years, and the many contributions I have made toward advancing evaluation theory, practice, and education have prepared me well to serve as President of AEA. If I am fortunate enough to be elected, I will give you my strongest commitment that my presidency will embody AEA's mission, vision, and values; serve and engage our diverse membership; promote diversity and inclusion; further strengthen our new policy-based governance practice; meet the highest ethical standards; emphasize developing a productive and meaningful relationship with our new association management partner; focus on increasing the demand for evaluation; and I will strive to strengthen our relationships with like-minded organizations and international partners in an effort to expand evaluation's positive influence worldwide."
We welcome Stewart Donaldson and thank all who participated in this year's election process.
|AEA's Values - Walking the Talk with Katrina Bledsoe|
Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.
AEA's Values Statement
The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.
i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.
ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.
iii. We value a global and international evaluation community and understanding of evaluation practices.
iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.
v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.
vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.
Hello to my evaluation colleagues!
My name is Katrina Bledsoe, and I am a research scientist and senior evaluation specialist at Boston-based Education Development Center Inc. (EDC). When asked to articulate how AEA's values are represented in my own practice as an evaluator, I asked how much time I had to contemplate this request. We spend so much of our time doing the work that we have little time to reflect upon what guides it and how it is carried out. Thus, this request gave me the opportunity to consider how my professional (and personal) values help to define my work.
I grew up in a family that in many ways epitomized the values of AEA. Whether doing volunteer or professional work, my parents were guided by a code of ethical and moral practice. The projects in which they were involved were those focused on social responsibility, and on improving and changing people's lives through social justice programs and policies. I have been fortunate to work in a field and belong to an organization that is reflective of my lifelong credo.
I originally came to Washington, D.C., buoyed by friends and colleagues in the field living in the area who are making a difference in the evaluation work they do whether for federal agencies, universities, foundations, or consulting organizations. Despite the current political climate in the District, I am proud to practice in Washington in a field that is dedicated to social resiliency and social change. I strive to work collaboratively with organizations, communities, governments, and societies in a way that emphasizes quality, responsiveness, ethical practice, and understanding cultural contexts. Evaluation should be conducted in a manner that "does no harm," and, in the best-case scenario, encourages and enhances social justice and social betterment. I believe that in dynamic and challenging times evaluation should be flexible in how it is practiced in a variety of contexts.
Much of the work I do is community-based and community-focused, and requires collaboration and inclusion. For instance, I build collaborative relationships by conducting evaluations primarily within the context in which the program and citizens reside. To ensure responsiveness of the evaluation design, I try to engage and include all levels of stakeholders in the development of appropriate questions and methods. I also ask them to consider the impact of the evaluation findings, positive and negative, on the topic of interest. This consideration is key; the results of the evaluation might affect not only the program, but also how people may choose to live their lives.
Although AEA is known as the American Evaluation Association, I view it as a global community of evaluators, all working toward effective, responsive, and socially responsible evaluation. I value the opportunity to learn from and to help to develop a field of competent evaluators who are both within and outside of the association. For instance, EDC is working with the Graduate Education Diversity Internship (GEDI) and continues to support AEA's values of developing a diverse professional field. I guess I would like to believe that all evaluators, experienced and new, seek to accomplish the same agenda: to provide services that help societies to meet both the strengths and challenges of the changing times.
|Policy Watch - Welcome New EPTF Members|
From Cheryl Oros, Consultant to the Evaluation Policy Task Force
Good news: the EPTF is expanding both its expertise and its size. Please join me in welcoming three new members. They will help with our work on evaluation policy in the U.S. federal arena, as well as with expansions to the state and international levels.
Rakesh Mohan has 22 years of experience in evaluation, performance auditing, and policy analysis, serving for the past 11 years as the director of the Idaho Legislature's Office of Performance Evaluations. Rakesh and his colleagues have received the National Conference of State Legislatures Excellence in Research Methods and Excellence in Evaluation Awards, as well as AEA's Alva and Gunnar Myrdal Government Award. Rakesh is co-author of a book on promoting the use of government evaluations in policymaking and co-editor of Responding to Sponsors and Stakeholders in Complex Evaluation Environments. He has served on the U.S. Comptroller General's Advisory Council on Government Auditing Standards; AEA's Board of Directors and Publications Committee; the New Directions for Evaluation Editorial Board; AEA's State and Local Government TIG (as chair); and the Executive Committee of the National Legislative Program Evaluation Society.
Cynthia Clapp-Wincek has 28 years of experience in evaluation and performance monitoring. She has served as director of the Office of Learning, Evaluation and Research at the U.S. Agency for International Development (USAID) since 2011. Cynthia has placed priority on quickly building USAID's evaluation capacity to produce a greater number of higher quality evaluations as soon as possible. Cynthia previously served as an evaluation officer in the USAID Africa Bureau, developing a rapid appraisal approach for impact evaluations, and in the State Department, focusing on evaluation of assistance in Europe/Eurasia. Cynthia has also provided evaluation training and expert analysis as an independent consultant for USAID, the International Labor Organization, the UN Population Fund, and the Organization for Economic Cooperation and Development. She co-authored "Beyond Success Stories: Monitoring and Evaluation for Foreign Assistance Results" in 2009.
Jonathan D. Breul has 23 years of experience in evaluation and performance measurement, including serving as an adjunct professor in Georgetown University's graduate Public Policy Institute since 2004. During the 19 years Jonathan served at OMB, he developed and launched the original Government Performance and Results Act (GPRA, 1993), designed to enable the federal government to manage and budget for results. He served 10 years as the director of the IBM Center for The Business of Government. Jonathan played an instrumental role in the design and operation of the President's Management Council; the development and implementation of President Bush's Management Agenda; and efforts to increase the use of performance information in the resource allocation process. Jonathan has authored a number of publications aimed at assisting government managers and assessing government reforms.
In addition to welcoming these new members, the EPTF sends a special thanks to Eleanor Chelimsky, who is rotating off the task force, for her extensive and insightful contributions and her commitment to AEA's involvement in developing U.S. evaluation policy.
Remember to join us at several events at AEA's Annual Meeting, Oct. 16-19, where you can learn about the EPTF's efforts and plans, as well as provide us with your comments and suggestions for evaluation policy development!
|Face of AEA - Meet Mary Nash|
AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights Mary Nash.
Name: Mary Nash
Affiliation: Program Development Consultant
Degrees: B.A., English, Connecticut College; MBA, public and nonprofit management, Boston University; pursuing evaluation certificate through The Evaluators' Institute at George Washington University
Years in the Evaluation Field: I've worked in nonprofit management and consulting for the past 25 years. Evaluation has been a component of much of my work, and I am now getting formal training in the field so I can expand this area of my practice.
Joined AEA: July 2013
Why do you belong to AEA?
I recently joined AEA so that I can learn more about the evaluation field and connect with others doing similar work. Membership in AEA was highly recommended to me by other evaluators! I'm looking forward to my first AEA conference and am participating in several of the Topical Interest Groups. I also plan to check out the coffee breaks and the various online learning options that AEA offers.
Why do you choose to work in the field of evaluation?
There is an increasing emphasis on the need for evaluation as a tool for improving and expanding programs, making decisions and setting policies. With limited resources available to public and nonprofit organizations, having evaluation data that demonstrates program effectiveness is vital. Working in the evaluation field enriches my work in developing and managing programs, as it helps me to think about a program's goals and ultimate effectiveness from the early planning stages.
What's the most memorable or meaningful evaluation that you have been a part of?
For the past several years, I worked closely with the Berkshire Compact for Education, a countywide consortium in Western Massachusetts focused on education and college access. This initiative is spearheaded by the Massachusetts College of Liberal Arts and includes collaboration with public and private partners throughout the region. I was involved in efforts to measure the Berkshire Compact's success in raising students' aspirations to pursue higher education and in improving access to education. I enjoyed working on a project that involved collaboration toward common goals and finding methods to measure both short-term successes and longer-term outcomes.
As a new member, what kinds of things are you interested in learning about in regard to AEA?
I am hoping to learn about ways to broaden my network of contacts to extend beyond my home territory of Western Massachusetts. I would love to work as part of a team on education and workforce development evaluations with a national scope or assist a seasoned evaluator on projects that are larger than what I can take on alone. As a solo consultant, I look forward to becoming part of the evaluation community and finding ways to exchange ideas and partner with others.
|eLearning Update - Discover Upcoming Coffee Break Demonstrations|
From Alexa Schlosser, AEA Headquarters
Our Coffee Break Webinars are short, 20-minute presentations of tools or tips we think evaluators will find helpful in their work lives. As September's webinars come to a close, let's take a look at what's in the pipeline for October:
CBD160: Using a Project Management Tool for your Evaluation Projects - Michelle Landry and Judy Savageau
Thursday, Oct. 10, 2013
2-2:20 p.m. ET
Organizing an evaluation project, or multiple projects, can be daunting when balancing staffing needs, timelines, responsibilities, budgets, and deliverables. Maintaining a high-quality, structured work environment benefits the entire team as well as clients. This coffee break webinar will share lessons learned during a recent review of project management tools, as well as our experience selecting and implementing one of these tools for managing the many aspects of evaluation work. Attendees will see samples of outputs from our project management tool and learn how this information has supported discussions and decision making for new project development, staff allocation, and budgeting.
CBD161: Logic Models As Simple Pictures - and So Much More - Jonny Morell
Thursday, Oct. 24, 2013
2-2:20 p.m. ET
Draw a simple picture. Use it to guide evaluation and work with stakeholders. Use a logic model like this and you will get a long way, but attention to a few principles will get you even further. This webinar will touch on the following principles:
- The same model can be constructed at different levels of detail.
- Modesty is important. Don't pretend that you know more than you do about how a program works or what it does.
- Values and ideology are implicit in those pictures.
- There is a relationship between visual display and information density.
- Don't get confused about the different ways logic models can be used: evaluation, advocacy, planning. What works for one use may not work for the others.
CBD162: Rights-Based Evaluation: Implications for Evaluation Questions - Donna Mertens
Thursday, Oct. 31, 2013
2-2:20 p.m. ET
Rights-based evaluation means that evaluators focus on the rights of stakeholders as identified in such documents as the UN Convention on the Rights of Persons with Disabilities or the Convention on the Elimination of All Forms of Discrimination Against Women. Adopting such an approach has implications for the types of evaluation questions that are used to frame the study. Principles that underlie rights-based evaluations will be discussed in terms of their influence on the development of evaluation questions. Exemplary evaluation questions will be used to illustrate the application of these principles.
You can pre-register for any of these webinars by clicking the links above!
|Diversity - Return of the GEDI: Former GEDI Participant Hosts a GEDI Site|
From Zachary Grays, AEA Headquarters
I have the privilege of working closely with this year's GEDI cohort alongside the GEDI coordinators. Each year's participants embark on a scholastic journey of professional development as interns at host sites across the United States. Host sites vary from year to year, but what they have in common is their stature as the homes of some of the most extraordinary minds in evaluation, and they play a crucial role in the development of each year's candidates. This year, Dr. Maurice Samuels has become the first GEDI graduate in the program's history to host a GEDI site.
Samuels participated in the first GEDI cohort. A shining example of the impact and outcomes of the GEDI program, Samuels has grown into an accomplished evaluation professional and active AEA member. Since completing his GEDI track, Samuels has actively volunteered with AEA, first as a co-chair of the mentoring committee and now as co-chair of the MIE TIG. Currently the lead evaluator and researcher at The Center for Elementary Mathematics and Science Education in Chicago, Samuels will take Crystal Coker, one of this year's participants, under his wing to impart the same experience he had as an intern years ago.
"During the internship experience, I learned about nontraditional approaches to evaluation and was introduced to seminal readings on culturally responsive approaches to evaluation. This helped to start my initial thinking for my own graduate dissertation work," said Samuels of his GEDI experience. "Fall and spring seminars gave me an opportunity to meet and discuss critical issues in evaluation with peers and established members of the field. These meetings facilitated my thinking as to how I can build my own capacity to work across culturally diverse groups."
Samuels was inspired to become a mentor by the opportunity to introduce an intern to the evaluation work that he does at The Center for Elementary Mathematics and Science Education and to give that intern the experience of conducting evaluations for different types of organizations. Samuels stated: "While our evaluation projects have been in the areas of science, technology, engineering, and mathematics (STEM), we [The Center for Elementary Mathematics and Science Education] have conducted evaluations with a variety of organizations ... For each of these organizations, evaluation is used in different ways to help them improve their program." Samuels also wanted to expose the intern to more than the nuts and bolts of conducting evaluations, citing the importance of developing personal relationships and being a critical friend to stakeholders.
I have been so impressed with the GEDI program, the dedication of participants present and past, and the lasting relationships that grow from this initiative. There is no denying AEA's investment in shepherding new evaluators from across the spectrum to become some of the most thoughtful and well-prepared professionals in this field. Samuels is a fine example of how being part of such a competitive program can shape a participant's future endeavors. His passing of the torch reinforces this sentiment and makes the hunt for new applicants every year that much more enjoyable.
|Potent Presentations - It's Almost Showtime!|
From Stephanie Evergreen, Potent Presentations Initiative Coordinator
This month, before your conference session:
PRACTICE! Gather a practice audience and distribute the Presentation Assessment Rubric to solicit their feedback on your presentation.
At the conference:
- Arrive at the session early and connect with the other presenters and session chair so that the session may start on time.
- Identify who will be holding the timing cards so that you may watch them during your presentation. Timing cards in each room identify "3 minutes," "1 minute," and "Stop" to prompt presenters.
- At the beginning of your talk, let the audience know whether you have physical handouts to distribute and/or whether your slides or handouts will be posted in AEA's eLibrary.
- Deliver your presentation. Speak clearly, maintain eye contact with the audience, and relax. Stick to the agreed-upon time for your portion to ensure that everyone has the opportunity to present and interact with the audience. Shine.
- Respond to questions. Be aware of the limited time and offer concise responses, noting when appropriate that you can follow up post-conference to continue the conversation.
- Depart on time. Leave the room and continue discussion in the foyer so the next session can set up.
Back Again This Year!
p2i and the Data Visualization and Reporting TIG will cohost a Slide Clinic on Wednesday evening, 6:10-6:55 p.m. Bring your conference slides for some quick one-on-one triage with our trained volunteers. Look for us in the lobby.
You are going to rock your session! I can't wait to see you there.
|Systems Concepts in Action: A Practitioner's Toolkit|
AEA members Bob Williams and Richard Hummelbrunner are co-authors of Systems Concepts in Action: A Practitioner's Toolkit, published by Stanford University Press
From the Publisher's Site:
"Systems Concepts in Action: A Practitioner's Toolkit" explores the application of systems ideas to investigate, evaluate, and intervene in complex and messy situations. The text serves as a field guide and is structured in three distinct parts: describing and analyzing; learning about; and changing and managing a challenge or a set of problems.
The book is the first to cover in detail such a wide range of methods from different parts of the systems field. The book's introduction gives an overview of systems thinking, its origins, and its major subfields. In addition, the introductory text to each part provides background information on the selected methods. "Systems Concepts in Action" may serve as a workbook, offering a selection of tools that readers can use immediately. They can also be investigated more profoundly, using the recommended readings provided. While these methods are not intended to serve as "recipes," they do serve as a menu of options from which to choose. Readers are invited to combine these instruments in a creative manner in order to assemble a mix that is appropriate for their own strategic needs.
From the Authors:
The idea for the book came about when we realized that evaluators knew little about the methodological options offered by the systems field. As interest in systems ideas grew, we saw evaluators grabbing whatever systems approach they came across and trying to use it whether or not it was appropriate for the job. As in the evaluation field, one size doesn't fit all, and evaluators pride themselves on using the most appropriate tool for the task. So we took about 20 methods and methodologies from across the systems field, succinctly described each, identified the kinds of evaluation questions they addressed and what kinds of things they were good (or not so good) at, and added a case example to demonstrate their application.
All of the methods are underpinned by three generic systems concepts: an awareness of inter-relationships; an acknowledgement of multiple perspectives; and an exploration of boundary choices.
We explain in the book how to apply those principles to any method of inquiry, whether it comes from the systems field, the evaluation field or another field.
See us at the Meet the Authors session at the AEA conference.
Visit the publisher's site.
Watch the Eleanor Chelimsky Forum Video Series
The Eastern Evaluation Research Society (EERS), an AEA affiliate, recently launched the Eleanor Chelimsky Forum with the support of the Robert Wood Johnson Foundation. The forum derives from a plenary paper authored by Chelimsky, Balancing Theory and Practice in the Real World, which was written for the 2012 EERS Annual Conference.
The forum produced a series of videos documenting inaugural event presentations and a 30-minute interview with Chelimsky. Visit the EERS YouTube channel to watch the video series.
In August 2012, the EERS Board of Directors established the forum as an annual conference event. From the EERS website: "The annual EERS Conference attracts practicing evaluators at various stages in their careers representing a range of disciplines (e.g., education, health, juvenile justice) and constituencies (e.g., consultancy, think-tanks, non-profits, for-profits, universities, funders)."
Read more about the Eleanor Chelimsky Forum.
You're Invited - Friday AEA Awards Luncheon Honors Six
AEA offers awards in seven distinct areas. In any given year, there may be no winner or multiple winners for a particular award. Each nomination is judged on its individual merit.
Below are the winners of AEA's 2013 awards:
Dominica McBride, Ph.D., Founder/ CEO and Evaluation Specialist, Become Inc.
2013 Marcia Guttentag Promising New Evaluator Award
Daniela Schroeter, Director of Research, The Evaluation Center, Western Michigan University
2013 Marcia Guttentag Promising New Evaluator Award
Thomas J. Chapel, MA, MBA, Chief Evaluation Officer, Centers for Disease Control and Prevention (CDC)
2013 Alva and Gunnar Myrdal Government Evaluation Award
Stewart I. Donaldson, Professor and Dean of the School of Social Science, Policy & Evaluation, and Director of the Claremont Evaluation Center at Claremont Graduate University
2013 Paul F. Lazarsfeld Evaluation Theory Award
Rebecca Campbell, Ph.D., Professor of Psychology and Evaluation Science, Michigan State University
2013 Outstanding Evaluation Award
Susan Kistler, Executive Director Emeritus, American Evaluation Association
2013 Robert Ingle Service Award
Join us for the presentation of this year's AEA Awards, scheduled for 11:55 a.m. to 1:35 p.m. in the International Ballroom Center Section. This forum provides a relaxing venue in which to socialize, network, and hear from the best of the best in our field.
This is a ticketed event. You can purchase tickets ($35) online or at registration. See you there!
You're Invited - Poster Exhibit & Meet the Authors Reception
Join us for AEA's Poster Exhibition and Reception on Wednesday, Oct. 16, 7-8:30 p.m. in the International Ballroom Center Section at the Washington Hilton.
You'll find more than 200 posters on display and ready for judging. The top picks will win free tickets to Friday's Awards Luncheon! And, as if that weren't enough, you'll have an opportunity to meet and greet some of the top evaluation authors in the field.
And did we mention that there will be food and drink? Come enjoy the posters, relax, and socialize. You never know who you might meet and what you might learn in an evening of low-key fun and camaraderie.
You're Invited - Silent Auction Friday Night to Benefit Traveling Presenters
From Hubert Paulmer, Silent Auction Event Coordinator
One of the highlights of every Evaluation conference is Friday night's Silent Auction. Coordinated by AEA's International and Cross-Cultural Evaluation Topical Interest Group, the auction raises funds for AEA's International Travel Awards, which offset travel expenses for presenters residing in developing countries or countries in transition and encourage their international participation.
What are some of the highlights of the night? In addition to a vast array of handcrafted clothing, jewelry, and accessories, as well as regional delights including coffees, chocolates, and other treats, the event offers the chance to bid on and win one-on-one time with several well-known evaluators, along with opportunities to attend sessions at The Evaluator's Institute (TEI) and sessions sponsored by the Claremont Evaluation Center.
This year, you'll also be able to bid on one-hour talks with Stewart Donaldson, Michael Patton, Jim Rugh, Donna Mertens, David Fetterman, Michael Scriven, Gail Barrington, and Hallie Preskill (wow, that's like an Evaluation Royalty List!). The Canadian Evaluation Society (CES) and The Australasian Evaluation Society (AES) have donated 2014 conference registrations.
In addition to all of this, you can bid on a range of books signed by your favorite "evaluation gurus." Come prepared to bid for yourself or on behalf of your organization, so that your colleagues can benefit, too. And don't forget the goodies from around the world you can take home for friends and family! The silent auction is also a great opportunity to socialize and network while enjoying the complimentary finger foods circulating around the room.
Feeling inspired? Drop by on Friday, Oct. 18, from 6:30 to 8 p.m. before you head out for the night. You will enjoy it, and it's for a great cause. See you at Evaluation 2013!
If you would like to donate an item in addition to bidding, please feel free to bring it along and drop it off at the AEA conference registration desk. If you have any questions, please email me at email@example.com.
Evaluation 2013: Plenaries and Presidential Strand
AEA is excited to announce the following plenary and presidential strand sessions at this year's annual conference. All but four of these sessions will be in the International Ballroom; the remaining four will be in Jefferson West and are noted below.
Opening Plenary: The Breadth of Evaluation Practice Today: Arenas, Sectors, and Influences, by Jody Fitzpatrick
Evaluating Pre-Kindergarten Education: What Have We Learned?, by William Gormley, Jennifer Brooks, and Ellen Peisner-Feinberg
Keeping the Flow of Evaluative Learning within Organizations, by India Swearingen
Big Data and Program Evaluation, by Nancy Potok, U.S. Census Bureau; and Jack Buckley, U.S. Department of Education
Plenary: Which Evaluations Should We Believe: Origins of Credibility and Legitimacy in Politicized Environments, by Arthur Lupia, Hal R. Varian Collegiate Professor of Political Science, University of Michigan
Exploring the New Frontiers of Research Design to Improve Evaluation Practice, by Tom Cook
Using Science as Evidence in Public Policy, by Tom Schwandt, Ken Prewitt, and Miron Straf
Data Visualization in the 21st Century, by Stephanie Evergreen
The Practice of Evaluation Today: An Anthropologist's Perspective, by Jacqueline Copeland-Carson, African Women's Development Fund-USA
Measuring Resilience at the Project and Program Level
Plenary: The Practice of Educational Evaluation Today: A Federal Perspective, by John Easton, Institute of Education Sciences, U.S. Department of Education
The State of Qualitative Methods in the Early 21st Century: The Top 10 Developments over the Last Decade and Emergent Challenges, by Michael Patton
Addressing Complexities in Impact Evaluation of Environmental Programs: Big Scales, Non-linearities, Spatial Spillovers, Persistence and Navigation in a Litigious Environment, by Paul Ferraro
Bridging the Divide between Measurement and Evaluation, Between Policy Design and Implementation, by Shelly Metzenbaum
Policy Analysis: How Is It Like and Unlike Program Evaluation, by Paul Decker, President of the Association for Public Policy Analysis and Management (APPAM), and Gary Henry
Evaluation Practice in International Organizations: The Case of the United Nations Children's Fund (UNICEF), the United Nations Development Program (UNDP), the Rockefeller Foundation, and the World Bank
Pushing the Boulder to the Top of the Mountain: Forging a Holistic Evaluation Strategy, by Demetra Nightingale, Chief Evaluation Officer, U.S. Department of Labor
Getting it Right: Using Systematic Reviews of Evaluations for Evidence-Based Decision Making, by Jill Constantine
Evaluation Practice in Different National Contexts: The Case of Mexico, South Africa, and the U.S., by Thania de la Garza Navarette, Jabu Mathe, and Jody Fitzpatrick
Hearing from Others in Evaluation: A Dialogue with Foundation Administrators, Panel: Ricardo Millett, Rosemary Gibson, a representative of the Community Foundation for the National Capital Region, and a representative from the Bill and Melinda Gates Foundation
The State of Evaluation Practice in the Early 21st Century. How Has the Theme of Evaluation 2013 Influenced Our Beliefs?, Panel: Len Bickman, Anne Vo, Leslie Fierro, Rakesh Mohan, and Michael Morris
Read more about each session.
New Jobs & RFPs from AEA's Career Center
- Research Manager at Walter R. McDonald and Associates Inc. (Sacramento, Calif.)
- Program Director at Michigan Fitness Foundation (Lansing, Mich.)
- Associate Program Officer, MLE Job at Bill & Melinda Gates Foundation (Seattle)
- Client Engagement and Attrition Survey RFP at Children's Services Council of Palm Beach County (Boynton Beach, Fla.)
- Senior Director, Consulting Services at Arabella Advisors (San Francisco)
- Research Associate at Harder+Company Community Research (San Diego, Calif.)
- Assistant Professor at Michigan State University (East Lansing, Mich.)
- Chief Accountability & Performance Management Officer at Baltimore County Public Schools (Middle River, Md.)
- Part Time Research Associate at Strategic Research Group Inc. (Columbus, Ohio)
- Evaluation Analyst at UC Santa Cruz - Educational Partnership Center (Santa Cruz, Calif.)
Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The association's mission is to:
- Improve evaluation practices and methods
- Increase evaluation use
- Promote evaluation as a profession; and
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only)