Newsletter: July 2013 | Vol 13, Issue 7
What Do You Wish for AEA in Five Years?

Dear colleagues,
I recently spent time with Beverly Parsons, our president-elect, and Leah Neubauer, the head of our Chicago local affiliate and an up-and-coming evaluator, interviewing candidates to be our executive director. You have received a notice of our choice and her background. This column is about something these interviews prompted me to think about: Where do we want AEA to be five years from now?
We were asked that question by one of our candidates, and I'd like to talk a bit about my response here. Certainly, I hope AEA continues to retain its members, recruit new ones, excite them about their careers, provide them with new ideas to improve their practice, and serve them well. I would like us to be in sound financial shape and have a new generation of leaders moving onto the board and into other governance positions, bringing us new ideas. But, most important to me, I would like AEA to become the "go-to" place for people and organizations that collect information for decision making. More specifically, I would like government agencies, foundations, and nonprofits, when contemplating a new evaluation policy or effort, to think, "Let's go to AEA and see what they have to say about this!" or "Let's see if AEA has written any position papers on this issue or can advise us." Or even, "Let's send one of our staff to the AEA Evaluation Institute or conference, or to one of the meetings of a local affiliate, and see what they can learn."
Why do I want this? Because the purpose of AEA is to enhance the public good and to do so through good evaluation. To do that, we have to be known, respected, and called upon by those who make decisions that affect the public good. We have to provide training and guidance to agencies, organizations, and stakeholders because evaluation is new and, to many, unknown. Many people view evaluation as drudgery, a necessary chore — "Oh, right, we have to send in that report [that data] again!" We realize that evaluation is not the only source of influence on decisions, but we want to make it a more important, valued source of information — one that people are curious about, want, and use.
What are your goals for AEA in five years? Although mine are rather lofty, there are many other legitimate, important goals — tending to our members' needs, working with other countries to develop their evaluation capacity and efforts, increasing the value of the profession through programs that credential individuals and help organizations pick qualified evaluators. What do you want for AEA? I'd love to hear your thoughts now (email me!) or in the listening session the board is planning for the conference in October. It's called "listening" because we want to listen to where you think AEA should be in five years. We're moving into a new era of managing AEA, with a new association management company and executive director. What do we want to accomplish for evaluation in this new era?
Sincerely,
Jody
Jody Fitzpatrick
AEA 2013 President
AEA Selects Denise Roosendaal as New Executive Director
After conducting a nationwide search, AEA selected Denise Roosendaal to serve as executive director.
Roosendaal brings substantial knowledge, skills, and experience to AEA. She has directed professional associations since 1999 and is a Certified Association Executive (CAE). Granted by the American Society of Association Executives (ASAE), this prestigious designation is conferred upon select individuals who possess the higher education, work experience, and professional development studies deemed essential to the chief staff executive role. In addition, she has training in the policy-based governance used by the AEA board and is experienced in strategic planning. Denise has served as AEA's interim executive director since April. She earned a master's degree in public administration from Syracuse University, the leading school in public affairs, and has a bachelor's degree in communications.
As an employee of SmithBucklin, Denise will oversee all aspects of AEA's operations, lead the SmithBucklin staff team, and assume responsibility for meeting the strategic goals set by the AEA board. In addition, she will continue to develop association membership and benefits, collaborate with affiliates to forge effective partnerships, and increase AEA's presence as the leader in evaluation in the United States.
If you have any questions, please contact AEA's headquarters at +1.888.232.2275 or +1.202.367.1166, or email info@eval.org. To learn more about the transition to SmithBucklin and what it means for AEA, read the Transition FAQ.
AEA's Values - Walking the Talk with Jonny Morell
Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.
AEA's Values Statement
The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.
i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.
ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.
iii. We value a global and international evaluation community and understanding of evaluation practices.
iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.
v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.
vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.
I am Jonny Morell, director of evaluation at the Fulcrum Corporation and editor-in-chief of the journal Evaluation and Program Planning. I was asked to think about how AEA's values "guide and impact my work." I had trouble with this request because, for me, the words "guide" and "impact" imply something external — a sign on a road, an instruction from a friend, a push or a nudge from the outside. Of course, "guide" can also be internal — the whisper of conscience, the questioning of belief. But the internal view does not work for me, either, because it connotes a duality of mind that I am not comfortable with.
The metaphor that does work for me is a reflection in a mirror — and an imperfect reflection at that. When I look at AEA's values statement, I see reflected back parts of myself. Because of the work I do, not all the values reflect with equal clarity. But what I see in the mirror makes me realize that I am drawn to AEA because of the correspondence between what I do as a professional and what the organization stands for.
My hands-on evaluation activity draws on beliefs about "quality," "ethics," "decision making," "program improvement," and "policy." If a novel safety program can be shown to work, if the evaluation is good enough, people with power and money might commit to action, and the public good will be served. I write a lot about complex systems and unintended consequences. It is impossible to deal with these topics without being aware of the varied interests of different groups. Varied interests generate complexity, and complexity generates unintended consequences.
Evaluation and Program Planning draws a truly global group of authors and reviewers. Doing this work gives me an overwhelming sense of how multifaceted evaluation is and how varied evaluators are. I was recently elected to the AEA board. I never thought much about how important it is that AEA be efficient, effective, responsive, transparent, and socially responsible. But I think about those values now. I think about them a lot.
Policy Watch - Government Use of Evaluation
From Cheryl Oros, Consultant to the Evaluation Policy Task Force

While EPTF activities and their results are important for improving federal evaluation, they also help to leverage stronger evaluation policies beyond the federal government. Federal policies often are translated into mandates for entities receiving federal funding, including state/local government and non-governmental service providers; federal policies frequently encourage changes in state/local policies, even when they are not mandates; they also reach outside the United States and affect international programs. To the extent AEA helps to shape those policies, it has influence on how evaluation gets done in many non-federal contexts.
The EPTF Roadmap was referenced in the recent U.S. Government Accountability Office (GAO) report, Program Evaluation: Strategies to Facilitate Agencies' Use of Evaluation in Program Management and Policy Making, GAO-13-570, June 26, 2013. The GPRA Modernization Act of 2010 (GPRAMA) aims to ensure that federal agencies use performance information in decision making and holds them accountable for achieving results and improving government performance. GAO, tasked to evaluate the act's implementation, examined the extent of agencies' use of program evaluations, factors that may hinder their use in program management and policy making, and strategies that may facilitate their use.
GAO found that most federal managers lack recent evaluations of their programs; yet, most that did have evaluations reported that they contributed to a moderate or greater extent to improving program management/performance. Examples of evaluation use and the strategies agency evaluators use to facilitate evaluation influence are given — noting that it usually takes a number of studies to influence change in programs or policies. GAO emphasized three basic strategies respondents described to facilitate evaluation influence: demonstrate leadership support of evaluation for accountability and improvement; build a strong body of evidence; and engage stakeholders throughout the evaluation process. OMB staff interviewed for the study noted that agencies vary so much that they cannot deliver a top-down evaluation mandate on what to do; instead, they work with other OMB and agency staff on how to use evaluations and institutionalize evaluations "as part of agencies' DNA."
On July 17, the White House Office of Management and Budget (OMB) sponsored another Evaluation Working Group meeting, addressing the Evidence and Evaluation Agenda in the FY2015 Budget. OMB plans to facilitate cross-agency teams that draw on a federal/non-federal community of practice in evaluation to find tools and solutions for conducting and using evaluations. The next meeting of the 300-member group will be held in early September 2013. If you work in the federal executive branch, register for the Evaluation Working Group distribution list. On this site, OMB provides evidence and evaluation guidance based on its interpretation of GPRAMA 2010; group activities; an OMB blog; a discussion board; and other evaluation resources and links.
Face of AEA - Meet Valerie Jean Caracelli
AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights Valerie Jean Caracelli.
Name: Valerie Jean Caracelli
Affiliation: U.S. Government Accountability Office
Degrees: B.A., Ph.D.
Years in the Evaluation Field: 27
Joined AEA: At its inception
Why do you belong to AEA?
AEA is our national association and my professional community. As an evaluator in a transdisciplinary and international field of study, I could join many associations, but AEA is the only one that brings us together to share, challenge, critique, and learn from one another.
Why do you choose to work in the field of evaluation?
It's not so much that I chose to work in evaluation as that I was compelled to work in this field because of the hope it offers. It is the "societal good" under construction. In graduate school, I first learned how evaluation theory and practice can make a difference. Now, as an evaluator in a supreme audit institution where oversight is the sine qua non, I see that the important role of evaluation in policy decision making is increasingly acknowledged. The executive branch's support of evaluation makes it more important than ever to be a voice for evaluation.
What's the most memorable or meaningful evaluation that you have been a part of?
I think each of us wants to believe that the answer to this question lies ahead. Much of my work has been at the meta-level, in the form of research on evaluation and how its findings can inform our practice. My collaboration with Jennifer Greene and Wendy Graham in 1989 and the attention we focused on mixed methods planted seeds that have continued to bear fruit over the years. The initial work was mainly a review of the literature and of evaluations that honored the substantial contributions of theory and method, both qualitative and quantitative. Further reviews with Leslie Cooksy underscored the importance of quality and the task of meta-evaluation in ensuring we did not stray from our commitment to quality. The burgeoning literature in both of these areas tells us there is much more to be done.
What advice would you give to those new to the field?
I'm still following more important advice than any I can give. Speak truth to power (propagated by Aaron Wildavsky); Be of courage (Eleanor Chelimsky); Consider values and value dialogue (Jennifer Greene); Don't fear uncertainty (Charles McClintock); Quality has many facets, don't forget beauty and justice (Leslie Cooksy, Ernie House); Honor the responsibility; Culture and language matter (Rodney Hopson); Responsibility for the general welfare ultimately depends on evaluating the status quo (unknown); Articulate the program theory, its systemic nature, and reveal interrelationships through a concept map (Bill Trochim); Focus. There is no time to lose (Stephanie Shipman); Don't criticize if you can't do it better (Urie Bronfenbrenner).
eLearning Update - Dig in with a Topical Series
From Stephanie Evergreen, AEA's eLearning Initiatives Director
Most of the time, our Coffee Break Webinars are demonstrative. We like to show some kind of tool or tip that makes the daily life of an evaluator a little bit easier. Occasionally, though, we deviate from demonstration and provide something more like a Coffee Break Tour, a brief explanation of a theory, an evaluation, or even a campus.
We have grouped these series for you on our webinars home page. You can click any title to open the pane containing the webinars in that series.
Each year, AEA highlights an Outstanding Evaluation Award Winner at the annual conference awards luncheon. You can catch up on these fascinating, prize-winning evaluations in 20-minute chunks with our Evaluation Award Winners series.

In 2012, we co-sponsored a webinar series on international monitoring and evaluation, featuring tours of the modules and tool kits established by Catholic Relief Services and the American Red Cross.
Thinking about grad school? The Evaluation Graduate Program Profiles series features snapshots of the programs at six U.S. universities, each explained by a student and a professor.
Finally, if you've been looking to bone up on your theory, we have a series for you. Watch Tina Christie explain the evaluation theory tree, or review Culturally Responsive Evaluation, Utilization-Focused Evaluation, and Empowerment Evaluation.
Our upcoming Coffee Break lineup includes a tour of Statistics Without Borders and a demonstration of social network analysis to answer causal questions.
In our Professional Development eStudy series, we are excited to introduce two new topics: Data Cleaning with Jennifer Morrow and Geographic Information Systems with David Robinson and Tarek Azzam. Check it out now. Registration closes Aug. 7.
Diversity - The Minority Serving Institution Fellowship
From Zachary Grays, AEA Headquarters
AEA has always aimed to create a community of evaluators of varied disciplines and practice. Now, more than ever, AEA is investing heavily in increasing diversity in the field of evaluation. Cultural inclusiveness is undoubtedly one of the key parts of that initiative and is something we pride ourselves on as the association continues to grow.
AEA promotes diversity through various initiatives, such as the Graduate Education Diversity Internship (haven't you always wanted to be a GEDI?), dissemination of our Cultural Competence Statement (check out aea365 for highlights from this month's Cultural Competence Week), and our Minority Serving Institution (MSI) Fellowship. The fellowship has a long history but has been housed at AEA only since 2005.
The aim of the MSI Fellowship is to recruit faculty and students from minority-serving and under-represented institutions and to develop their skills and competencies in program evaluation. These institutions often lack the resources to provide such significant knowledge dissemination and professional development support for their faculty and students. This, however, has not stymied the growth and professional development of past participants. To date, 58 faculty and their students have received AEA professional development through the MSI Fellowship. Many past participants have continued to add to the association through both individual efforts and AEA-supported initiatives, contributing immensely to our Cultural Competence Statement.
"As a previous MSI fellow and as a recent program director, I have found my development of skill and knowledge related to evaluation theory and practice to have made a significant positive difference in my academic and professional practice and my association with other fellows, AEA leadership, and others associated," said Art Hernandez, former MSI fellow and program director. "I have come to consider AEA as one of my primary professional associations and regularly recommend the organization and its activities. I know that my experience is not unique."
Coming soon, you will see a call for participants for this year's intensive fellowship. Fellows participate in a 12-month, guided fellowship focusing on evaluation teaching, scholarship, and the practice of evaluation as a profession. In that time, fellows can expect to participate in various association activities, including our upcoming conference in October, and to complete a culminating evaluation exercise, either individually or as a cohort. It is our mission to maintain fellows' retention and participation during and long after the program.
Download the MSI Fellowship application.
Potent Presentations - p2i Has You Covered
From Stephanie Evergreen, Potent Presentations Initiative Coordinator

The Evaluation 2013 conference program looks awesome! I can't wait to learn from your presentations. Get started focusing on a clear message, solid design, and crisp delivery now. Here are some of my favorite tools p2i has to support you:
Presentation Preparation Checklist - This short checklist outlines what to do and when to do it so you aren't rushing at the last minute. Share it with your co-presenters to stay on track.
Messaging Model Handout - This document suggests how much time to spend on the different parts of a typical 15-minute evaluation conference paper session and the order in which you should go through them.
Ignite Planning Worksheet - Download this one-page worksheet to think through the 20 slides you'll get for an Ignite session.
Presentation Assessment Rubric - Ready to start practicing your talk? This two-page rubric details key aspects of message, design, and delivery. Give it to your supportive practice audiences to gather their feedback.
Poster Guidelines - Even if you have decided to go the poster route instead of a paper, don't underestimate how long a poster can take to develop. Read through our revised Guidelines for Posters and then explore some strong example posters.
Whether this is your first presentation or your 500th, p2i has advice gleaned from the top presenters in our field and the best research out there today.
We'll dig into the three pillars of great presentations with live webinars in August. Click the links to register and attend live or watch the recording afterward.
Message - Tuesday, Aug. 13, 2-3 p.m. ET
Design - Tuesday, Aug. 20, 2-3 p.m. ET
Delivery - Tuesday, Aug. 27, 2-3 p.m. ET
Evaluation and Turbulent Times: Reflections on a Discipline in Disarray
AEA member Jan-Eric Furubo is editor of the book Evaluation and Turbulent Times: Reflections on a Discipline in Disarray, published by Transaction Publishers.
From the Publisher's Site:
Now more than ever, policy evaluation is an important component in addressing the world's economic crisis. Before it can do so, the discipline must adapt to changing economic and political environments. The contributors address a basic question: What impact do crises have on evaluation and how can evaluation contribute in times of turbulence?
Examining the state of evaluation today, the volume's editors cover a broad range of topics, including post-hoc evaluation; shifting economic paradigms; the World Bank Group's response to the global economic crisis; challenges in evaluating financial literacy; evaluating counter-terrorism programs; evaluation in the context of humanitarian crises; and why civil society organizations in sub-Saharan Africa matter in evaluating poverty interventions.
The contributors explore the role of evaluation in the search for solutions to global instability. They recognize, however, that in order to address unprecedented crises, evaluation itself needs to be evaluated and updated as part of the process of change and reform. This volume is the latest in Transaction's well-respected Comparative Policy Evaluation series.
From the Editor:
Today, we are all aware that the world has become a very turbulent place. Like many others, the editors of this book ask, "What does this mean for evaluation?" However, this book answers that question differently than other contributions have.
The book points out that evaluation as a form of societal praxis was very much defined by a notion of incremental change. Evaluation has largely been about improving and adjusting programs and actions in order to address social problems. Turbulence means that these incremental change processes have reached a turning point and that the earlier courses of action are closed. In the creation of new policies, new instruments, and new institutions, the question is not so much about how earlier instruments and institutions worked. In turbulent times, it is therefore unavoidable to draw on many forms of knowledge other than evaluation.
The book, therefore, also discusses what turbulence means for the relationship between evaluation and the social sciences and for our discussion about use and usefulness of evaluation itself.
About the Editor:
Jan-Eric Furubo has held many different positions within the National Audit Office in Sweden. He is co-editor of the International Atlas of Evaluation and Evaluation - Seeking Truth or Power.
Visit the publisher's site.
EvalPartners Supports Donor Efforts to Use Country Evaluation Systems
From Tessie Tzavaras Catsambas, EnCompass LLC President
In the Paris Declaration, international funding agencies signed a commitment to use country systems (including for evaluation) and to respect national priorities and policies. The Evaluation Capacity Development Task Force of the OECD/DAC (EvalNet) came together in Helsinki, Finland, in June 2013 to discuss donor efforts to implement this commitment and to approach development investments in a way that builds local evaluation capacity. I am pleased to report that EvalPartners was invited to attend and sent four members: Marco Segone, Natalia Kosheleva, Issaka Traore, and Tessie Catsambas (two are co-chairs of EvalPartners, and three are co-chairs of the Enabling Environment for Evaluation Task Force).
The two-day meeting began with an update on new developments in donor efforts in Evaluation Capacity Development (ECD), explored the current status of enabling factors and obstacles for donors' use of national evaluation systems, discussed EvalPartners' and the IOCE's activism in building evaluation civil society's capacity, and reviewed special proposals by countries. Specifically, France is interested in supporting evaluations undertaken jointly by more than one funder to promote harmonization, while the United Kingdom is studying the demand and supply of evaluation and evaluative research in selected sub-Saharan African countries.
[Photo: Riitta Oksanen (Finland's Ministry of Foreign Affairs) hosts participants in Helsinki.]
The significance of this meeting is that international funders of evaluation are now recognizing the importance of the emerging global evaluation civil society. We — the associations and societies of evaluation — are now invited to the table. Our voice is part of discussions of donor policies and strategies to strengthen evaluation in our countries. The work has just begun, but we need to celebrate our participation as recognition of our value and strengthened image. This is the contribution of EvalPartners. It has united Voluntary Organizations for Professional Evaluation around the world. Together, we are more powerful. Kudos to the visionary leaders of the AEA, and of other countries' evaluation associations and societies, who are supporting a larger and more strategic role for evaluation that promotes inclusion, equity, and gender equality!
GAO Report: Program Evaluation: Strategies to Facilitate Agencies' Use of Evaluation in Program Management and Policy Making
The U.S. Government Accountability Office recently released a report, Program Evaluation: Strategies to Facilitate Agencies' Use of Evaluation in Program Management and Policy Making. Access the report.
GAO surveyed a random sample of federal civilian managers to assess the extent of their use of program evaluations and barriers to their use. GAO also interviewed experienced evaluators on barriers and strategies to facilitate evaluation use at five agencies in the Departments of Agriculture, Labor, and Health and Human Services. Governmentwide, GAO found that most federal managers lack recent evaluations of their programs. Yet, 80 percent of managers that did have evaluations reported that they contributed to a moderate or greater extent to improving program management or performance. The report describes examples of evaluation use and the strategies agency evaluators use to facilitate evaluation influence.
AEA asked Stephanie Shipman, assistant director for the Center for Evaluation Methods and Issues at GAO, to elaborate on the study. Below are her responses.
Q: Why did GAO perform this study?
A: The GPRA Modernization Act of 2010 added several provisions to encourage greater federal agency use of program and agency performance information to meet pressing government challenges. We have seen increases in performance data since GPRA was enacted in 1993, but not increases in program evaluation that could provide the kind of targeted guidance needed to drive program improvement. Thus, as part of GAO's mandated study of the 2010 Act's implementation, we wanted to learn more about the availability and use of program evaluation governmentwide.
Q: What is the most important finding from the study?
A: I think our study's most important finding is that there is a lot evaluators can do to increase the use of their findings by working closely with program stakeholders throughout the planning, conduct, and reporting stages to meet their information needs. The evaluators we interviewed described many readily adoptable practices, but their influence on programs and policy also reflects their agencies' commitment to promoting and supporting the use of evidence in decision making, which we hope is infectious!
Read the full report.
Evaluators Visit Capitol Hill Initiative
From Brian Yoder, American Society for Engineering Education
When I attended the American Evaluation Association conference in Minneapolis last year, people who learned that I live in Washington, D.C., and serve as president-elect of the local AEA affiliate, the Washington Evaluators, asked me two things. First, they said they would love to have President Obama come and speak at AEA and wondered if I could make that happen. My answer: No. Not going to happen. Second, they asked whether they could have opportunities to speak with their members of Congress about evaluation. My answer: Yes. This was more realistic.
I pitched the idea, originally called Evaluators Meet Their Congress Person Initiative, to Susan Kistler and members of the Evaluation Policy Task Force, and they liked it. It turns out Congress will be in recess during the AEA conference, so we've put less emphasis on meeting members of Congress and more on scheduling meetings with congressional staffers and providing materials, and we've changed the name to Evaluators Visit Capitol Hill.
I've lived and worked in D.C. for the past seven years, as a contractor, in government, and at a professional society, and I believe government processes can be helped through the use and application of evaluation. As the saying goes, there are no problems, only opportunities, and I've seen plenty of opportunities to improve government processes and to improve the use of evaluation in assessing government programs.
I've always admired the work of AEA's Evaluation Policy Task Force (EPTF). I view their guidance on evaluation of government programs and processes as sensible, and it should be more widely known and applied within government. A quick synopsis of EPTF's guidance: Government programs and processes are complex, so good professional judgment is required when applying social science methods and principles to better understand what government does and the outcomes of government programs. Unfortunately, good judgment is too often trumped by the simplistic application of "preferred" methods and politics.
Traditionally, I think evaluators have tried to keep their role separate from implementation and the policy-making process. But, based on my work in D.C., I've come to believe that policy makers and program implementers would be well served by evaluators' closer and more direct involvement in policy making and program implementation. When you work in an environment where the answers to important questions were needed yesterday, and the questions that need answering keep changing, the traditional approach of formative evaluation leading to summative evaluation becomes too slow and irrelevant.
My hope is that this initiative can accomplish three things:
- Make more policy makers aware of AEA and the work of EPTF.
- Expand EPTF's reach by creating new connections for it.
- Give evaluators the opportunity to be part of the early policy-making process by providing materials on evaluation to policy makers before policy is made.
I look forward to keeping you informed of our progress!
Brian Yoder is the director of assessment, evaluation, and institutional research at the American Society for Engineering Education (ASEE). He oversees annual data collection for ASEE's benchmark surveys of engineering schools and leads research and evaluation of federally funded and foundation-funded projects in areas related to engineering education and STEM education. Yoder serves as president-elect of the Washington Evaluators, a local affiliate of the American Evaluation Association.
Did You Know? AEA is a Member of COSSA
The American Evaluation Association is a member of the Consortium of Social Science Associations (COSSA), an advocacy organization that promotes attention to and federal funding for the social and behavioral sciences. It serves as a bridge between the academic research community and the Washington policy-making community. Its members consist of more than 100 professional associations, scientific societies, universities, and research centers and institutes.
In its many activities, COSSA:
- Represents the needs and interests of social and behavioral scientists;
- Educates federal officials about social and behavioral science;
- Informs the science community about relevant federal policies; and
- Cooperates with other science and education groups in pursuit of common goals.
COSSA works with federal agencies and with the relevant congressional committees and offices to explain the importance of social and behavioral sciences to America's economic and national security.
"COSSA has long done important advocacy work in support of federal funding for research, for the appropriate use of evidence in decision making, and for increased diversity in research fields. COSSA also helps inform federal officials about the value of research and evaluation, and it shares valuable information, such as about appropriations, with its member organizations. Membership in COSSA is a way for AEA to join with many other professional and scientific organizations in advocating for evaluation and more broadly for the social and behavioral sciences," said Melvin Mark, co-chair of AEA's Evaluation Policy Task Force.
"AEA's making common cause with COSSA makes us both stronger. The joining of our voices will enhance public support for what we do and increase the odds that policy makers will heed the advice of scientists like us," said George Grob, co-chair of AEA's Evaluation Policy Task Force. "At the same time, we stand to learn a lot from our comrades in related professional disciplines."
Registration Now Open for Evaluation 2013
Evaluation 2013
Conference: Oct. 16-19, 2013
Professional Development Workshops: Oct. 14-16 & 20, 2013
Early Registration Discounts Available Through Sept. 12
Registration is now open for Evaluation 2013, to be held at the Washington Hilton in Washington, D.C. At Evaluation 2013, connect with thousands of industry professionals from across the United States and around the world. Join your colleagues for 650 sessions spanning the breadth of the field.
Program Content
The conference program is available to view online. You can search the program and start to build your agenda before you arrive in Washington, D.C., this October. Search by keyword, session type, TIG or sponsoring group, or presenter. Conference sessions take place Oct. 16-19.
In addition to educational sessions, more than 55 professional development workshops are offered with in-depth content presented by experts. Workshops take place Oct. 14-16 and 20.
Hotel Accommodations
All Evaluation 2013 conference activities will take place at the Washington Hilton. Discounted hotel rates are available at the Washington Hilton and other properties based upon availability through Sept. 18, 2013. Please be sure to book your room early!
Don't wait! Register and book your hotel today to access the greatest savings. To learn more and register for Evaluation 2013, visit the AEA website.
New Member Referrals & Kudos
Last January, AEA began asking, as part of the new member application, how each person heard about the association. It's no surprise that the most frequent response is from friends or colleagues. You, our wonderful members, are the heart and soul of AEA, and we can't thank you enough for spreading the word.
Thank you to those whose actions encouraged others to join AEA in July. The following people were listed explicitly on new member application forms:
Haroon Balwa * Allison Van * Althea Pestine
New Jobs & RFPs from AEA's Career Center
Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.
About Us
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The association's mission is to:
- Improve evaluation practices and methods
- Increase evaluation use
- Promote evaluation as a profession and
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only)