Newsletter: December 2011, Vol. 11, Issue 12


Beginnings and Endings - 2011


Greetings evaluation colleagues near and far, 

 

We are nearing the end of a calendar year, and we will soon embrace the beginning of a new one. We are in a time of endings and beginnings, a time when one circle ends and the next begins, though the seam may be hard to find. We are in a time when the rhythms of our social and cultural rituals and the rhythms of the earth are in sync, if not in harmony. These circling rhythms afford spaces for reflection, for looking back to what has been and for looking forward to what can be. 

 

Looking back, this has been a turbulent year on our little planet. We have experienced multiple violent protestations of our mother earth with tragic consequences for both landscapes and life. We have suffered the stark vulnerability of our financial institutions to raw greed. We have witnessed, with both horror and helplessness, the continued savagery of humans against other humans ... in the Mexican drug wars, the arrest and torture of citizen protestors by some Arab governments, and the ongoing wars encircling the globe.

 

Looking forward, I ask again, what does evaluation have to do with these contemporary global catastrophes and tribulations? And I respond again:  

  • "If you're not part of the solution, then you're part of the problem" (Eldridge Cleaver). Evaluation offers opportunities for inclusive engagement with the key social issues at hand. 
  • Most evaluators are committed to making our world a better place. Most evaluators wish to be of consequence in the world.

And I say, one more time, that evaluators can fulfill this yearning to be of consequence largely through the values inherent in our practice. Our values show up in our decisions about whose interests are addressed in our work and in our statements of what constitutes a quality program. Our values show up in our designs and our methodologies. And our values show up in our vision of evaluation's role and contributions to society. We are blessed with a rich and generative plurality of values commitments and a rich diversity of standpoints, talents, and visions within our own community. We have celebrated these blessings this past year through the theme of Values and Valuing in Evaluation. May the cyclical rhythms of endings and beginnings continue as we turn toward next year's AEA theme of Evaluation Ecologies, offered by 2012 president Rodney Hopson. May we engage fully with our ecological sense of place and possibility, even as we continue to count our blessings.

 

In gratitude for all that is AEA,

Jennifer  

Jennifer Greene

AEA President 2011  


In This Issue
Policy Watch - EPTF 5 Year Review
EPTF Evaluation Input
AEA 2011 Award Winners
AEA's Values - Walking the Talk
TechTalk - Embed Codes
2012 eStudy Indicators
Meet 2013 President Jody Fitzpatrick
Meet Deborah Ling Grant
Book: The Science of Science Policy
Book: Restoring the Innovative Edge
New Job Postings
Evaluation Humor
Get Involved
About Us
Policy Watch - EPTF's Five-Year Review Underway
From George Grob, Consultant to the Evaluation Policy Task Force

As the year draws to a close, it is a natural time for reflection. For AEA's Evaluation Policy Task Force (EPTF), this conveniently coincides with the first phase of the Board's evaluation of the EPTF's work, for which we were asked to compile documentation of our efforts to date, spanning the past five years. This compilation will be shared with an external expert review panel as one piece of evidence to inform its broader review. 

Today, I wanted to share with you the current draft of that compilation, and also to note a few questions that it addresses:

  • What is the EPTF supposed to be doing? Look at the charge on page 4
  • Who exactly is on the EPTF? See the task force member profiles that begin on page 10
  • What's happening with the EPTF's work in the federal arena? Check out the section on the consultative campaign, beginning on page 16
  • How is the EPTF shaping the discussion and culture regarding evaluation policy in the evaluation community? See the section on the public presence initiative that starts at page 32
  • Who's actually using the Evaluation Roadmap for a More Effective Government? Check the citations and use record beginning at page 68
  • What is this costing the association? The budget expenditures since 2007 are on page 196

I encourage you to explore the compilation further: ask your own questions, look at the documents the EPTF has developed and the record of my own work, and click through to the external items, where you'll gain a better understanding of the policy arena in which we're working.

 

We're proud of what we've accomplished, and we look forward to the review from the expert panel. It is the panel, the membership, and ultimately the Board that must determine whether the work completed represents good value for the resources invested. We wanted to be as transparent as possible in sharing the evidence we have available, and we encourage you to make your own determination. You can share your input to the evaluation via the internal task force's call for comments, found elsewhere in this newsletter, or via their January survey if you are a member of the Evaluation Policy Topical Interest Group or the EPTF discussion list.

 

Download the compilation from the AEA Public eLibrary here 

 

Go to AEA's Evaluation Policy Task Force website page 
EPTF Evaluation - Input Due January 17

We are writing as members of the Internal Evaluation Team charged with evaluating the Evaluation Policy Task Force (EPTF). We are seeking feedback from AEA members who have opinions about the EPTF's endeavors. Members of the EPTF discussion list and of the Evaluation Policy Topical Interest Group (EP TIG) will receive an in-depth survey in January. If you are not a member of the discussion list or of the EP TIG, please feel free to offer any comments or input by emailing them to eptfeval@eval.org by Tuesday, January 17.

 

If you are interested in obtaining more information about the task force's work, you can see the materials that they compiled in the eLibrary here.

 

Patricia Rogers (Chair), Stewart Donaldson, and Kathryn Newcomer 

AEA Recognizes Outstanding Service & Contributions 

The American Evaluation Association honored four individuals and three groups for outstanding work at its Evaluation 2011 conference in Anaheim, CA. Honored this year were recipients in six categories who have been involved with cutting-edge evaluation/research initiatives that have impacted citizens around the world. We spotlight three below. Others were profiled earlier or will be profiled next issue. Thanks to all for their nominations and their support. Our congratulations to all!    

 

2011 Alva and Gunnar Myrdal Evaluation Practice Award

Leonard Bickman, Psychology Professor, Peabody College, Vanderbilt University

 

Bickman has a distinguished 40-plus year career as a social psychologist and is recognized as a pioneer in applied research and evaluation. He spearheaded a comprehensive study of children's mental health services more than a decade ago that involved 1,000 children and their families over a five-year period - one of the largest mental health services demonstration projects ever conducted on children and adolescents. He and his colleagues have since developed the Contextualized Feedback Systems (CFS) model in an effort to improve mental health services and educational leadership. "Driven by the lack of positive findings for children's mental health services, Len has dedicated this stage of his career to trying to improve this area of intervention," notes Deb Rog, AEA's 2009 President who nominated Bickman for the award. 

 

2011 Robert Ingle Service Award 

 

Robin Lin Miller, Psychology Professor, Michigan State University


Miller is being recognized for her more than 15 years of active service to AEA. She helped transform AEA's annual conference from a small, intimate gathering of professionals and colleagues into an international gathering that today attracts more than 2,500 attendees worldwide. She also oversaw the conversion of the association's print-only journal, the American Journal of Evaluation, into an electronic alternative that has seen an increased number of editorial submissions, a greater diversity of content reviewers, and impressive gains in subscribership worldwide. She served as Conference Program Chair from 2000 to 2003 and as associate editor of the American Journal of Evaluation (AJE) from 2001 to 2004, and was appointed Editor-in-Chief for two consecutive terms, serving from 2005 to 2009.  

    

2011 Marcia Guttentag Promising New Evaluator Award 

 

Margaret Hargreaves, Senior Health Researcher, Mathematica Policy Research, Cambridge, MA

  

Hargreaves, the mother of a college student herself, brings more than 20 years' experience with state and local government in Minnesota, first as an EEOC investigator for a state human rights agency, then as a management analyst, and then as a public health planning supervisor. Hargreaves has considerable experience as a professional trainer in systems evaluation, in both face-to-face seminars and online webinars. In 2010, she wrote Evaluating System Change: A Planning Guide, used in graduate courses at Harvard, by the University of Chicago's Medical and Social Service Administration Schools, and by other institutions including the National Institutes of Health and the Living Cities consortium of foundations.

 

   

"It is an honor to lead an association with the caliber of dedicated professionals like our award-winners," says AEA President Jennifer Greene. "Their work demonstrates the substantial value of evaluation to diverse policy and program arenas in our society and around the globe." 

  

Stay tuned for more information on AEA's awards nominations and upcoming deadlines.    

  

Go to AEA's Awards Page  

 

Introducing: AEA's Values - Walking the Talk

Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values. We start with AEA's 2011 President, Jennifer Greene. 

 

AEA's Values Statement

The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.

 

  i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.

  ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.

  iii. We value a global and international evaluation community and understanding of evaluation practices.

  iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.

  v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.

  vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.

 


I'm Jennifer Greene, currently a professor of educational psychology at the University of Illinois at Urbana-Champaign. I'm delighted to be a part of Illinois's long legacy of evaluation education and practice. And I'm equally honored to be currently serving as AEA's president, following on from previous participation as a member of the board, and several committees and task forces.

AEA's values statement is one of the jewels in AEA's crown. The statement features values relevant to our varied and far-ranging organizational activities, to our organization's well being, and to how members of this association interact and engage with one another. With the values statement, our activities, our organizational health, and our internal interactions are importantly characterized as ethical, respectful of diverse cultures, welcoming, inclusive, socially responsible, and more. For me, this values statement thereby energizes and nurtures my participation in AEA activities and leadership, as such service is importantly positioned in service to these values.

 

AEA's values statement also provides inspiration and guidance for my professional practice as an evaluator. The statement offers a broad vision of quality and excellence in evaluation while not prescribing any particular form of or approach to evaluation. This vision challenges me to think well and critically about several integral dimensions of my own practice, including the following.

  • How can I ensure that my evaluations are "ethically defensible," a concept that goes well beyond IRBs and informed consent? This question invokes my own valuing of the relational dimensions of evaluation - the character of my presence and interactions with others in the evaluation context. And I begin to ponder the ethical strands of these relationships that I carefully and mindfully establish in my evaluation practice.
  • I have long valued cultural responsiveness and respect in evaluation, but how well have I directed these commitments toward the enhancement of "effective and humane organizations"? How could I do so?
  • I aspire for my evaluation work to be practically consequential - that is, to contribute to wise "decision-making processes, program improvement, and policy formulation." I also aspire to an evaluation practice that is morally and politically consequential, contributing to "the enhancement of the public good." I construe the public good as the quality of public reason and the inclusiveness of public discourse about important public issues. I am proud to be a part of an organization that can help me fulfill my aspirations and dreams.
TechTalk - Using Codes to Embed Video and Other Items in the AEA eLibrary
From LaMarcus Bolton, AEA Technology Director

What's an embed code? It is a snippet of code that allows you to embed an item - often a video or an interactive visual developed in Flash - into a website or other online system. Why do you care if you aren't a webmaster? You can use embed codes in the AEA eLibrary!

 

Three steps for embedding items in the eLibrary:

  1. Locate the embed code
  2. Paste the embed code in the DESCRIPTION field of the "submit an entry" form when you upload into the eLibrary (need further help uploading? See our 'How to upload' video)
  3. Proceed to complete the remainder of the submit an entry form

When you view your entry, the embedded item will appear right in the eLibrary entry.  

 

Locating the embed code can sometimes be a challenge. Here are a few examples; from these, you can likely get the idea of how to identify embed codes in other contexts.

 

PearlTrees: One of Susan's favorite tools, PearlTrees creates a map of linked web resources you have curated. When viewing a PearlTree on the PearlTrees site, click the center node in any tree to bring up the team box, and in that box look for the icon at the right - click the icon to get the code you'll need to copy. Here's a link to a PearlTree in the eLibrary showing over 30 low-cost/no-cost Tech Tools from a session that Susan and I presented at Evaluation 2011.

 

YouTube: On YouTube, click the "Share" button under the video and then the "Embed" button. A small box will appear containing the code, which you can copy as-is or customize further.

 

Vimeo: Vimeo is an alternative to YouTube that is often used by nonprofits (including AEA!) because of its more professional toolset and look and feel. Jolene VillaLobos talks about Vimeo on aea365 here. On Vimeo, you'll see the same icon as in PearlTrees, on the right side of the video. Click the icon to get the ready-to-copy code. Organizational Research Services has a great video in which stakeholders talk about The Outcome Map. The video is hosted on Vimeo and embedded, alongside their related session handouts, in the AEA eLibrary here.

 

I have two quick items left to note. First, if you want to ramp it up and customize your embed codes, most sources give you the option to do so; the best width for embedding items in the eLibrary is 600 pixels. Second, be sure that you have the rights or permission to anything that you embed and that, if the item is from another source, you give appropriate attribution in the description.
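To illustrate, here is the general shape of a video embed code - a short HTML iframe snippet whose width and height attributes you can edit before pasting it into the DESCRIPTION field. The video ID below is a placeholder, not a real video, and the 600 x 338 dimensions are simply one 16:9 pairing that fits the eLibrary's recommended 600-pixel width:

```html
<!-- Illustrative embed code; VIDEO_ID is a placeholder -->
<!-- width="600" matches the eLibrary's recommended width; height keeps 16:9 -->
<iframe width="600" height="338"
        src="http://www.youtube.com/embed/VIDEO_ID"
        frameborder="0" allowfullscreen></iframe>
```

Whatever the source, the pattern is the same: copy the snippet the site gives you, adjust the width attribute if needed, and paste the whole thing into the description field.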

 

Bonus! If you are an AEA Topical Interest Group (TIG) webmaster, you can learn more about embedding items on your TIG's site from the tutorial prepared by TIG site coordinator Ben McClanahan and available online here.

 

If you have questions about embedding items in the eLibrary, please do not hesitate to email me at marcus@eval.org (but please note that we can't provide general tech support or assistance with embedding on other sites).

2012 eStudy Indicators
From Stephanie Evergreen, AEA eLearning Director

2012 is almost upon us, and I wanted to share with you our plan for AEA's eStudy program. Our mission is to bring you high-quality, affordable, evaluation-focused training in real time, right at your desktop. How will we know if we've succeeded? Here are the indicators we're tracking:

  • The eStudy program will plan, schedule, and execute 12 or more eStudy workshops in 2012
  • Average satisfaction ratings for the eStudy program will meet or exceed those for workshops offered at the annual conference
  • Average attendance across the eStudy offerings will be greater than or equal to 20 registrants

2012 will represent the first full year of our eStudy program. We're focused on learning from other associations regarding their best practice suggestions, as well as from our own presenters, attendees, and staff about how to improve the program with each iteration. So, at the same time that we're tracking quantitative indicators, we're reflecting on the process and actively listening to our facilitators and registrants for suggestions, ideas, and concerns.

 

Want to improve your skills? Join us for an eStudy workshop in 2012. We have two offerings in January and February, each 6 hours in length and divided into four 90-minute sessions; registration is $150 for full members or $80 for students.

  • Social Network Analysis with Kimberly Fredericks
    January 10, 17, 24, 31, 1:00-2:30 Eastern Time
  • Applications of Correlation and Regression: Mediation, Moderation, and More with Dale Berger
    February 8, 15, 22, 29, 1:00-2:30 Eastern Time

We're also offering one 3-hour course, divided into two 90-minute sessions, for $75 for full members and $40 for students.

  • Empowerment Evaluation with David Fetterman
    February 21 & 23, 3:00-4:30 Eastern Time

Do you have questions, concerns, ideas, or insights regarding our eStudy? Please don't hesitate to contact me. I'm Stephanie Evergreen, AEA eLearning Director, and I may be reached at stephanie@eval.org.

 

Go to the eStudy page to learn more and register 

Meet Jody Fitzpatrick - 2013 President

After our election, we promised a quick introduction of our three incoming Board members as well as the 2013 President. We'll spotlight each individually and thank them for their commitment to service.  

   


Jody Fitzpatrick will serve as AEA's 2013 president. Based at the University of Colorado Denver in its School of Public Affairs, Jody has long been active with AEA. She has served on its Board, as Chair of the Teaching Evaluation TIG and on numerous committees. She has also served as Associate Editor of the American Journal of Evaluation and sits on the editorial boards of AJE and New Directions for Evaluation. She is co-author of Program Evaluation: Alternative Approaches and Practical Guidelines and Evaluation in Action.

 

"My interest in becoming President of the American Evaluation Association is inspired by my desire to thank the Association for the many years of learning and community it has given to me. Each year, I return from the annual conference stimulated by all I have heard and learned and with admiration for the important work so many evaluators are doing. I also love our organization - its welcoming and collegial atmosphere, the increasing diversity of our members, and the spaces AEA provides for exciting exchanges of ideas and experiences across many different venues. I am proud of the way AEA has grown to be a vibrant, thriving organization with creative, energetic staff and thoughtful, inspiring leaders. As a long-term member involved in many areas, I want to contribute to its continued good health."

 

In her ballot statement, Jody pledged to maintain continuity with AEA's transition to the policy-governance model and to explore ways to continue achieving the diversity vital to evaluation work that reflects the values of its clients and stakeholders. She has also expressed interest in expanding engagement with evaluation policies to the state and local levels and with nonprofits.

 

"We have learned that evaluation policies have a major effect on our work and have undertaken strong, successful initiatives with our Evaluation Policy Task Force. But, we have only just begun. In today's globalized world, members of AEA must be knowledgeable about evaluation policies and practices in other countries, different states and different types of organizations to consider the ways evaluation can work most effectively in our own contexts."

 

Go to AEA's Leadership Page 

 

 

Face of AEA - Meet Deborah Ling Grant, Project Manager 
AEA's more than 7,000 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights Deborah Ling Grant, a former intern whose experiences helped shape her outlook.


 

Name, Affiliation: Deborah Ling Grant, UCLA School of Public Health
Degrees: MBA, University of Southern California Marshall School of Business; MPH, UCLA School of Public Health; PhD, UCLA School of Public Health
Years in the Evaluation Field: About 5 years
Joined AEA: In 2009, after being accepted as part of the sixth cohort of the Graduate Education Diversity Internship (GEDI) Program - we named ourselves "Evolution."
AEA Leadership: I am involved in the Multiethnic Issues in Evaluation TIG and the Health Evaluation TIG.
 
Why do you belong to AEA?

"AEA has been a wonderful way to meet other students in evaluation and to interface with mentors in the field. The GEDI program helped pave the way by introducing me and my fellow cohort-mates to AEA conference activities and fostered camaraderie with all the past cohorts involved in the GEDI program and the Robert Wood Johnson Fellowship program as well. We had the opportunity to meet informally for dinner with the AEA Board, and my cohort-mate Lisa Aponte-Soto and I presented the results of our cohort's evaluation of the program to the Board at the AEA Evaluation Conference in San Antonio the following year."


Why do you choose to work in the field of evaluation?

"I worked as a business management consultant before switching gears to focus on public health evaluation. I worked in a small firm which meant that I could get hands-on experience in every aspect of the project, but also meant that I was often tasked with doing nearly every job. Through this experience, I discovered I enjoy project-oriented work and after my training in public health, I wanted to participate in evaluations that were philanthropic. I choose to work in public health evaluations and clinical translational science evaluations specifically because I hope that the results of my evaluation work will bring meaningful change to the way that health services are delivered in our communities."


What's the most memorable or meaningful evaluation that you have been a part of - and why?

"My most memorable evaluation experience has definitely been with my fellow GEDI program cohort. We had a chance to work together on a group evaluation of the program and also worked independently in internship placements. My internship placement was with the Health Services Department at my own school so it was a natural extension for me to continue to work on health evaluations related to my doctoral dissertation work. My GEDI cohort mates have been a fabulous support network and we remain close both personally and professionally. In fact, a group of us just presented at the last AEA conference on building leadership in culturally responsive evaluation. We have a strong bond and I am certain that we will continue to work together on future evaluations and endeavors!"

 

What advice would you give to those new to the field?

"My doctoral program is heavily research oriented and it has been difficult to find strong mentors in evaluation or a variety of learning venues that focus on evaluation skills and techniques. I encourage students and new evaluators to actively seek mentors through AEA who can expose them to opportunities in various types of organizations and evaluations. In this way, new evaluators can see what types of evaluations and environments are the best fit for their own cultural lenses and preferences."

 

If you know someone who represents The Face of AEA, send recommendations to AEA's Communications Director, Gwen Newman, at gwen@eval.org.

 The Science of Science Policy

AEA member Stephanie Shipp is an editor of The Science of Science Policy, a new book from Stanford University Press.

 

From the Publisher's Site:

"Basic scientific research and technological development have had an enormous impact on innovation, economic growth, and social well-being. Yet science policy debates have long been dominated by advocates for particular scientific fields or missions. In the absence of a deeper understanding of the changing framework in which innovation occurs, policymakers cannot predict how best to make and manage investments to exploit our most promising and important opportunities.

 

"Since 2005, a science of science policy has developed rapidly in response to policymakers' increased demands for better tools and the social sciences' capacity to provide them. The Science of Science Policy: A Handbook brings together some of the best and brightest minds working in science policy to explore the foundations of an evidence-based platform for the field."

 

From the Editors:

"An editorial in the May 2005 Science magazine by John H. Marburger III, then the Director of the Office of Science and Technology Policy in the Executive Office of the President, encouraged the formalization of the Science of Science Policy within the federal and academic communities. His editorial, Wanted: Better Benchmarks asked "How much should a nation spend on science? What kind of science?" His challenge highlighted the lack of analytical capacity to make the best science policy decisions. He also spoke on this topic in many other venues. He wanted to see the science of science policy evolve into a discipline and "community of practice," allowing for academics, administrators and policymakers alike to contribute to and benefit from the school of thought. The National Science Foundation and the National Science and Technology Council have each initiated new programs in response. The goal for the Science of Science Policy Handbook is to provide essays authored by scientists and policy practitioners that showcase the multiple perspectives and inter-disciplinarity of the field."

 

About the Editors:

The volume was co-edited by Kaye Husbands Fealing, Julia Lane, John H. Marburger III (now deceased) and Stephanie Shipp. Stephanie is a Senior Research Analyst at the Science and Technology Policy Institute. She is a former Director of the Economic Assessment Office in the Advanced Technology Program at the National Institute of Standards and Technology, and an AEA member since 2002. 

 

Go to the Publisher's Site

Restoring the Innovative Edge

AEA member Jerald Hage is the author of Restoring the Innovative Edge, a new book from Stanford University Press.

 

From the Publisher's Site:

"Considerable evidence indicates that the U.S. is falling behind when it comes to innovation. In part, this shift stems from the globalization of research and the advancement of other nations. But, it also arises from a widespread failure to adapt to the competitive environment generated by the evolution of science and technology.

 

"The objective of this book is to provide possible remedies for eight key obstacles that the U.S. faces in restoring its innovative edge. Understanding that these remedies are complex, each chapter also discusses the dilemmas and impediments that make change a challenge. Unlike other books that suggest simple fixes to the U.S. innovation crisis, this book argues that the management of innovation requires multiple interventions at four different levels: in research teams, organizations, economic and non-economic sectors, and society at large."

 

From the Author:

"If one starts with a very simple assumption that the world of science and technology grows more complex across time then the prescription of simply spending more money and training more scientists is inadequate. The innovation process must be managed. But, to manage this process requires that evaluators identify obstacles that are preventing good scientific research from being accomplished. As I like to joke, all the scientists are above average but the problem still remains as how to make them more innovative. The obstacles exist at multiple levels, including American society and its use of the neo-classical economic model as both a policy model and a model for evaluation. Hence the book attempts to indicate how evaluators can identify obstacles to innovative research at multiple levels and provide solutions. However, rather than simply give a single solution to each obstacle, the book considers alternatives. Choices vary by the kind of research organization and the sector of science and technology. They also have negative consequences. There are no simple answers. Thus, in sum, the book provides a new evaluation model, a new policy model based on the management of innovation, and behind it, a new socio-economic paradigm."

 

About the Author:

Jerald Hage is Director of the Center for Innovation, University of Maryland. He started studying organizational innovation in the 1960s, and since then has also worked on institutional analysis in health, education, and welfare - primarily with comparative studies of Europe. Hage has authored or co-authored 16 books and more than 100 papers. Presently he is directing two major research projects funded by the National Science Foundation, as well as a project for the STAR division of the National Oceanographic and Atmospheric Administration. And, he has just started a major evaluation project of malaria control in India.

 

Go to the Publisher's Site

New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? The following positions have been added recently: 
  • M&E Director at IBTCI (International Business & Technical Consultants, Inc.) (Nairobi, Kenya, AFRICA) 
  • Educational Researcher/Program Evaluator at University of Delaware (Newark, DE, USA) 
  • Ethnographic Researcher at GreatSchools (San Francisco, CA, USA)   
  • Senior Human Services Analyst at County of Santa Cruz (Santa Cruz, CA USA)
  • Director of Research and Evaluation at Cornell University (Ithaca, NY, USA)   
  • Quantitative Research Associate at Research for Action Inc (Philadelphia, PA USA) 
  • Third-Party Evaluator for Great Lakes Program at National Fish and Wildlife Foundation (Washington, DC, USA)
  • Research Associate I at National Center for Juvenile Justice (Pittsburgh, PA, USA)
  • EOI: Global Evaluation of the Emergency Response Fund (ERFs) at United Nations (New York, NY, USA) 
  • Research and Evaluation Specialist at Pima County Juvenile Court (Tucson, AZ, USA)

Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. According to Google Analytics, the Career Center received approximately 3,300 unique visitors over the last 30 days. It is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. Job hunting? You can also sign up to receive notifications of new position postings via email or RSS feed.

 

Evaluation Humor
We talk a lot about impact - perhaps we should laugh a bit about it as well!
cartoon
The above is used with permission from the Research Counselling blog, which shares original cartoons exploring the lighter side of working with the United Kingdom's Research Councils (a bit like the United States' National Science Foundation). You can also find the cartoonist on Twitter at @ResearchCounsel - say thanks if you see him there.

 

Have a cartoon to share? Send your suggestions to AEA's newsletter editor, gwen@eval.org. 

Get Involved
About Us
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.

 

The American Evaluation Association's mission is to:
  • Improve evaluation practices and methods
  • Increase evaluation use
  • Promote evaluation as a profession, and
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275