Newsletter: August 2011, Vol. 11, Issue 8




Dear Colleagues,

 

In many parts of the northern hemisphere, this is the time of year when the beginning of the new school year dominates the rhythms of daily life. A young parent, with both excitement and anxiety, takes her 5-year-old child to kindergarten for the first time. A pre-adolescent youth makes the big transition from elementary to middle school and has to find his locker and the computer lab amidst all the "big kids." Traffic in college towns saturates the streets on moving-in day for students. And teachers everywhere take a last deep breath of summer and prepare to greet this year's students and re-engage them in the wondrous process of learning.

 

Like this annual renewal of the academic year, rhythms are an important part of life. They mark seasons, rituals, holidays, and life passages. They mark relationships, past and present, and they both comfort and challenge our spirit. Rhythms also accompany many spheres of work and practice - for example, elections are integral to politics, annual physicals to medicine, performances to the arts, and autumn harvests to agriculture.

 

Let us imagine the rhythms of evaluation as integral to our own respective domains of practice. What might constitute such rhythms of evaluation? How would the integration of evaluative rhythms into program planning, implementation, and decision making be manifest? And how would they matter? My own thoughts suggest that evaluation's rhythms are perhaps best captured in the notion of evaluative thinking - a stance of critical, reflective, and inclusive engagement with the logic and consequence of program design, with defensible method and evidence, with contextuality, with diverse stakeholder perspectives and values, with environmental responsibility, with relevant policy issues and decisions, and with relationships in the contexts at hand. For me, thus, evaluation rhythms are on-beat and dialogic, though not necessarily harmonic, affording regular (on-beat) opportunities for inclusive engagement with the key social issues that the program is endeavoring to address.

 

What do others think? What off-beat, syncopated, or more symphonic evaluation rhythms do you imagine? In service of what role for evaluation in society and with what accompanying value stances?

 

Happy Fall!

 

Jennifer

Jennifer Greene

AEA President, 2011

[email protected]

In This Issue
Policy Watch with George Grob
2011 Election Results
AEA Member Survey
TechTalk with LaMarcus Bolton
International Listening Update
Meet Cheryl Oros
aea365 Kudos
Tribute to Staats
Book: ASTD Handbook of Measuring and Evaluating Training
Data Den: AEA Size & Gender
New Job Postings
Get Involved
About Us
Policy Watch - Do You Want to Help?
From George Grob, Consultant to the Evaluation Policy Task Force


I sometimes get offers from AEA members to help in promoting evaluation policies or questions about what they can do to get policy makers interested in the results of their evaluations. I thought it might be useful to offer some suggestions along these lines.

 

First, let's consider evaluation policy. This has to do with rules, whether formal or informal, that an organization establishes for conducting or using evaluation. Evaluation policies include such things as authorizations, requirements, funding, methods, planning, publishing, and quality assurance for evaluation. They may be promulgated through laws, regulations, administrative procedures, budgets, organizations, and standards. They may be established at the Federal level by the Congress or executive agencies. Similar rules can be promulgated by State and local governments, by foundations, or any organization that wishes to make evaluation part of the way they do business.

 

AEA has established the Evaluation Policy Task Force (EPTF) to promote efficacious evaluation policies, with a particular focus on Federal policies. It is an advisory body and has no authority to speak on behalf of AEA except when specifically authorized to do so by AEA's President and Board of Directors.

 

AEA's Board is particularly eager to encourage AEA members' input to the formulation of evaluation policies. At its most recent meeting (in June) it approved a policy to facilitate such involvement under a variety of circumstances, including long-term efforts to produce carefully vetted position papers, such as AEA's Evaluation Roadmap for a More Effective Government, and shorter-term policy opportunities such as comments on proposed government regulations. You can go to AEA's website to get a sense of the kinds of public evaluation positions that AEA takes and on which the Board wants AEA members' input.

 

One way for AEA members to participate in promoting effective evaluation policies is to alert the EPTF about policy-influencing opportunities. This is what happened, for example, with regard to AEA's advice to the Office of Management and Budget about Paperwork Reduction Act requirements that affect evaluators' ability to conduct surveys. Another AEA member alerted the EPTF about an opportunity to provide technical assistance to congressional staff on funding evaluation of the President's Emergency Plan for AIDS Relief. This in turn led to opportunities to influence the development of evaluation policies at USAID.

 

These are just two examples of AEA members' contributions to evaluation policy. Please do not be shy about alerting the EPTF to policy-influencing opportunities. You can reach the Evaluation Policy Task Force via email at [email protected], or by joining the evaluation policy discussion list.

 

AEA members can also influence evaluation policy by providing policy makers (such as congressional and Executive Branch staff) copies of AEA's formal evaluation policy positions found on the AEA website, including the Evaluation Roadmap.

 

In a future column I will discuss what AEA members can do to get their evaluation studies in the hands of policy makers.

 

Go to AEA's Evaluation Policy Task Force website page 
AEA 2011 Election Results

Please join us in welcoming and congratulating AEA's 2013 President and three new Board Members at Large - and thank you all for taking the time to participate in this year's online election. AEA's new officers will be sworn in at our annual conference in Anaheim this fall. If you know them, take a moment to welcome them aboard. And if you don't, take time out at Evaluation 2011 to attend their official swearing-in. In a subsequent issue, we'll speak with each of them more personally. They begin their three-year terms in January.

 

AEA President-elect

  • Jody Fitzpatrick (Colorado)

AEA Board Members at Large 2012-2014

  • John Gargani (California)
  • George Julnes (Maryland)
  • Kathryn Newcomer (Washington, DC)
Member Survey Open for Input - Comment on AEA's Transition, Direction

Calling all members! Want to have a say in the future of AEA? By now, each of you should have received an invitation by email to participate in a short survey that will help assess AEA's transition to policy-based governance, provide a forum for member input/feedback, and offer a mechanism for member engagement. The invitation was issued by AEA President Jennifer Greene, and your input is vital.

 

"The survey has been purposefully crafted to collect perspectives about the transition and about member satisfaction with AEA, in general," says Ellen Taylor-Powell, chair of the Task Force overseeing the evaluation. "So, whether individuals know much about the policy-based governance transition or not, this is a chance to be heard. JVA was selected from a competitive pool and has proposed a comprehensive evaluation to help AEA govern in ways that reflect member needs. Let's take this opportunity to share our perspectives and help in that process." 

 

Founded in 1987 and based in Denver, Colorado, JVA Consulting, LLC partners with clients across the U.S. to identify challenges, develop solutions and make sure they have a lasting effect on communities. JVA's work has been spotlighted in the Chronicle of Philanthropy, Research on Social Work Practice, the Philanthropy Digest, American Indian Report, Denver Business Journal, Cause Planet, the Charity Channel, federal government evaluation reports, and newspapers and online media across the country. JVA staff routinely present at national conferences, and the firm's capabilities include research and survey design, professional interviews, and governance and organizational leadership. JVA has undertaken work on behalf of the Association for the Education of Hispanic Theologians, the Native American Rights Fund, the Rose Community Foundation, Susan G. Komen, and more; has conducted more than 200 organizational assessments since 2002; and has guided even more organizations on governance and effective leadership methods. A portion of JVA's Executive Director Academy addresses shared leadership and effective governance practices.

 

JVA has designed a mixed-methods approach that will garner input from stakeholders and provide an expert review of current practices and impacts. The survey will measure member engagement with AEA (both in length of membership and through participation in AEA activities), and knowledge of the AEA governance transition, as well as specific areas of involvement either now or over the years through such avenues as a Priority Action Team, Task Force, TIG, working group, committees or engagement with a local affiliate. JVA will also determine how AEA's policy governance compares to other associations of similar-sized memberships, gauge its transparency to members and the public, and identify successful governance indicators that can be monitored over time. The evaluation is to be completed by November, with results shared with the membership thereafter.

 

To participate, click here. The survey takes up to 20 minutes to complete and allows you to have a vital say in the future of the association.

TechTalk - Commons Licensing
From LaMarcus Bolton, AEA Technology Director


 

When you upload files into any of AEA's eLibraries, you'll get to a step where you'll be asked to "Select Your License."  

 

The eLibrary system is working to assign a Creative Commons license to your uploaded file. According to the Creative Commons website, such licenses "provide simple, standardized alternatives to the 'all rights reserved' paradigm of traditional copyright." With Commons licensing you retain copyright but allow others to copy and distribute your work, and, at your discretion, you can require that anyone who does so make the work sharable in the same way that you have.

 

There are two questions that you'll be prompted to answer. Let's look at each and see what the response options mean.

 

Question 1: Allow Commercial Uses of your work?

 

YES: The licensor permits others to copy, distribute, display, and perform the work, including for commercial purposes.

NO: The licensor permits others to copy, distribute, display, and perform the work for non-commercial purposes only.


Question 2: Allow Modifications of your work?


YES: The licensor permits others to copy, distribute, display and perform the work, as well as make derivative works based on it.
YES as long as others share alike: The licensor permits others to distribute derivative works only under the same license or one compatible with the one that governs the licensor's work. Note that this option keeps others from editing your work and using it commercially if you have said that you don't want commercial use, and it keeps others from editing your work and then attaching their own restrictive copyright or calling it solely their own. Finally, this option maintains the concept of the "commons" as a space for sharing, building, and extending the work of others - but be forewarned, if you select this option you are indeed allowing others to modify your work as long as they give attribution where it is due.
NO: The licensor permits others to copy, distribute, display and perform only unaltered copies of the work - not derivative works based on it.


Note that none of the options for either question prevents copying and distribution of the submitted work. A public eLibrary, such as AEA's, is an appropriate space to upload only those items for which general distribution and copying is permitted.
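The two questions above effectively select one of the six standard Creative Commons licenses. Here is a minimal illustrative sketch of that mapping in Python - the license names follow the creativecommons.org chooser, and the function is purely a reader's aid, not AEA's actual eLibrary logic:

```python
def cc_license(commercial_ok: bool, modifications: str) -> str:
    """Map the two eLibrary prompts onto a Creative Commons license name.

    modifications: 'yes', 'share-alike', or 'no' (Question 2's options).
    """
    # Question 2: derivative works allowed, allowed with share-alike, or not at all.
    suffix = {"yes": "", "share-alike": "-SA", "no": "-ND"}[modifications]
    # Question 1: add the NonCommercial clause when commercial use is refused.
    nc = "" if commercial_ok else "-NC"
    return f"CC BY{nc}{suffix}"

print(cc_license(True, "yes"))           # CC BY
print(cc_license(True, "share-alike"))   # CC BY-SA
print(cc_license(False, "no"))           # CC BY-NC-ND
```

All six combinations carry the BY (attribution) element, which matches the note below that every option still permits copying and distribution of the unaltered work.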


So, go ahead, upload your handouts for the conference, your example instruments, your data sets, and your reports. There are over 600 items in AEA's eLibrary, and there will likely be more than 1,000 by the end of this year. We want to encourage you to share and share alike. But think ahead so you are ready to answer when asked for your licensing preferences. And, in the meantime, if you have questions either about commons licensing or technology in general, feel free to email me at [email protected].

News from Evaluation Associations Around the World
From Jim Rugh, International Listening Project Coordinator

In our July newsletter we mentioned AEA's International Listening Project, in which suggestions were solicited for how AEA might strengthen its collaboration with other actors at the global level. We received insightful input from 362 respondents! All that rich data is still being processed, but preliminary results are accessible at http://bit.ly/ILPreport.

 

In the meantime, AEA readers who are interested in learning about what evaluation associations in other countries are doing are invited to read the latest IOCE newsletter. It is accessible at www.IOCE.net. Highlights include the following:

  • On behalf of all evaluation associations, IOCE is exploring stronger forms of collaboration with the OECD/DAC Evaluation Network (EvalNet) and the United Nations Evaluation Group (UNEG).
  • The International Program Evaluation Network (IPEN), which includes evaluators from the CIS countries (the former Soviet Union), has organized a number of conferences and workshops (including one on Transformative Mixed Methods Evaluation, led by Donna Mertens).
  • The Brazilian M&E Network (BMEN) held its 3rd seminar in Brasilia in June.
  • The South Asian Community of Evaluators (CoE) organized a very successful Conclave last October, and is growing in strength and membership.
  • The Sri Lanka Evaluation Association (SLEvA) held its 3rd international conference in Colombo in June.  The IOCE Board held its annual face-to-face meeting there.
  • A new Thailand Evaluation Association (TEA) is being formed.
  • In the Middle East and North Africa (MENA) there is an exciting form of "Arab Spring" movement among evaluators at the national and regional levels.
  • The Evaluation Capacity Development Group (ECDG) is organizing an International Workshop Agreement (IWA) planning workshop in October in Geneva, with an eventual goal of setting global standards for Evaluation Capacity Development (ECD) for ratification by the ISO.
  • On September 6 UNICEF and many partner institutions will launch a series of monthly webinars on Equity Focused Evaluations.  Information can be found at www.mymande.org.
There's a lot going on in the big wide world of evaluation! 
Face of AEA - Meet Cheryl Oros

AEA's 6,500 members worldwide represent a range of backgrounds, specialties and interest areas. Join us as we profile a different member each month via a short Q&A. This month's profile spotlights Cheryl Oros, an independent consultant long active in the field and in the association.


Name:
Cheryl J. Oros 

Affiliation: Oros Consulting

Degrees: PhD

Years in the Evaluation Field: 35

Joined AEA: At inception

AEA Leadership Includes: Co-Chair, Research & Technology Development TIG

 

Why do you belong to AEA? 

"To advance the field; share knowledge and learn from others via networking, meetings, shared materials, etc.; and help AEA reach policymakers regarding best evaluation approaches."


Why do you choose to work in the field of evaluation? 

"To assist policy makers in understanding programs and making science-based decisions."


What's the most memorable or meaningful evaluation that you have been a part of - and why?

"Evaluations that allowed me to innovate, solve design/methodological problems, as well as opportunities to set evaluation policies for federal agencies and provide training to expand evaluation capacity, improve agency functioning and advance the evaluation field. 

 

"When brought on to direct a new evaluation/strategic planning office at an agency focused on extramural research, extension and education at the moment the PART was instituted, I created the PREP (Portfolio Review Expert Panel) R&D evaluation system using logic models, self reviews, and evaluation studies to explain and assess programs.  This material was presented to expert panels for defined, systematic review providing scoring (for OMB) and advice to management regarding program improvement.   As part of this effort, we reformed Extension grant proposals and results reporting by requiring inclusion of logic models, performance measures, evaluation plans, and updating data systems, thus making submissions easier/more useful for states as well as agency managers. 

 

"Another favorite evaluation was a nationwide evaluation of an HHS drug prevention program, in which an advisory panel (including Tom Cook, David Cordray, and Abe Wandersman) assisted my team in creating a state-of-the-art design for a multi-tiered study. These studies have been described in AEA sessions and journal articles.

 

"I also developed evaluation courses and trained (and am currently training) agency managers and field staff at federal agencies as well as GAO, Georgetown University, and AEA (R&D evaluation in 2010).  I chaired the group in the GAO-sponsored Federal Evaluators to develop and provide training to OMB staff during PART, which included a chart of common evaluation methods for the various types of federal programs. I was appointed to interdepartmental committees, such as the White House OSTP Science of Science Policy Committee, to develop and foster evaluation of government R&D programs.   I am currently consulting on the setup and operations of federal evaluation offices."

 

What advice would you give to those new to the field? 

"Innovate and expand evaluation capacity; be relevant to decision makers and address their program questions; orient evaluations to clients' needs; be useful to managers in understanding their programs and how they can better manage; network and innovate."

 

If you know someone who represents The Face of AEA, send recommendations to AEA's Communications Director, Gwen Newman, at [email protected]. 

In the News - AEA's aea365 Tip-a-Day Alerts


AEA was recognized in the August issue of Associations Now magazine for its aea365 daily alerts. Launched January 1 of last year, aea365 is open to the public, and its subscriber base has grown from 760 in its first six months to more than 2,200 today. Driven by member-authored entries, the blog tallied more than 280 contributors last year - and that's been the secret to its success. Read the Associations Now article here.

 

Go to aea365 to subscribe to receive a tip-a-day by and for evaluators  

A Tribute to Elmer Staats, Former Head of GAO

From AEA Members Lois-ellin Datta and Eleanor Chelimsky

 

Today, we mourn the passing of Elmer B. Staats, former head of the U.S. General Accounting Office (now the Government Accountability Office), who died on July 23rd at the age of 97. He is remembered by many for the transformation of GAO. Staats entered federal service in 1939, was appointed Comptroller General by President Lyndon B. Johnson in 1966, and served in that capacity for 15 years. He was honored in 1980 with the Evaluation Research Society's Federal Executive Award.

 

"For all of us evaluators, this is a very great loss," comments AEA member and 1995 President Eleanor Chelimsky. "I worked for him between 1980 and 1981 when I began creating an evaluation unit at the GAO, saw him every week at meetings in which I described my hopes, fears or problems, and learned from him about government, strategy, tactics, networking and negotiation. He was amazingly available for advice when I needed it, and I think this is because his interest in evaluation went very deep.

 

"He told me more than once that he saw evaluation as the best tool available for examining the quality of public programs and policies, and the fact that evaluation, like any other analytical tool, has its limitations did not trouble him. Indeed, an important lesson I received from him was his acceptance that in a democracy, you had to work with endless compromises, balances, and trade-offs, and evaluation is a part of that. His emphasis was on dealing with and moving beyond limitations, and this is why he supported my efforts to develop new methodologies to cope with prospective congressional questions.

 

"When reading our reports, he focused most on that slippery, undocumented place between evaluative conclusions and their transmutation into findings and recommendations. And although he never missed a hedge or a twist in our logic, his basic concern was the likely application of the findings. He brought to every discussion an enduring curiosity, a detestation of hyperbole, a delightful and unexpected chuckle, and a truly vast knowledge of bureaucratic discontents and responses. What remains with me most, after all these years, is the realization of his uniqueness: a profound political understanding combined with an old-fashioned, touching and unbelievably reassuring integrity. We will miss him."

 

Adds AEA member Lois-ellin Datta: "This was a man who did not flinch while working with quite a cast of characters - Nixon, Kennedy, Carter, and Reagan, about 50 Congresses, scores of agency leaders. He brought participatory leadership and breadth of vision (international capacity building), and transformed 'accounting' into 'accountability' through program evaluation. Perhaps his greatest impact on our field cascaded from the creation of the Program Evaluation and Methodology Division within GAO, whose good friend he always remained. When he retired, program evaluation was in our vocabularies and accountability for excellence in government in our own visions."

ASTD Handbook of Measuring and Evaluating Training

AEA member Patti Phillips is editor of the ASTD Handbook of Measuring and Evaluating Training, published by the American Society for Training & Development.

 

From the Publisher's Site:

"A follow-on to [the American Society for Training & Development's] ASTD's best-selling ASTD Handbook for Workplace Learning Professionals, the ASTD Handbook of Measuring and Evaluating Training includes more than 20 chapters written by preeminent practitioners in the learning evaluation field. This practical, how-to handbook covers best practices of learning evaluation and includes information about using technology and evaluating e-learning. Broad subject areas are evaluation planning, data collection, data analysis, and measurement and evaluation at work."

 

From the Author:

"ASTD contacted me to discuss a potential book that would be different from others we had developed. They wanted a collaborative text developed by experts and practitioners of training evaluation. In 1983, Jack Phillips wrote the first Handbook of Training Evaluation and Measurement Methods published in the U.S. (Gulf Publishing), and we have since edited many case study books describing the application of training evaluation, but there was no handbook that brought it all together from the combined perspective of experts and practitioners. So it was a great opportunity."

 

"Like all of our publishing projects, there were many rewarding aspects. This particular book is now the primary textbook for one of the advanced workforce analysis courses taught in The University of Southern Mississippi's PhD Human Capital Development program. But the most rewarding aspect of this project," Phillips adds, "was the interaction with the contributors. I had the opportunity to work with training evaluation's best and brightest. Contributors represent organizations such as Merck & Co., Federal Aviation Administration, Cincinnati Children's Hospital Medical Center, Accenture, The University of Southern Mississippi, Center for Creative Leadership, HumRRO, and Southern Illinois University Carbondale. I also had the opportunity to work with leaders whose names are synonymous with training evaluation and human capital analytics: Jack Phillips, Donald Kirkpatrick, Mary Broad, Robert Brinkerhoff, Roger Kauffman, Dana Gaines Robinson, Bill Rothwell, and Jac Fitz-enz. This group of individuals set the stage for training evaluation. Their efforts paved the way and it is their work that serves as the foundation for today's progress and that of the future. Working with Rebecca Ray, Vice President and Managing Director, Human Capital, at The Conference Board was a privilege. Her contribution interviewing the evaluation experts and synthesizing their views of the training evaluation world is invaluable."

 

About the Author:

Patti P. Phillips, Ph.D. is President and CEO of the ROI Institute. An expert in measurement and evaluation, she helps organizations implement the ROI Methodology around the world. Phillips teaches others to implement the ROI Methodology through the ROI Certification process as a facilitator for ASTD's ROI and Measuring and Evaluating Learning Workshops and faculty for graduate-level evaluation courses. She is author and co-author of several books on measurement, evaluation, and ROI, including Beyond Learning Objectives, Show Me the Money, and The Value of Learning.

 

Go to the Publisher's Site
Data Den: AEA Membership Size and Gender
From Susan Kistler, AEA Executive Director

This month we're looking at how the membership has grown since 2002, broken down by gender.

We have gender data for roughly 98% of the AEA membership (thank you to all of those who are kind enough to share this information on your membership profile). In 2002, females made up 62% of the membership; today that percentage has grown to approximately 68%. Over the same period, the membership grew 96%, from 3,042 members in 2002 to 5,975 members in 2010 (the last year for which we have a full set of data). Today, August 31, 2011, we show 6,838 members on record.
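For readers who like to check the arithmetic, the growth figure follows directly from the member counts quoted above - a quick sketch:

```python
# Member counts as reported in this column (2002 and 2010 year-end figures).
members_2002 = 3042
members_2010 = 5975

# Percentage growth over the period.
growth_pct = (members_2010 - members_2002) / members_2002 * 100
print(f"{growth_pct:.0f}%")  # 96%, matching the figure quoted above
```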


 

New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? The following positions and Requests for Proposals (RFPs) have been added recently: 
  • Research Assistant at Public/Private Ventures (Philadelphia, PA, USA) 
  • Senior Study Director at Westat (Rockville, MD, USA) 
  • Researcher at McREL (Denver, CO, USA)   
  • Evaluation Director at NC Partnership for Children (Raleigh, NC, USA)
  • Research Coordinator at Ontario Tobacco Research Unit (Toronto, ON, CANADA)   
  • Associate Director at University of Massachusetts Medical School, Center for Health Policy and Research (Shrewsbury, MA, USA) 
  • FSG, Strategic Learning and Evaluation Center Consultant at FSG (Boston, MA, Seattle, WA, San Francisco, CA, USA)  
  • Senior Associate - Public Health at BTW informing change (Berkeley, CA, USA)
  • Education Reform Expert at ICF International (Fairfax, VA, USA) 
  • Evaluation Coordinator at Public Health Law & Policy (Oakland, CA, USA)

Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. According to Google analytics, the Career Center received approximately 6,900 unique page views in the past month. It is an outstanding resource for posting your resume or position, or for finding your next employer, contractor or employee. Job hunting? You can also sign up to receive notifications of new position postings via email or RSS feed.

 

Get Involved
About Us
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.

 

The American Evaluation Association's mission is to:
  • Improve evaluation practices and methods
  • Increase evaluation use
  • Promote evaluation as a profession and
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275