Newsletter: September 2011 | Vol 11, Issue 9

Dear Colleagues,
The shenanigans of Rupert Murdoch's global media executives have faded from the news headlines. But the main storylines of the news in today's world continue - corruption and fraud (9-11 charities, BP oil spill settlements); political unrest (Syria, England); assassinations and murders (Kabul, Georgia); and a persistently feeble economy, stalled in part by the massive stalemate that has the US political system in lockdown.
What can evaluation contribute to such a challenging and tumultuous social-political-economic global landscape? Here is one vision.
Evaluation can offer a space where the practices of reason and deliberation are engaged, where evidence of multiple kinds displaces rhetoric, and where conversation is directed toward shared goals of social betterment and practical wisdom. Evaluation can create a space where difference is accepted, respected, and viewed as an opportunity for listening and for learning, for learning to listen. Evaluation can be a space that welcomes, even celebrates, values of tolerance, dialogue, integrity, and social responsibility. With such a space, evaluation can contribute a discourse of counterpoint to the main storylines of contemporary news and can challenge our stalemated elected politicians to stop whining, start listening respectfully to one another, and just get back to work.
In this envisioned evaluative space, sound bites are banned and important questions of educational, social, and economic policy are engaged in their full complexity, contextuality, and consequentiality. Value differences are openly acknowledged and accepted as integral to democratic pluralism, to democracy on the ground. Conversation replaces confrontation. Data displace diatribe. Evidence trumps evangelism for any cause.
May I challenge all of us to think well about how we might create such evaluation spaces in our own evaluation work and practice.
And may I extend my heartiest of congratulations to all of AEA's award winners this year (see below)! Excellent work warrants public acknowledgement and reward!
Best regards,
Jennifer
Jennifer Greene
AEA President, 2011
jcgreene@illinois.edu
Policy Watch - Protection of Human Research Subjects - Advice Needed
From George Grob, Consultant to the Evaluation Policy Task Force
In last month's column, I mentioned how useful and important it is that AEA members notify us about emerging evaluation policy issues. Well, it happened again! And this one is particularly important. An AEA member sent an email alerting me that the Department of Health and Human Services (HHS), in concert with the Office of Science and Technology Policy, is beginning a major overhaul of government-wide procedures (known as the "Common Rule") pertaining to the protection of human research subjects. HHS issued an advance notice describing current policies, outlining reforms under consideration, and inviting public comment. On September 6, I sent the AEA membership an email about this unique opportunity to comment.
This policy overhaul is quite important for several reasons. First and foremost, it is about protecting vulnerable populations - ensuring informed consent by subjects of biomedical, social, and behavioral research, guaranteeing their privacy, and, above all, ensuring that they will not be harmed in scientific experiments.
At the same time, it recognizes that the current procedures may impede the conduct of valuable research. The advance notice describes this fundamental clash of values. It "seeks comment on how to better protect human subjects who are involved in research, while facilitating valuable research and reducing burden, delay, and ambiguity for investigators."
Because the human research subject procedures and the issues involved are extraordinarily complex, it would be impractical to outline or summarize them here. In my email to all of you, I highlighted policies related to the exemption of most forms of social and behavioral research, the simplification of review procedures for multi-site experiments, and the coverage of evaluation. Here I want to single out evaluation. HHS is explicitly "seeking comment on whether and, if so, how, the Common Rule should be changed to clarify whether oversight of quality improvement, program evaluation studies, or public health activities are covered." See question 24 in the proposed rules.
After considering input received as a result of the notice, HHS plans to redraft the Common Rule. When it does, it will again seek public comment. At that time AEA would like to be in a position to submit formal comments. To do that, we need to get a sense of your concerns now. And so does HHS. For all these reasons, the Evaluation Policy Task Force urges you to take this opportunity to weigh in on this important topic.
The first step is to re-read the email sent earlier this month. It includes links to the HHS notice and convenient explanatory materials. It also explains how to make comments and share them with us by the October 26 deadline.
Thank you in advance for your comments. Please don't hesitate to contact me at evaluationpolicy@eval.org if you need additional information about this important policy.
Go to AEA's Evaluation Policy Task Force website page
AEA's 2011 Awards Winners Announced
Join us in congratulating this year's AEA Awards winners. In all, we are recognizing seven individuals and groups whose work withstood the scrutiny of the judges and whose contributions to the field are distinguished and enviable. In future newsletters, we will spotlight each awardee individually and tell you more about their work. They will be officially honored on Friday, November 4, at our Evaluation 2011 conference. If you are there, please join us in the festivities. And, if not, please join us now in recognizing:
- Leonard Bickman, 2011 Alva and Gunnar Myrdal Evaluation Practice Award
- Margaret Hargreaves, 2011 Marcia Guttentag Promising New Evaluator Award
- David Jenkins, 2011 Outstanding Evaluation Award
- Robin Lin Miller, 2011 Robert Ingle Service Award
- Changing At-Risk Behavior (Joyce Ranney, Michael Zuschlag, Michael Coplen, Michael Harnar), 2011 Outstanding Evaluation Award
- Idaho Legislature's Office of Performance Evaluations, 2011 Alva and Gunnar Myrdal Government Award
- The Systematic Screening and Assessment Method: Finding Innovations Worth Evaluating (Laura Leviton, Laura Kettel Khan, Nicola Dawkins), 2011 Outstanding Publication Award
"Even with the most sophisticated and discerning criteria, determining the 'merit and worth' of AEA Awards nominees is an awesomely difficult task," says 2011 Awards Committee Chair Ricardo Millett. "With the experience, diligence, patience, fortitude and discipline of my colleagues Frances Lawrence and Tarek Azzam, and with support from the AEA staff, we were able to make tough calls from among an extraordinary group of talented nominees. Our thanks to all who participated in the process."
Go to AEA's Evaluation 2011 Web Page
TechTalk - Google's +1 Button
From LaMarcus Bolton, AEA Technology Director

The Google +1 button allows you to give a website or webpage your "public stamp of approval" (Google's words, not mine!). It is akin to clicking "like" on Facebook or giving a thumbs-up on LinkedIn, and it is a way of accomplishing three things:
- Pointing colleagues to things you found useful
- Reminding yourself of things you found useful
- Influencing what Google shows in search results
[Image: Google +1 button]
In order to use Google +1, you need to be signed in to a Google account while online. Then you'll start seeing +1 buttons, like the one pictured here, as you browse. They show up in two places:
- When you search with Google, they'll be to the right of each search result
- On certain webpages that have them installed, they'll appear on the page itself
When you click the +1 button, you are indicating your approval of that search result or webpage. When colleagues browse to a page you've given a +1, they'll be able to see your +1 stamp. And when lots of people give something a +1, Google has indicated that it is likely to appear higher in search results.
We're trying out Google +1 on aea365, AEA's Tip-a-Day website and email subscription service by and for evaluators. Look for the +1 buttons at the bottom of posts when you are on the aea365 website (or click back through to the site if you subscribe via email), and click +1 for those of most value to you. You'll then be able to find all of your +1 items on your Google profile for easy future reference.
Google launched Google +1 back in March and is still refining the functionality. To make things a little confusing, Google +1 is tied to the Google+ social network, but you need not use the Google+ service to use Google +1 buttons. Give it a try to determine for yourself whether Google +1 adds value to your searching and browsing experience.
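Curious how the button ends up on a webpage in the first place? At a high level, a site loads Google's plusone.js script and marks the spot where a button should render. Below is a minimal sketch of one way a site owner might wire that up; the script URL and the g:plusone tag follow Google's published embed pattern as of this writing, while the addPlusOneButton helper and the plusone-slot element are hypothetical names used for illustration - check Google's documentation for the current details.

```typescript
// Minimal sketch of adding a Google +1 button to a page.
// Assumes Google's 2011-era embed pattern (plusone.js plus a <g:plusone>
// tag); verify against Google's current documentation before relying on it.
function addPlusOneButton(containerId: string): void {
  const container = document.getElementById(containerId);
  if (!container) {
    return; // no placeholder element on this page
  }

  // plusone.js scans the page for <g:plusone> tags and renders a button
  // in place of each one it finds.
  container.innerHTML = '<g:plusone size="medium"></g:plusone>';

  // Load Google's script asynchronously so it does not block page rendering.
  const script = document.createElement('script');
  script.src = 'https://apis.google.com/js/plusone.js';
  script.async = true;
  document.head.appendChild(script);
}

// Example: render the button into <div id="plusone-slot"></div>.
addPlusOneButton('plusone-slot');
```

On most blog platforms a plugin or widget can handle this for you; the sketch simply shows that the button boils down to an ordinary script-plus-tag embed.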
Want to learn more about Google +1? Check out this 1-minute video from Google:
[Video: Introducing the +1 Button]
Do you have questions about Google +1 or tech issues at AEA? Don't hesitate to send me an email at marcus@eval.org.
What's the Story with eStudy?
From Stephanie Evergreen, AEA eLearning Director
Haven't you heard? Professional Development eStudy is AEA's newest way to offer in-depth professional development from the field's top trainers.
Presenters offer essentially the same content as in-person professional development workshops at forums like the annual conference, with slight modifications for the webinar platform. For example, in place of small-group work, eStudy courses usually include short homework assignments between sessions. The eStudy series matches our conference workshops in both contact time and content quality, without the travel and hotel expenses.
We solicit presenters from our in-person professional development offerings whose workshops rate in the top half of all workshops, draw good attendance, and cover content that translates well to the online environment. We then pilot the presenters in a webinar format via our Coffee Break Demonstrations and look for strong reviews from attendees before moving to the longer-format eStudy. In other words, we use member-driven evaluation as our selection process.
So far, so good. Jennifer Catrambone presented an eStudy in August on Nonparametric Statistics, and Michael Quinn Patton just wrapped up a six-hour eStudy on Utilization-Focused Evaluation. We've had great attendance and lots of interest in our offerings.
Next week, Kylie Hutchinson will present an eStudy on Effective Alternatives to a Written Report. The last day to register for her course is this Thursday, September 29. That's soon! Review the full eStudy description and sign up online.
We're not holding eStudy sessions in November and December, but will be back in full force this winter with a lineup that is sure to please. Keep an eye out for announcements, and please don't hesitate to send me any feedback or suggestions. I'm Stephanie Evergreen, your eLearning Director, and I'm here to help. You can reach me at stephanie@eval.org.
The development of the eStudy series is overseen by a member-led task force chaired by Nicole Vicinanza and including Jennifer Dewey, David Fetterman, Amanda Greene, Theresa Murphrey, and Kseniya Temnenko. Building on their leadership, we're going strong. Thanks so much!
AJE Rises in Thomson Reuters Journal Citation Reports' Annual Rankings
In Thomson Reuters Journal Citation Reports' annual ranking of academic journals, the American Journal of Evaluation (AJE) earned a 2010 impact factor of 1.157, ranking 17th among the 83 publications reviewed in the interdisciplinary social sciences category. Each year, Thomson Reuters indexes roughly 7,500 journals from more than 3,300 publishers in 60 countries, divides them into subject categories, and calculates each journal's impact factor: the average number of times the articles a journal published in the preceding two years were cited during the year in question. In the social sciences category, anything over 1.000 is considered high impact. During 2010, AJE articles were cited a total of 437 times in AJE and approximately 100 other publications, and the journal's 5-year impact factor stands at 1.556. AJE's ranking rose this year from 21st to 17th, despite the entry of 15 new journals into the category.
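To make the arithmetic concrete, here is the standard two-year impact factor formula, followed by a worked example using purely hypothetical numbers (not AJE's actual counts):

$$ \mathrm{IF}_{2010} \;=\; \frac{\text{citations received in 2010 by articles published in 2008 and 2009}}{\text{number of articles published in 2008 and 2009}} $$

A hypothetical journal that published 80 articles across 2008 and 2009, and whose articles from those years drew 92 citations during 2010, would post a 2010 impact factor of 92/80 = 1.15.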
Current AJE Editor-in-Chief Thomas Schwandt, the journal's Associate Editors, and its publisher, SAGE Publications, have consciously worked to sustain the previous editor's efforts to maintain a high impact factor in a number of ways.
"An article might be cited for any number of reasons," says Schwandt, "but typically only when it is high quality scholarship that presents new findings or insights on either theoretical or practical issues, novel interpretations, or tackles a controversy in the field in a way that generates intellectual interest or 'buzz.' Therefore, the primary strategy for boosting impact factor is to attract and publish only the highest-quality new work available. It is important to remember that citations to an article only during the two to five calendar years following its publication have an effect on a journal's impact factor. Therefore, maintaining a consistent pipeline of important, new work from year to year is critical to sustaining a high impact factor."
What are the Top 5 most cited articles for 2010?
- The evaluation of large research initiatives - A participatory integrative mixed-methods approach (Trochim, Marcus, Masse, et al.)
- The Fairy Godmother and her warts - Making the dream of evidence-based policy come true (Weiss, Murphy-Graham, Petrosino, et al.)
- Response rates for mixed-mode surveys using mail and e-mail/web (Converse, Wolfe, Huang, et al.)
- A multidisciplinary model of evaluation capacity building (Preskill, Boyle)
- Evaluation's second act - A spotlight on learning (Preskill)
At the Evaluation 2011 conference in Anaheim, Schwandt will lead a session offering guidance on how to publish in AJE. Panel Session 277, which also includes Sandra Mathison, Editor of New Directions for Evaluation, will be held from 10:45 AM to 11:30 AM on Thursday, November 3. A similar session last year attracted more than 100 participants interested in publishing in AEA's journals.
Don't forget: AEA members have 24-hour access not only to current AJE content but also to more than 20 years of archives.
Go to the AEA Online Journals Access Page
Face of AEA - Meet Tom Chapel, CDC Evaluation Officer
AEA's 6,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short Q&A. This month's profile spotlights Tom Chapel, a CDC evaluator long active in AEA, both locally and across the field.

Name, Affiliation: Thomas (Tom) Chapel
Professional Position: Chief Evaluation Officer, Centers for Disease Control and Prevention (CDC), Atlanta, Georgia
Degrees: MA, MBA
Years in the Evaluation Field: Since mid-1980s
Joined AEA: 1999
AEA Leadership Includes: Chair of the Membership Committee and of several ad hoc committees. Convener of the Local Affiliates Collaborative (LAC), an organization of representatives from AEA's local affiliates that aims to build mutual support and learning among affiliates. As part of his CDC job, he also coordinates the annual AEA/CDC Summer Evaluation Institute, which draws about 600 evaluators annually from around the nation.
Why do you belong to AEA?
"Networking and professional development are the main reasons. I have met folks from a variety of disciplines, all with something to teach me that can be of use in my own work. Also the opportunities for me to teach professional development courses has helped me hone my own messages and approaches."
Why do you choose to work in the field of evaluation?
"I was looking for a chance to do some good while putting my analytical skills to use. After a few tries in the social sciences, I discovered program evaluation and realized it was exactly the match I was looking for."
What's the most memorable or meaningful evaluation that you have been a part of - and why?
"I've been involved in so many that it's hard to point to one or two, but one or two are memorable because they taught me to practice what I preach, especially with regards to stakeholder engagement and clear and consensus program description. This is at the core of the approach I preach to others, yet in two cases - one an evaluation of a community asthma program and the other working on some indicators for preparedness - I neglected to do the proper groundwork. My assumption of 'clairvoyance' came back to haunt me big time."
What advice would you give to those new to the field?
"Strong evaluation is a combination of strong analytical skills as well as people skills. Getting agreement and consensus is as much a matter of facilitation and nudging others forward as it is knowing which data collection or analysis method is best suited. Also, to remember that evaluations are always case specific. We are blessed (or cursed) with a field where the right choices vary with the situation, and a design and approach that sings in one setting is completely discordant in another one."
If you know someone who represents The Face of AEA, send recommendations to AEA's Communications Director, Gwen Newman, at gwen@eval.org.
Meet John Gargani - Incoming Board Member
In our last issue, we promised a quick introduction of our three incoming Board members as well as the 2013 President. We'll spotlight each individually and thank them for their commitment to service.
John Gargani is an evaluation practitioner, historian of evaluation, and business owner with 20 years of professional experience. He has been a member of AEA since 1999 and is an active presenter at annual conferences. He directs the San Francisco Bay Area Evaluators, a local AEA affiliate; chairs the Program Theory and Theory-Driven Evaluation TIG; and serves as Editor of the Historical Record section of the American Journal of Evaluation. He has served on the board of Hostelling International, a large non-profit that like AEA has adopted the Policy Governance approach.
In his ballot statement, John pledged to work to (1) build a larger, more diverse, more connected membership; (2) increase the variety of evaluation information that informs social action; and (3) strengthen outreach to other associations and philanthropic organizations that are actively promoting evaluation.
"The Board has made great strides in these areas, and building upon them is especially important now as the field undergoes significant change. The explosion of web-based technology has raised expectations regarding the availability and utility of evaluation information; radical policy shifts such as healthcare reform and the nascent implementation social impact bonds depend on evaluation in unprecedented ways; and growth in the number of philanthropic organizations has led to an increase in the number of evaluations being conducted. John would like to work with the AEA Board as it manages this change in ways that strengthen the membership, affirm AEA's leadership position, and maximize the social benefits of evaluation."
Currently, John is President of Gargani + Company, Inc., a program design and evaluation firm located in Berkeley, California. In this role, he has directed methodologically diverse evaluations throughout the U.S. These include mixed-methods evaluations of youth development programs, randomized trials of educational programs, and qualitative evaluations of arts programs. He promotes a broad view regarding methods (for example, see Gargani and Stewart Donaldson's contribution to the forthcoming volume of New Directions for Evaluation), and in his practice uses a wide range of methods to answer questions that are relevant to stakeholders.
In the News - JAMA & LGBT Medical Inequities
AEA member David Fetterman recently co-authored an article in the Journal of the American Medical Association (JAMA). According to the article, a survey of medical school deans in the U.S. and Canada reveals that the median time in the medical school curriculum dedicated to the health care needs of lesbian, gay, bisexual, and transgender (LGBT) patients is about 5 hours, though schools vary widely in the quantity, content, and perceived quality of that instruction. Only one-quarter of the schools rated their overall coverage of LGBT-related curricular material as good or better. The article has drawn extensive press coverage, including from the Associated Press, CNN, Forbes, Huffington Post, Medical News Today, U.S. News & World Report, and The Washington Post.
The study was conducted by the Stanford University School of Medicine. "The evaluation finding is so striking - only 5 hours devoted to LGBT training in medical schools across the country, and in some cases zero during the clinical years - that everyone agrees something has to be done about it," notes Fetterman. "In addition, because we found this hole in the medical education curriculum, this article will go a long way in helping to change medical education - to increase LGBT coverage across the country and in Canada."
Fetterman is a Past President of AEA, a recipient of AEA's Alva and Gunnar Myrdal and Paul F. Lazarsfeld awards, and the current co-chair of the Collaborative, Participatory, and Empowerment Evaluation TIG.
Meet the Winners of the AEA Student Case Competition at Evaluation 2011
Mentoring programs have long been a staple of efforts to prevent and reduce delinquency and problem behaviors among at-risk youth. But which programs work, which work best, and why? Research on mentoring offers some interesting data, but rigorous research and evaluation of nontraditional group mentoring programs has been limited. That reality prompted a comprehensive evaluation with one of the most name-recognized youth mentoring organizations in the world - as well as a student case competition that drew on some of the evaluation field's youngest and brightest. The student winners will be recognized at Evaluation 2011 during Think Tank Session 286, to be held Thursday, November 3, at 10:45 a.m. Join us as Lyn Shulha introduces the student winners and the case. Each student will present one feature of their design that pays particular attention to values or valuing - the criteria used in judging the submissions - as well as challenges they encountered and how they addressed them. The presentations will be critiqued by Gale Mentzer, principal investigator for the group awarded the actual evaluation.
The case competition called for an evaluation design responding to an abbreviated version of a 2010 Request for Proposal from the US Department of Justice's Office of Juvenile Justice and Delinquency Prevention, which sought an evaluation of mentoring programs supported by the Boys and Girls Clubs of America. Students interested in participating were asked to create a three-page design that addressed at least the following questions:
- What is the purpose and audience of your evaluation? And with what justification?
- By what criteria will you judge program success? And with what justification?
- What overall evaluation design will you use? And with what justification?
Participants were to pay particular attention to the values dimensions of the evaluation when justifying their thinking, in consideration of the 2011 Presidential Strand theme of Values and Valuing.
There were 12 submissions, and all were read by seven reviewers representing different evaluation backgrounds. The criteria used to judge quality were:
- The quality of the overall argument: Does it demonstrate an understanding of the nuances of the evaluand and its context? Overall, is the design feasible to conduct in an ethical and equitable manner, and likely to provide useful results?
- The strength of, and basis for, the justification supporting the argument: Is the basis conjecture? Logic? The literature? The Guiding Principles or Program Evaluation Standards?
- The integration of the values dimension: Were values considerations explicitly and thoughtfully addressed within each justification?
Thanks to all the students who entered the challenge and thanks to all the volunteers who were a part of the evaluation process! Special congratulations go out to the award winners:
- Penny Black, University of Wisconsin
- Brandi Gilbert, University of Colorado at Boulder
- Jessica Jackson, Claremont Graduate University
- Ebun Odeneye, University of Texas, Houston
Go to AEA's Evaluation 2011 Web Page
Essentials of Utilization-Focused Evaluation
AEA member Michael Quinn Patton is author of Essentials of Utilization-Focused Evaluation, published by SAGE.
From the Publisher's Site:
"Based on Michael Quinn Patton's best-selling Utilization-Focused Evaluation, this briefer book provides an overall framework and essential checklist steps for designing and conducting evaluations that actually get used. The new material and innovative graphics present the utilization-focused evaluation process as a complex adaptive system, incorporating current understandings about systems thinking and complexity concepts. The book integrates theory and practice, is based on both research and professional experience, and offers new case examples and cartoons with Patton's signature humor."
From the Author:
"People are naturally asking how the new book is different from the 4th edition of Utilization-Focused Evaluation (2008). First is length. At 667 pages, I got lots of feedback that, outside of graduate seminars where depth of treatment was valued, the 4th edition was pretty overwhelming. Indeed, this was honest, important, and useful feedback, so -- to walk the talk of using feedback - I set out to write a book geared to practitioners, one that was more practical, straightforward, and SHORTER. I organized the book around the 17 steps of utilization-focused evaluation (up from 12 steps in the 4th edition), to provide a step-by-step practical guide.
"In Roman mythology, Janus is the god of gates, doors, beginnings, endings, and time. The month of January is named in his honor. He is depicted with two faces looking simultaneously in opposite directions, past and future. The Janus challenge I faced was portraying utilization-focused evaluation as a series of sequential steps while also capturing the complex nature of the utilization-focused process as non-linear, interactive, and dynamic. There are interactions among the steps, feedback loops, recursive iterations, interrelationships and interdependencies among those involved in the evaluation, and other complex dynamics that can be observed in any emergent system. To reflect these real world system dynamics, I created the Janus complexity interludes between steps.
"Doing the new book let me update utilization research and references, add new examples, and especially add a whole new set of cartoons to illustrate the issues that arise in evaluation. The basic utilization-focused message remains the same, but the guidance about how to actually conduct a utilization-focused evaluation is, hopefully, more concrete and accessible."
About the Author:
Michael Quinn Patton is an independent evaluation consultant with 40 years' experience conducting evaluations, training evaluators, and writing about ways to make evaluation useful. He is a former President of AEA and a recipient of the Alva and Gunnar Myrdal Award for "outstanding contributions to evaluation use and practice" and the Paul F. Lazarsfeld Award for lifetime contributions to evaluation theory. The Society for Applied Sociology honored him with the Lester F. Ward Award for Outstanding Contributions to Applied Sociology.
Go to the Publisher's Site
Data Den: AEA Membership Size and Ethnicity
From Susan Kistler, AEA Executive Director
In last month's Data Den, we looked at how the membership's composition has evolved since 2002, broken down by gender. This month's companion piece focuses on the membership's racial and ethnic composition over the same time period.
We have race/ethnicity data for roughly 94% of the AEA membership (thank you to all who share this information on your membership profile). As a percentage of the overall membership, the largest growth in representation has come from our international members - those residing outside the US - whose share has risen from approximately 11.4% of the total membership in 2002 to just over 15% in 2010. Looking at the 2011 Annual Conference registrants to date, I anticipate this trend continuing - more than 40 countries are already represented among the conference delegates!

Volunteer Opportunity - Evaluation Policy Task Force Members
AEA's Evaluation Policy Task Force (EPTF) is seeking letters of interest from prospective new members. The EPTF guides the association's policy-influencing work in the US federal sector. EPTF members should preferably have experience with evaluation policies and practices in the US government, or a strong record of research and publication on evaluation policy; evaluators who work outside the federal level should not feel excluded. The EPTF is a very active task force, meeting monthly via conference call with extensive email exchange between calls, often responding on short turnaround to vet draft documents. If you are interested in serving on the EPTF, please send a letter of interest by Monday, October 10, to AEA Executive Director Susan Kistler at susan@eval.org. In the letter, indicate (1) your experience with evaluation policies and practices, or your history of research and publication on the topic; (2) at least one key area where you see opportunities to improve government evaluation policy, and why improvement is needed; and (3) your commitment to be available for monthly calls and ongoing email communication.
New Jobs & RFPs from AEA's Career Center
What's new this month in the AEA Online Career Center? The following positions and Requests for Proposals (RFPs) have been added recently:
- Monitoring and Evaluation Specialist Short Term Consultant at World Bank Institute (Washington, DC, USA)
- Manager of Program Evaluation at Boston University (Boston, MA, USA)
- Vice President for Research Operations at Child Trends (Washington, DC, USA)
- External Evaluator at Search for Common Ground (Democratic Republic of the Congo)
- Director, Health Research Center at West Virginia University (Morgantown, WV, USA)
- Monitoring and Evaluation Manager at Lutheran World Relief (Baltimore, MD, USA)
- Project Evaluation Post-Earthquake Health Promotion at American Red Cross (Haiti)
- Research Assistant at Harder+Company Community Research (San Francisco, CA, USA)
- Research Associate Senior at Virginia Department of Social Services (Richmond, VA, USA)
- Research Associate - Patient-Reported Outcomes at Mapi Values (Boston, MA, USA)
Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. According to Google Analytics, the Career Center received approximately 6,900 unique page views in the past month. It is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. Job hunting? You can also sign up to receive notifications of new postings via email or RSS feed.
About Us
The American Evaluation Association is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.
The American Evaluation Association's mission is to:
- Improve evaluation practices and methods
- Increase evaluation use
- Promote evaluation as a profession and
- Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-508-748-3326 or 1-888-232-2275