Newsletter: March 2014 

Vol 14, Issue 3


Message from the President: Reflect on Your TIG Memberships, Share Your Thoughts

 

Dear AEA Colleagues,

 

March is a great time to talk about the work of AEA's Topical Interest Groups (TIGs): this is the month when TIG leaders and the many volunteers they've recruited gear up to review conference proposal submissions and recommend sessions for the annual conference. Interested in how the review process works? Check out Lauren Lawson's article below my message!

 

Over the years, I have watched with fascination the gradual expansion of the number of TIGs: from 32 in 2003 to 44 in 2010 and 49 in 2013. And earlier this month, the board approved the formation of two new TIGs, Translational Research Evaluation and Program Design, bringing the total to 51. 

  

To help me grasp the nature of the collective set of TIGs, I contacted Sally Bond. I recalled that she had served on a board-appointed group in 2010 that considered the TIGs' role in knowledge and professional support and made policy recommendations to support AEA's new governance structure. The following is a list of categories for clustering the TIGs that emerged from their work:

 

  • Context of Evaluation includes TIGs defined by the role/orientation of the evaluator or the sector in which the evaluation is carried out. 
  • Evaluation Design and Methodology includes TIGs devoted to approaches to evaluation design or methodology.
  • Content and Topical Areas includes TIGs focused on specific subjects, programs, or services. 
  • Foundations of Evaluation includes TIGs concerned with topics that define and sustain the profession such as theory, research, and communication.
  • Diversity in Evaluation includes TIGs formed to ensure that the perspectives of diverse evaluators and populations are considered in the conduct of evaluation.

  

Sara Vaca, an AEA member from Spain whom I met at the 2013 AEA conference, took my understanding a step further. Using the available TIG data on the AEA website, she created an infographic. Before you click the link, any guesses as to which of the above categories has the most TIGs? Which TIGs have the fewest members? What is the average number of members within a TIG?

  

Each AEA member can join up to five TIGs, thus connecting our members in multiple ways. In talking with members, I learned of a variety of approaches people use to choose their TIGs. Some choose one group because the content is an area of their expertise and choose another because the content is new to them. Other factors such as leadership, contribution, and/or networking opportunities also are influential. Some members like to stay with the same TIGs from year to year while others vary their membership. What matters to you? How do you choose? How do you participate in TIGs? What do you most value about TIGs? 

 

I'm interested in your thoughts about how the TIG structure affects the well-being of the association and the field of evaluation. Are there ways to enhance the contribution of TIGs to the ongoing development of the evaluation field and association as a whole?  

 

From a policy perspective, the board is interested in how TIGs keep us growing as individuals, as an association, and as a discipline. Please send me or other board members your thoughts and stories about the role of TIGs. 

 

Warm regards,

 

Beverly Parsons

AEA 2014 President

In This Issue
GEDI Host Sites
2014 Summer Institute
Walking the Talk
Face of AEA
Policy Watch
Diversity
Book Profile
eLearning
p2i
Inclusion Article
New Job Postings
Register
Get Involved
About Us
Quick Links
Important Note
To ensure this newsletter reaches you every month, add info@eval.org to your email contacts!
Join our Mailing List!
AEA TIGs to Begin Review of Over 1,800 Evaluation 2014 Proposals 

From Lauren Lawson, AEA Headquarters

 

The Evaluation 2014 planning process is officially underway, and I'm excited to report that we received over 1,750 submissions for conference sessions and 73 professional development workshop applications, totaling 1,831 proposals! Aside from last year in Washington, D.C., this is the largest number of proposals ever submitted for our annual conference. We hope this is an early indicator that we will have great attendance in Denver, Oct. 15-18, 2014.

 

The next step for your TIG leaders is to manage the review of all 1,831 proposals over the next month. As you can imagine, this will be a daunting task! AEA education staff are helping to streamline the review process and, for the first time, have made online reviews possible. Directly through the AEA website, all TIG reviewers will have immediate access to the abstracts and can rate them online using standard review criteria, including relevance to both the AEA audience and the conference theme, technical quality, innovativeness, diversity, and more.   

 

Of course, we realize that not all TIGs adhere to the same review criteria (after all, we have TIGs ranging from Disaster & Emergency Management to Nonprofit and Foundations!). We are also continuing to offer TIGs the option of conducting their own reviews using more customized criteria, while still adhering to the general principles above. These TIGs will still be able to submit the paper workbooks used in past years.

 

Once the review has been completed, AEA staff will work to create a program that meets everyone's expectations. For the 1,800-plus of you who submitted, we want to thank you for your interest in the meeting. For the 500-plus of you who volunteered to help serve as reviewers, we want to applaud you for your dedication to AEA. To all of our TIG leaders, we always appreciate your commitment to advancing the AEA mission. And, of course, to all of you who plan to attend the meeting in October, we look forward to seeing you in Denver! 

AEA Call for Graduate Education Diversity Internship Program Host Sites 

We're searching for host sites for AEA's Graduate Education Diversity Internship (GEDI) Program. Host sites provide meaningful evaluation project work and mentoring to interns. GEDI interns are among the best and brightest graduate students in the country who are learning through the internship to transfer their strong inquiry skills to real-life situations in organizations, agencies, and firms. 

 

Hosting a GEDI is a unique opportunity to help build evaluation's future by fostering the professional growth of an intern from a background under-represented in the field. A number of host sites have found the GEDI experience so positive that they have invited their intern to continue in a part- or full-time capacity upon completion of the internship.

 

Interns work two days per week, September through June. Finalists are selected by an advisory team based on the applicant's capacity and interests as well as the needs of the site. Sites then interview and select from the most qualified candidates in their region.

 

Check out the GEDI website to learn more about the internship and applying to serve as a host site! Email Gail McCauley by April 16 to discuss this unique opportunity to work with tomorrow's leaders today! 

Register Now: 2014 Summer Institute, June 1-4 in Atlanta

Registration for the 2014 Summer Institute is open! Join the American Evaluation Association June 1-4, 2014, in Atlanta for this year's Summer Institute. Evaluators, applied researchers, grantmakers, foundation program officers, nonprofit administrators, and social science students are all invited to attend. Sessions are filled on a first-come, first-served basis. View session availability.

 

The Summer Institute will include three keynote addresses, five rotations of three-hour and 40-minute training sessions, plus two group lunches to allow for networking among conference attendees. View the agenda-at-a-glance.

 

Presenters include experts who have conducted evaluations in a variety of settings, nationally known authors, practitioners working on the cutting edge, and outstanding trainers. View the workshop and course descriptions.  

 

Register today!

AEA Values - Walking the Talk with Corrie Whitmore

Are you familiar with AEA's values statement? What do these values mean to you in your service to AEA and in your own professional work? Each month, we'll be asking a member of the AEA community to contribute her or his own reflections on the association's values.  

 

AEA's Values Statement

The American Evaluation Association values excellence in evaluation practice, utilization of evaluation findings, and inclusion and diversity in the evaluation community.

 

i. We value high quality, ethically defensible, culturally responsive evaluation practices that lead to effective and humane organizations and ultimately to the enhancement of the public good.

ii. We value high quality, ethically defensible, culturally responsive evaluation practices that contribute to decision-making processes, program improvement, and policy formulation.

iii. We value a global and international evaluation community and understanding of evaluation practices.

iv. We value the continual development of evaluation professionals and the development of evaluators from under-represented groups.

v. We value inclusiveness and diversity, welcoming members at any point in their career, from any context, and representing a range of thought and approaches.

vi. We value efficient, effective, responsive, transparent, and socially responsible association operations.

 

I came to evaluation from organizational and developmental psychology, where I studied the development of trust between individuals and groups. That experience informs my evaluation practice and makes me grateful for an employer that explicitly focuses on "working together with the Native community to achieve wellness" and a professional association that values "culturally responsive evaluation practices that lead ... to the enhancement of the public good." 

 

One of the first programs I came to work with at Southcentral Foundation (SCF) was the Nutaqsiivik mother-baby home visiting program. "Nutaqsiivik" means "a place of renewal" in Yupik, and the program connects nurse home visitors with mothers who meet certain criteria, including first-time moms, those who meet income requirements, and those with other needs. As I worked with the nursing staff during a grant-funded transition, they were generous in their welcome but uncertain how evaluation could benefit the program.  

 

One way I was able to build trust with and show respect for them was by focusing my efforts on collecting data that would be used, rather than accumulated and "round-filed." Sharing in concrete ways how the evaluation process would "contribute to decision-making processes, program improvement, and policy formulation" (as the AEA Values Statement says) helped them understand that their time was valued and that their contributions to the evaluation were important drivers for program operations.   

 

AEA's inclusive values are idealistic in the best way, highlighting the noble possibilities of our work as evaluators. The AEA Values Statement motivates my work and links small decisions I make as a practitioner to larger conversations we are having as a profession.    

 

Corrie Whitmore, Ph.D., is the program evaluator for Southcentral Foundation's Nutaqsiivik Nurse Family Partnership Home Visiting Program, CDC-funded Screening and Prevention Programs, and Research Center for Alaska Native Health. Whitmore is a developmental psychologist by training and a lifelong Alaskan, committed to improving the health of her home state. In addition to her work at Southcentral Foundation, Whitmore teaches in the University of Alaska Anchorage's Honors College and serves as President of the Alaska Evaluation Network. 

Face of AEA - Meet Bess Rose

AEA's more than 7,800 members worldwide represent a range of backgrounds, specialties, and interest areas. Join us as we profile a different member each month via a short question-and-answer exchange. This month's profile spotlights Bess Rose.

  

Name: Bess A. Rose

Affiliation: IES Pre-Doctoral Training Fellow and Doctoral Candidate, Johns Hopkins University School of Education

Degrees: B.A., Sociology, Johns Hopkins University; M.A., Comparative Literature and Critical Theory, University at Buffalo; M.Ed., Measurement and Evaluation, Western Governors University; Ed.D. (expected 2015), Johns Hopkins University School of Education

Joined AEA: 2010

  

How did you hear about AEA? 

  

I think I must have heard about AEA when I was a research and evaluation coordinator at the Maryland State Department of Education (MSDE), before I left to pursue my doctorate full time. In 2010, we had contracted with Measurement Inc. (MI) for a statewide evaluation of 21st Century Community Learning Centers. I think it was the project director, Shelly Menendez at MI, who suggested I join AEA. Looking back, it seems crazy that it took me that long to join. I had been working on evaluations at MSDE since 2003, but it took us a while to build up an organizational culture around evaluation practice.

  

What do you hope to gain from being a member?

  

I have already learned so much from being a member of AEA. I love the diversity of the resources available to members. I participated in two eStudy classes and a Coffee Break Demonstration last year that were helpful but all for different reasons. Dale Berger's "Applications of Correlation and Regression" helped cement the terms "mediation" and "moderation" for me. Gail Barrington's "Introductory Consulting Skills for Evaluators" was a great entryway into some of the basics of independent consulting. "Evaluation and GIS" by David Robinson was a quick but fruitful introduction to using geographic data, which I hope to do some day. I also love how many ways there are to be involved. Participating in TIGs is a nice way to feel at home in such a large organization. Presenting at the annual conference last year pushed me to communicate some complex methodological material to a roomful of people who needed tools they could use in the field. AEA has really helped me grow. 

 

What inspires you about your field?

 

The people. Evaluators and educators are some of the most caring and thoughtful — in every sense of the word — people you will ever meet. I have been privileged to be associated with two No. 1 organizations: Maryland public schools (rated No. 1 in the U.S. for five years in a row) and the Johns Hopkins School of Education (just rated No. 1 by U.S. News & World Report). Working at the intersection of evaluation and education, I get to see all kinds of dynamic work. Evaluators bring a valuable set of tools to some of the most pressing social problems we face today, and what I find especially inspiring is the innovative ways they work to put these tools in the hands of teachers and others who work with children and youth. 

 

What do you hope to do after receiving your doctorate? 

 

Get a job! I have enjoyed the freedom to focus on my dissertation and research projects while I've been in school, but I really miss working with all the folks at the state education department, schools, and afterschool programs. In my experience, program staff are eager to learn more about evaluation and how to make evaluations work for them to improve their programs. I look forward to being part of that work again. 

Policy Watch - President's 2015 Budget Evaluation Guidance  

From Cheryl Oros, Consultant to the Evaluation Policy Task Force (EPTF)

 

With spring come renewed, strong signals of this administration's commitment to building an "evidence culture" that encourages evaluation. This year brings a rare combination of the President's Budget and the Council of Economic Advisers' Economic Report of the President. Both are remarkably expansive and detailed in their treatment of evaluation. Both put impact evaluation into the broader context of performance management. And both acknowledge the importance of formative evaluation before narrowing their attention to impact evaluation. 

 

The President's 2015 Budget (see Analytical Perspectives, Chapter 7: "Program Evaluation and Data Analytics") discusses the roles of program evaluation and performance management, operationalizing the evidence infrastructure, examples of evaluations and innovative pilots, evaluation capacity, sharing best practices, common evidence standards, "what works" repositories, acting on evidence, tiered-evidence grant programs and innovation funds, the Social and Behavioral Sciences Team, and the Pay for Success initiative. 

 

The administration advocates building a culture that views program evaluation, statistical series, data analytics, and performance measurement as valuable, complementary tools, since each has different strengths. It encourages the use of impact evaluations to provide strong evidence about whether a program works or whether alternative practices might work better. The budget supports new impact evaluations, including studies of how to structure student aid to increase college access, strengthen technical assistance to small businesses, and use flexibility in housing assistance to increase employment and self-sufficiency.  

 

The administration is also promoting Pay for Success (in housing, workforce and education programs) in which private investors provide up-front funding for preventive services for which the government does not pay unless and until there are results. This budget re-proposes $300 million in incentives for states, localities, and not-for-profits to invest in programs that will produce savings as well as better outcomes. 

 

The Council of Economic Advisers' Economic Report of the President, Chapter 7: "Evaluation as a Tool for Improving Federal Programs," complements the President's budget guidance. It provides an overview of the implementation and use of impact evaluation and discusses its inherent challenges. It supports the administration's efforts to build and use evidence, including acting on lessons learned from completed evaluations, launching new evaluations in areas where not enough is known, and creating a culture of evidence-building, especially in grant programs. 

 

It also identifies opportunities for further progress, including embedding evaluation into routine program operations and using existing program data to measure outcomes and impacts. For example, it suggests building randomization into program design so that data on program performance can be tracked and evaluated on an ongoing basis. The Economic Report also highlights the Department of Labor (DOL) set-aside for evaluation in the Consolidated Appropriations Act, which extends the deadline for obligating transferred evaluation funds to two years. This is important because designing rigorous evaluations takes time; a window beyond the standard one year for obligating evaluation funds can in some cases enable agencies to plan and execute more thorough, higher-quality evaluations.  

 

The combination of the administration's budget documents and the Economic Report sends a powerful message about the importance of evaluation for budget decisions. This year's guidance continues the trend of identifying a broader array of methods to connect the two.  


Diversity - We Want You! Getting Involved with AEA

From Zachary Grays, AEA Headquarters

 

One of the most important duties of an AEA member is getting involved with the association. After all, this is your organization! AEA represents more than 7,000 individual practitioners, students, educators, and enthusiasts with a presence around the globe. Volunteer involvement makes this association tick!

 

From the TIGs that review the more than 1,800 proposal submissions for the annual conference to the members who volunteer on the various working groups, collaboration runs deep in AEA's roots and is responsible for AEA's contribution to the advancement of evaluation practice. Getting involved is a phenomenal opportunity to add diverse perspectives to the governing bodies of the association. 

 

Here are just a few ways you can get involved with AEA:  

  

GEDI Call for Host Sites: We're looking for good sites, and we want you! Since the program's creation 10 years ago, more than 50 of the nation's brightest graduate students studying evaluation have graduated from the Graduate Education Diversity Internship Program and gone on to become successful leaders in the field and in the AEA community. We are looking for a few good sites that are dedicated to the advancement and application of cultural competence and diversity in evaluation practice. Visit the official call for GEDI host sites and contact AEA to discuss what you can do to introduce the discipline to tomorrow's leaders.  

  

Awards Nominations: On Oct. 18, 2013, AEA honored six individuals at its 2013 Awards Luncheon in Washington, D.C. The recipients were involved with cutting-edge evaluation and research initiatives that have affected citizens around the world, and they represented diverse backgrounds and impressive bodies of work. Nominating a colleague for a prestigious AEA award is a great way to honor the achievements of our very diverse spectrum of practitioners. Nominations are now being accepted for 2014. For more details on this year's call for awards nominations, click here.

  

Cultural Competence Working Group: Lastly, don't forget that you can also get involved in some of the ongoing activities within the association. Fast approaching is the anniversary of the AEA Public Statement on Cultural Competence in Evaluation. On April 22, 2011, the membership of AEA approved the statement, which was developed over a six-year period (2005-2011). The statement's purpose is to address the complexity of needs and expectations that arise when evaluators work across cultures and with diverse communities. It is the vision of AEA to foster an inclusive, diverse, and international community of practice positioned as a respected source of information for and about the field of evaluation. 

 

As any good evaluator knows, this work is never done and must constantly evolve. The AEA Public Statement on Cultural Competence in Evaluation Working Group leads the charge in ensuring that AEA maintains this vision. It provides resources on integrating culture and context in evaluation and champions the use and application of the concepts within the statement in evaluation practice, teaching, and policy. To get involved with the Cultural Competence in Evaluation Working Group, contact Cindy Crusto.    

___________________________________________________________________________________

 

Involvement in the association is a significant part of AEA membership. Diversity encompasses many capacities and thrives in environments of cultural awareness, collaboration, and inclusiveness. This is where you come in! The opportunities mentioned above are just a few of the exciting things happening at AEA. Stay tuned for more ways to get involved, and feel free to contact us to inquire about volunteering. 

Book Profile - The Basics of Achieving Professional Certification: Enhancing Your Credentials

Willis Thomas is the author of The Basics of Achieving Professional Certification: Enhancing Your Credentials, a new book published by Productivity Press. 

 

From the Publisher's Site:

 

"The Basics of Achieving Professional Certification: Enhancing Your Credentials" provides clear-cut guidance on how to select a certification that is right for you and how you can continue to build your credentials in support of personal and professional goals. This easy-to-use guide can help anyone looking to achieve professional certification make informed decisions about the many options available. It can also help avoid the pitfalls of making the wrong choice as a result of being incorrectly informed. Examining the range of professional certifications offered by associations and organizations, it explains how to select the right professional certification and outlines best practices for completing the certification process. 

 

From the Author:

 

"The Basics of Achieving Professional Certification: Enhancing Your Credentials" is an excellent reference for any person pursuing certification and any organization supporting certification. It provides a comprehensive overview of the certification industry. It outlines certification in a step-by-step method to help a person realize what they need to do to obtain and maintain their professional certification. Many people have questions on whether certification is worth the time investment, when to pursue certification and how to go about it. This book answers these very important questions and much more.

 

About the Author:

 

Willis H. Thomas, PhD, PMP, CPT, is a project management professional and certified performance technologist who has been involved in organizational development and training across the pharmaceutical and information technology industries, working as an employee for companies such as Pfizer, Xerox, Ameritech, and Brinks. He has also served as a consultant and trainer to a wide variety of organizations across industries. His functional focus areas have been quality assurance, human resources, and operations. 

eLearning Update - Discover Upcoming eStudy Courses and Coffee Break Demonstrations

Our eStudy program is made up of in-depth virtual professional development courses. Below are April's eStudy offerings: 

 

eStudy 043: Nonparametric Stats - Jennifer Catrambone  

April 1 and April 8

2-3:30 p.m. ET

 

Standard (parametric) statistical techniques require relatively large, bell-shaped (normally distributed) samples. When a sample is small and/or skewed, parametric techniques lose their power and become inappropriate. Nonparametric statistics are crucial when working with data that "breaks the rules." This workshop provides a brief overview of the most commonly needed nonparametric techniques, explains clearly when to use them versus their parametric counterparts, and shows, via screen shots, how to run them in SPSS. 
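
As a taste of the distinction the course covers, here is a minimal sketch using hypothetical, generated data (the workshop itself demonstrates these techniques in SPSS, not Python). With two small, skewed samples, the rank-based Mann-Whitney U test is the nonparametric counterpart of the independent-samples t-test:

    # Minimal sketch (hypothetical data): comparing two small, right-skewed
    # samples with a parametric t-test and its nonparametric counterpart,
    # the Mann-Whitney U test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Two small, right-skewed samples (e.g., program wait times in days).
    group_a = rng.exponential(scale=10, size=12)
    group_b = rng.exponential(scale=18, size=12)

    # Parametric test: assumes roughly normal data; questionable here.
    t_stat, t_p = stats.ttest_ind(group_a, group_b)

    # Nonparametric counterpart: compares ranks, no normality assumption.
    u_stat, u_p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

    print(f"t-test:       t = {t_stat:.2f}, p = {t_p:.3f}")
    print(f"Mann-Whitney: U = {u_stat:.1f}, p = {u_p:.3f}")

With skewed data like this, the two tests can disagree; the rank-based test makes no normality assumption and is the safer choice.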

 

Read more and register.

 

eStudy 041: Beginning Developmental Evaluation - Michael Quinn Patton

April 16, April 18, April 21, and April 28

2-3:30 p.m. ET

 

This eStudy is geared toward an audience with beginner-level expertise in developmental evaluation (DE). DE is especially appropriate for innovative initiatives or organizations in dynamic and complex environments where participants, conditions, interventions, and context are turbulent, pathways for achieving desired outcomes are uncertain, and conflicts about what to do are high. DE supports reality-testing, innovation, and adaptation in complex dynamic systems where relationships among critical elements are nonlinear and emergent. Evaluation use in such environments focuses on continuous and ongoing adaptation, intensive reflective practice, and rapid, real-time feedback. The purpose of DE is to help develop and adapt the intervention (as distinct from improving a fixed model). 

 

This evaluation approach involves partnering relationships between social innovators and evaluators in which the evaluator's role focuses on helping innovators embed evaluative thinking into their decision-making processes as part of their ongoing design and implementation initiatives. DE can apply to any complex change effort anywhere in the world. Through lecture, discussion, and small-group practice exercises, this workshop will position DE as an important option for evaluation in contrast to formative and summative evaluations as well as other approaches to evaluation. 

 

Read more and register.

____________________________________________________________________________________ 

 

Our Coffee Break Webinars are short, 20-minute presentations of tools or tips we think evaluators will find helpful in their work lives. Let's take a look at what's in the pipeline for April:

 

Thursday, April 3
2-2:20 p.m. ET

In this webinar, Megan Noel will share practical experience using mobile data capture for field surveys in Rwanda, Ethiopia, and Malawi. Why use mobile?
 
Thursday, April 10
2-2:20 p.m. ET
 
This Coffee Break will highlight the use of discrepancy analysis (between current and desired conditions) in determining the evaluation knowledge needs of Geriatric Education Centers since 2010. The session will discuss how an evaluation capacity building (ECB) logic model was used to guide the needs assessment (NA) approach. Longitudinal data (2010-2014) will demonstrate reductions in reported evaluation needs on topics selected for ongoing technical assistance.  
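
For readers new to the technique, here is a generic sketch of discrepancy analysis for a needs assessment (hypothetical topics and ratings, not the presenters' instrument). Each topic is rated on its current and desired condition, and the gap between the two ranks the needs:

    # Generic discrepancy-analysis sketch for a needs assessment
    # (hypothetical topics and 1-5 mean ratings; not the presenters' data).
    topics = {
        # topic: (current condition, desired condition)
        "Logic models":       (2.1, 4.6),
        "Survey design":      (3.8, 4.2),
        "Data visualization": (2.9, 4.8),
        "Reporting findings": (3.5, 4.0),
    }

    # Discrepancy = desired - current; larger gaps signal higher-priority needs.
    gaps = {t: desired - current for t, (current, desired) in topics.items()}

    for topic, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{topic:<20} gap = {gap:.1f}")

Topics with the largest gaps become candidates for ongoing technical assistance.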

 

CBD177: Data Visualization in Executive Summaries - Nate Wilairat

Thursday, April 17

2-2:20 p.m. ET

 

Nate Wilairat, senior consultant at EMI Consulting, will demonstrate how to visualize findings in executive summaries. Nate found that separating analysis and interpretation in the summaries was a key aspect of meeting the needs of executives. He will cover two types of useful visualizations: annotated dashboards and matrices.

 

You can pre-register for the webinar by clicking the link above. 

Potent Presentations Initiative - We. Are. TWO!
From Stephanie Evergreen, Potent Presentations Initiative Coordinator  

 

We are putting on our party hats over here and breaking out the (healthy) birthday cake. The Potent Presentations Initiative is two years old! 

 

In our two years on Planet AEA, we've produced tons of resources to support your growth as an evaluator who often has to give presentations about data. Here are the three most popular resources we have created:

 

Training webinars on crafting your presentation's message, design, and delivery

 

We recorded our training webinars on the three areas of presentation development and posted them right on our webpage for easy access any time you need a little refresher. Don't wait until the week before conference time — watch these while you prepare any type of presentation.

 

We also posted the slides — with my speaking notes! — for each webinar: Message, Design, and Delivery. Download these, modify them, and conduct your own in-house training. 

 

Presentation Assessment Rubric
 

 

When it comes time to practice your presentation (because you ARE practicing, RIGHT?), give your audience this handy rubric to collect their feedback on your efforts. We've heard from many evaluators who are using this rubric with colleagues before company presentations.

 

Messaging Model handout

 

I recently read that researchers (and evaluators) tend to allocate presentation time in proportion to the time they spent on each part of the evaluation, which is why so much presentation time is erroneously spent on things like data collection instead of findings and their importance. The Messaging Model handout helps you apportion your presentation time around what will be most relevant for most audiences.  

2013 AEA Conference Presenter Writes Article on Inclusion
Kenneth Kelty won one of the AEA diversity travel awards for last year's conference and wrote one of the Cultural Competence in Evaluation Dissemination Working Group's AEA365 entries. Kelty recently detailed his story for Think College, a national organization dedicated to developing, expanding, and improving inclusive higher education options for people with intellectual disabilities. 
New Jobs & RFPs from AEA's Career Center  
What's new this month in the AEA Online Career Center? The following positions have been added recently: 

Descriptions for each of these positions, and many others, are available in AEA's Online Career Center. Job hunting? The Career Center is an outstanding resource for posting your resume or position, or for finding your next employer, contractor, or employee. You can also sign up to receive notifications of new position postings via email or RSS feed.

Register
Get Involved
About Us
AEA is an international professional association of evaluators devoted to the application and exploration of evaluation in all its forms.

 

The association's mission is to:
  • Improve evaluation practices and methods.
  • Increase evaluation use.
  • Promote evaluation as a profession.
  • Support the contribution of evaluation to the generation of theory and knowledge about effective human action.
phone: 1-202-367-1166 or 1-888-232-2275 (U.S. and Canada only) 
website: www.eval.org