GRG PERSPECTIVES: Evaluation & Research News
In Brief
The second annual Bay Area Science Festival is taking place this year from October 25 to November 3. The Festival will include over 60 events all around the Bay Area. As before, GRG staff and field researchers will be collecting data at several events.

This is the final Festival funded by NSF. Festivals will continue to be offered around the country.
Check to see if there is a Science Festival in your area.
____________________

GRG is the external evaluator for the Science Festival Alliance's (SFA) newly funded scale-up project. The SFA aims to continue its work of strengthening and advancing Science Festivals across the US.
Learn more about SFA.
_____________________

Editor

Tina Lagerstedt

____________________ 

 

Want to learn more? Check us out online!

 

Fall 2012

Welcome from the President

 

Staff Retreat 2012
Most of our staff at the retreat.

Each year for the past three decades, I've savored the joys of New England autumns - crisp and clear days, tasting and even picking fresh apples, enjoying fall foliage, carving pumpkins, cheering on my favorite college football team. Like all seasonal transitions, it's a time of change from past to future. Weather-wise, I'm fonder of looking back at the fleeting summer than of anticipating the arrival of winter cold. At this week's annual GRG retreat, however, we spent less time reflecting back and more time thinking forward. "Change" is a de rigueur word in our evaluation profession; there's even a Theory of Change.

 

Recently we established a GRG book group, and we've been reading Influencer: The Power to Change Anything by Kerry Patterson, Joseph Grenny, David Maxfield, Ron McMillan, and Al Switzler (McGraw-Hill, 2007). While the book has been discussed by people in a variety of fields (e.g., marketing, sales, dieting, program management, education), we have focused on how we can use its principles in our communication about evaluations, with both our clients and the end users of their programs, particularly when it comes to recruiting the latter to participate in our evaluation research.

 

The basic premise of Influencer is that to make desired changes, we must first carefully determine which "vital" behaviors are most important to change, then discover what people believe is in their best interest with respect to those behaviors, and finally appeal to those beliefs. Appealing to motivation in this way works better than trying to change behavior directly. Often, reinforcing just one belief can change the one vital behavior, which in turn can solve a complex problem and create remarkable change. When asked to change their behavior, people weigh two questions: Is it worth it for me to change? (Motivation) and Can I do it? (Ability). Influencer lays out six sources of influence, crossing these two questions with three perspectives: Personal, Social, and Structural:

  1. Personal Motivation: Make the Undesirable Desirable (overcome reluctance and resistance)
  2. Personal Ability: Surpass Your Limits (learn to master necessary skills for success)
  3. Social Motivation: Harness Peer Pressure (enlist the help of others)
  4. Social Ability: Find Strength in Numbers (teamwork)
  5. Structural Motivation: Design Rewards and Demand Accountability (reward yourself early)
  6. Structural Ability: Change the Environment (surround yourself with supportive physical environments)

The book is highly readable, full of examples and case studies, and relevant to anyone interested in making positive change.

 

Wishing you a wonderful season,

Irene F. Goodman, Ed.D.
Founder and President

What Can You Do When Key Data Are Missing?

By Karen Gareis  

GRG adds yet another cutting-edge statistical technique to its repertoire: Multiple imputation of missing data.

 

Chances are you've encountered this situation: Your program is collecting data from a set of at least two informants (e.g., students and their parents, professors and their students, families visiting a museum, clinicians and their patients), and responses are missing from one member of a pair. What to do? You don't want to simply exclude the incomplete pairs from analysis, because then the remaining sample would likely be biased. That is, people who complete surveys probably have some characteristics in common, and these are probably different from the characteristics of people who skip surveys.

 

Instead, we need a way to estimate missing responses on key variables for purposes of analysis. Multiple imputation is a statistically principled way of creating a set of "best guess" estimates based on the information we do have, including how variables are interrelated within the existing paired data and which variables predict missing survey data. The method also adjusts for the fact that we should be less confident in an estimated, or imputed, value than in an actual value provided by a participant. Analyses are performed on each of several imputed datasets, and the results are then combined.
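
For readers who like to see the idea in code, here is a minimal sketch in Python of what such a procedure might look like. The dataset and its column names (parent_score, student_score, attendance) are hypothetical, and scikit-learn's IterativeImputer with sample_posterior=True is used as a simple stand-in for a full multiple-imputation routine; it illustrates the general idea rather than the exact method GRG uses.

# A minimal, illustrative sketch of multiple imputation in Python.
# Assumptions: pandas and scikit-learn are installed; the data and
# column names below are hypothetical.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

def multiply_impute(data, n_imputations=5):
    """Return several plausible completed copies of a dataset with missing values."""
    completed_sets = []
    for seed in range(n_imputations):
        # sample_posterior=True draws imputed values with noise, so each
        # completed dataset differs, reflecting our uncertainty about what
        # the missing values really were.
        imputer = IterativeImputer(sample_posterior=True, random_state=seed)
        completed = imputer.fit_transform(data)
        completed_sets.append(pd.DataFrame(completed, columns=data.columns))
    return completed_sets

# Hypothetical paired data: some responses are missing from each informant.
paired = pd.DataFrame({
    "parent_score":  [3.0, np.nan, 4.0, 5.0, np.nan, 4.5],
    "student_score": [2.5, 3.5, np.nan, 4.5, 3.0, np.nan],
    "attendance":    [10, 12, 9, np.nan, 11, 8],
})

imputed_sets = multiply_impute(paired)

# Run the analysis on each completed dataset and combine the results.
# (Here we simply average a mean; real analyses pool estimates and
# standard errors, e.g., using Rubin's rules.)
estimates = [d["student_score"].mean() for d in imputed_sets]
pooled_estimate = float(np.mean(estimates))
print(f"Pooled estimate of mean student score: {pooled_estimate:.2f}")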

 

Extensive studies have shown that multiple imputation is an effective and accurate method for dealing with missing data on key variables, provided there are enough relevant variables in the existing data to allow for good estimates of missing data. It can be used even with as much as half the data missing.

 

(For an accessible introduction to multiple imputation, click here.)

 

Spotlight on New Evaluations

Success at the Core Project

Traditionally, professional development (PD) in education has taken the form of special "in-service" days set aside three or four times across the school year. This narrow view has recently been expanded, with more frequent interactive PD sessions (e.g., monthly, weekly). Yet, in some schools and districts, educators still view PD primarily as logging required hours of seat time, rather than as an opportunity to improve their teaching skills.

 

When school culture shifts to regarding life-long learning as the norm, PD can take on a myriad of forms. Success at the Core (SaC), a web-based PD toolkit, focuses on improving classroom practice. The toolkit, with videos as the centerpiece, builds on research showing how instruction, school leadership, learning culture, and community collaboration can all impact student outcomes. Launched in 2009 as a partnership between EDC and Vulcan Productions, with additional funding from the Stuart Foundation and the Paul Allen Family Foundation, SaC allows schools and teachers to create their own PD experience customized to meet their specific needs.  

 

In May 2012, GRG began an evaluation of SaC aimed at gauging the impact of toolkit use on instructional quality and, subsequently, on student achievement.    

 

Two NSF CCEP Projects Funded 

In September 2012, the National Science Foundation (NSF) Climate Change Education Partnership (CCEP) Program awarded over $18 million to six five-year Phase II projects. Prior to that, 15 projects had each received two-year Phase I funding. The CCEP program seeks to develop and sustain large-scale interdisciplinary partnerships with the goal of increasing the adoption of effective, high-quality educational programs and resources related to the science of climate change and its impacts. The program also seeks to prepare today's U.S. citizens to understand global climate change and its implications in ways that can lead to informed, evidence-based responses and solutions. 

 

GRG serves as the external evaluator for two of the six currently funded projects: Columbia University's Polar Learning and Responding (PoLAR), a thematic project, and The Franklin Institute's Climate & Urban Systems Partnership (CUSP), a regional project. We were previously the Phase I evaluator of both projects.

 

In Other News...
GRG Staff Presentation at IPSEC 
GRG's Director of Research, Colleen Manning, gave a presentation at this year's International Public Science Events Conference (IPSEC) on October 11. Colleen shared GRG's findings in What We Know About Science Festivals.

NSF Revises Merit Review Criteria

To those of you who apply for funding from NSF, please take note that on October 4, NSF revised its Proposal & Award Policies & Procedures Guide (PAPPG), effective for proposals submitted on or after January 14, 2013 (NSF 13-1). A summary of all the changes is here, but prospective PIs may wish to review the merit review criteria revisions in particular. Proposals will still be evaluated on their Intellectual Merit and Broader Impacts, but "guidance has been provided to clarify and improve the function of the criteria."

 

A Sample of a Client's Product
Given the focus on "greening" activities around the country, we thought you might be interested in the Cambridge Green Audio Tour, which contains 13 clips about Cambridge points of interest and recent initiatives to green the city. The tour is part of a series that includes tours of the newly renovated high school, the library's new green building (and the green restoration of the old building), and Boston Children's Museum. It was produced by GRG's client, MIT's Terrascope Youth Radio (TYR), a program in which urban teens develop, report, write, produce, and host audio pieces on environmental science and "green" initiatives across the country.

Are there similar green initiatives in your city? Let us know by posting on our Facebook page!
    
________________________________________
Thank you for reading our newsletter! For more information about our exciting work, check out our website.

Find us on Facebook
 
View our profile on LinkedIn