Sharing the results of your evaluation or data collection beyond your own staff helps generate support for your work. Broader audiences typically respond to succinct communications, so we will often turn a comprehensive research report into an evaluation or data "Brief." A Brief presents the highlights of your research in terms that everyone from staff to funders to clients and the press can understand, and thus activates their interest. But how do you choose these highlights? Let's look at a few examples from some recent work we did turning a 68-page Report into a 4-page Brief!
In our example report, data were available for many outcomes the program might achieve with participating youth, such as improved academic interest, school attendance, behavior, and avoidance of alcohol consumption. To decide which ones to highlight in the Brief, we followed these:
Three Useful Parameters
1) The finding is relevant to what the program aims to achieve.
2) The finding is of interest to a broader audience.
3) There is a solid methodology supporting the credibility of the finding.
A finding needed to meet all three parameters before making it into the Brief. Here's how we assessed the school attendance outcome section of the report against each parameter:
1) Low or decreasing school attendance is often one of the first signs that a student is headed toward academic failure or dropout, and one of the project's long-term goals was to reduce academic failure.
✓ Relevant to program aims
2) The project's services were provided within the school setting, so project staff shared a concern about school attendance with parents and all of the school/district staff working with students.
✓ Relevant to broader audience
3) The school attendance finding in our report relied on: a) actual school records, b) two points in time - one before and one after participation, and c) data for the majority of the participants tracked. A statistical test was also used to see whether the difference between the two timepoints was significant (if not significant, the change is more likely due to random chance).
This combination of choices made for a solid methodology - much more solid than if we had simply asked whether attendance seemed to change, or had received data on only a few of the participants.
✓ Solid methodology
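The two-timepoint check described above can be sketched in a few lines. This is a minimal illustration assuming a paired t-test on made-up attendance rates; the report does not name the specific test it used, and every number below is hypothetical:

```python
import math
import statistics

# Hypothetical attendance rates (proportion of days attended) for the
# same ten students before and after program participation.
before = [0.82, 0.75, 0.90, 0.68, 0.79, 0.85, 0.72, 0.88, 0.77, 0.80]
after = [0.88, 0.81, 0.92, 0.75, 0.84, 0.90, 0.78, 0.91, 0.83, 0.86]

# A paired t-test is one common choice when the same students are
# measured at two points in time: test whether the mean of the
# per-student differences is distinguishable from zero.
diffs = [a - b for a, b in zip(after, before)]
mean_diff = statistics.mean(diffs)
se_diff = statistics.stdev(diffs) / math.sqrt(len(diffs))
t_stat = mean_diff / se_diff

# Standard two-tailed critical value for t with df = 9 at alpha = .05.
T_CRITICAL = 2.262
significant = abs(t_stat) > T_CRITICAL

print(f"t = {t_stat:.2f}, significant at p < .05: {significant}")
```

With real data you would also report the exact p-value (e.g., from scipy.stats.ttest_rel); the hand computation here just makes the logic of "change vs. random chance" visible.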
So, school attendance clearly satisfied all three parameters. To share this finding with a wider audience, we eliminated all the details about how we collected and tested the data. We highlighted just one sentence: "School attendance rates increased among students participating in the Program." Then, we created a chart to show the actual data. All the detail was still in the report for anyone interested, but the shorter, more widely disseminated document contained its essence.
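A before/after chart like the one described can be produced in a few lines of matplotlib. The two rates below are hypothetical stand-ins, not the report's actual data:

```python
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display window needed
import matplotlib.pyplot as plt

labels = ["Before participation", "After participation"]
rates = [81, 86]  # hypothetical average attendance rates (%)

fig, ax = plt.subplots()
bars = ax.bar(labels, rates)
ax.set_ylabel("Average attendance rate (%)")
ax.set_title("School Attendance Before and After Program Participation")
ax.set_ylim(0, 100)
# Label each bar with its value so readers don't have to estimate it.
ax.bar_label(bars, fmt="%d%%")
fig.savefig("attendance_brief_chart.png", dpi=150)
```

Keeping the y-axis anchored at zero and labeling the bars directly lets a funder or reporter read the finding in seconds, which is the whole point of the Brief.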
We found another possible highlight related to grade point averages. Once again, we checked the finding against our three parameters. Like attendance, grades clearly met parameters one and two. The full report included several types of statistical tests we had run to identify a difference in participants' grades. Our methodology also checked out, but the level of detail in the report (half a page!) would only distract from the point in the Brief. Here's how we took a long explanatory paragraph from the report and turned it into a succinct highlight.
When examining differences in GPAs for students who attended the Program compared to those who were referred but did not participate, an interesting pattern emerged. An ANOVA indicated no significant difference in average baseline GPA between students who participated (1.99) and those who did not (1.84). However, there was a significant difference (p<.05) in average year-end GPA: students who participated in the Program had a higher average year-end GPA (2.19) than their counterparts (1.76) who were referred but did not participate.
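For readers curious what the ANOVA in that excerpt involves, here is a minimal sketch. The two GPA lists are hypothetical, constructed only so their means match the 2.19 and 1.76 reported above; the actual student data does not appear in the Brief:

```python
import statistics

# Hypothetical year-end GPAs for two groups of ten students each.
participants = [2.4, 2.1, 2.3, 1.9, 2.5, 2.0, 2.2, 2.1, 2.3, 2.1]
non_participants = [1.8, 1.6, 1.9, 1.5, 2.0, 1.7, 1.8, 1.6, 1.9, 1.8]

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across the given groups."""
    all_values = [x for g in groups for x in g]
    grand_mean = statistics.mean(all_values)
    # Between-group sum of squares: how far each group mean sits
    # from the grand mean, weighted by group size.
    ss_between = sum(
        len(g) * (statistics.mean(g) - grand_mean) ** 2 for g in groups
    )
    # Within-group sum of squares: spread of students around their
    # own group's mean.
    ss_within = sum(
        sum((x - statistics.mean(g)) ** 2 for x in g) for g in groups
    )
    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

f_stat = one_way_anova_f(participants, non_participants)
F_CRITICAL = 4.41  # standard table value for F(1, 18) at alpha = .05
print(f"F = {f_stat:.2f}, significant: {f_stat > F_CRITICAL}")
```

With only two groups, this one-way ANOVA is equivalent to an independent-samples t-test; in practice a library routine such as scipy.stats.f_oneway would also report the exact p-value.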
We believe just as much in good research as in good communication. Taking the time to identify your highlights transforms your report-on-the-shelf into research in action!